Best way to process large volume of data (Oracle9i) [message #23108] Mon, 18 November 2002 14:37
Andrew G.
Messages: 7
Registered: November 2002
Junior Member
Hi,

I need advice on the method to be used for the following requirement:
- A large flat file (circa 1 million records) needs to be imported into an Oracle 9i database; several queries then need to be run against the existing data to validate the newly imported data; the validated new data needs to be inserted into production tables; and some other production tables need to be updated for the records that passed validation.

As far as I know, the most efficient way to do this is to load the flat file into a NOLOGGING staging table with SQL*Loader. After that comes the hard part: how do you efficiently scan 1 million records and validate/insert/merge the data into the production tables? The best approach I can think of is to take chunks of 1,000-5,000 records, insert them into a temporary table, and then do the rest of the processing there. However, I'm not sure how to actually scan the data from the large NOLOGGING table and then delete it from that table. I could declare a cursor on the whole table, loop through 1,000-5,000 rows at a time, insert them into a temporary table, delete them (using DELETE ... WHERE CURRENT OF), and then process the temporary table. However, I'm not sure whether this would consume a lot of CPU/memory.
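For the chunked approach, a minimal PL/SQL sketch might look like the following on 9i. All table and column names (stage_txn, prod_txn, txn_id, account_id, amount) are assumptions, and the validation step is elided; the point is to use BULK COLLECT ... LIMIT rather than single-row fetches:

```sql
-- Hypothetical sketch: process a staging table in chunks of 5000 rows.
-- Table/column names are assumptions; validation logic is omitted.
DECLARE
  CURSOR c_stage IS
    SELECT txn_id, account_id, amount FROM stage_txn;

  TYPE t_id_tab   IS TABLE OF stage_txn.txn_id%TYPE;
  TYPE t_acct_tab IS TABLE OF stage_txn.account_id%TYPE;
  TYPE t_amt_tab  IS TABLE OF stage_txn.amount%TYPE;

  l_ids   t_id_tab;
  l_accts t_acct_tab;
  l_amts  t_amt_tab;
BEGIN
  OPEN c_stage;
  LOOP
    -- Fetch the next chunk of up to 5000 rows in one round trip
    FETCH c_stage BULK COLLECT INTO l_ids, l_accts, l_amts LIMIT 5000;
    EXIT WHEN l_ids.COUNT = 0;

    -- (validation of the chunk against existing data would go here)

    -- Bulk-insert the validated chunk into the production table
    FORALL i IN 1 .. l_ids.COUNT
      INSERT INTO prod_txn (txn_id, account_id, amount)
      VALUES (l_ids(i), l_accts(i), l_amts(i));

    COMMIT;  -- commit per chunk to keep undo usage and lock duration bounded
  END LOOP;
  CLOSE c_stage;

  -- After the whole file is processed, truncating the staging table
  -- is far cheaper than deleting it row by row.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stage_txn';
END;
/
```

One caveat with this shape: committing inside the loop while the cursor stays open means the rollback segments must still supply read-consistent blocks for the full scan, so an undersized rollback/undo configuration can raise ORA-01555 on a long run.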

Any help would be greatly appreciated,

Regards

Andrew
Re: Best way to process large volume of data (Oracle9i) [message #23110 is a reply to message #23108] Mon, 18 November 2002 15:33
Andrew G.
Messages: 7
Registered: November 2002
Junior Member
Hi Todd,

Thanks for the suggestion, but I still have doubts about using a single transaction. The thing is that before I start inserting the new records (transactions), I have to lock the existing accounts in the database. A single transaction would mean all those accounts stay locked for as long as the whole run lasts. The other thing that scares me about a single transaction is what happens if it fails for any reason and has to be rolled back: everything would have to be reversed and re-attempted, which may take too long. I don't have a good estimate of how long the whole thing will take, but I expect at least an hour and probably quite a bit more.

BTW, how big do you think a rollback segment should be for 1,000,000 rows of a table with, e.g., 1 KB rows? Is it simply 1 million * 1 KB = roughly 1 GB?
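As a rough rule of thumb (an estimate, not a precise formula): INSERTs generate comparatively little undo per row, while DELETEs and UPDATEs generate undo roughly proportional to the before-image, so deleting 1,000,000 rows of ~1 KB each would need on the order of 1,000,000 * 1024 bytes, i.e. about 1 GB of rollback, plus per-row and per-block overhead. One way to check actual consumption during a test run is to query v$transaction:

```sql
-- Undo currently consumed by active transactions, in bytes.
-- USED_UBLK is the number of undo blocks held by the transaction;
-- multiply by the database block size to get bytes.
SELECT s.sid,
       s.username,
       t.used_ublk * p.value AS undo_bytes
FROM   v$session     s,
       v$transaction t,
       v$parameter   p
WHERE  s.taddr = t.addr
AND    p.name  = 'db_block_size';
```

Running a representative chunk of the load and scaling the measured undo_bytes up to the full row count gives a more trustworthy sizing than the back-of-envelope multiplication alone.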

Regards

Andrew