Large Table Update
Suppose we have a table with 80 million rows. Every day I need to do a one-time update of 20 million of those rows from a flat file. The table is range partitioned, and it has to remain available for OLTP during the update. What's the best way to do it?
This is a proposed situation for a new application coming in. The vendor is doing the testing to see how fast it can be done. I would do some testing myself, but I don't currently have a server with the Partitioning option on it.
Loading with single-row UPDATE statements didn't work, obviously. The latest thing they tried was splitting the file into multiple files of 30K rows each, using SQL*Loader to load each one into a temp table, and then using a cursor to update the rows. They ran 100 processes at once, and it took about 2 hours. We're still looking at other things.
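For reference, here's roughly what I understand the cursor-based update to look like, rewritten with BULK COLLECT/FORALL so each process does array updates instead of row-by-row ones. Table and column names (big_table, stage_30k, id, val) are made up for illustration:

```sql
-- Sketch only: bulk variant of the per-file cursor update.
-- Assumes each staged file has a key (id) and a new value (val).
DECLARE
  TYPE t_ids  IS TABLE OF big_table.id%TYPE;
  TYPE t_vals IS TABLE OF big_table.val%TYPE;
  l_ids  t_ids;
  l_vals t_vals;
  CURSOR c IS SELECT id, val FROM stage_30k;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_ids, l_vals LIMIT 1000;
    EXIT WHEN l_ids.COUNT = 0;
    -- One round trip updates up to 1000 rows
    FORALL i IN 1 .. l_ids.COUNT
      UPDATE big_table SET val = l_vals(i) WHERE id = l_ids(i);
    COMMIT;
  END LOOP;
  CLOSE c;
END;
/
```

Even so, this is still 20 million single-row index lookups in total, which is why the set-based approaches below tend to win.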
But does anybody have any good ideas?
Use the flat file as an external table and update your table directly from it, methinks.
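Something along these lines, assuming a comma-delimited file with a key and one value column (names here are hypothetical; adjust to your layout and directory object):

```sql
-- Expose the flat file as a table; no separate load step needed.
CREATE TABLE upd_ext (
  id  NUMBER,
  val VARCHAR2(40)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir        -- a CREATE DIRECTORY object you own
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    ( id, val )
  )
  LOCATION ('upd.dat')
);

-- One set-based statement replaces the 100 cursor jobs.
-- Note: update-only MERGE (no WHEN NOT MATCHED clause) needs 10g;
-- on 9i you'd add a dummy insert branch or use an UPDATE instead.
MERGE INTO big_table t
USING upd_ext s
   ON (t.id = s.id)
 WHEN MATCHED THEN UPDATE SET t.val = s.val;
```

The point is you skip the intermediate temp tables entirely and let one parallel-capable DML statement do the work.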
The SQL*Loader load into the temporary tables is VERY quick; it's the update that takes so long. How would using external tables help me? I've never used them, so I'm curious.
I'm also looking at using an inline select for the update...
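If it helps anyone, the inline-select form I have in mind looks something like this (again with made-up names, and assuming the staged rows are keyed by id):

```sql
-- Correlated inline select: set each matched row's value from the
-- staged data, and touch only rows that actually have a staged match.
UPDATE big_table t
   SET t.val = (SELECT s.val
                  FROM upd_ext s
                 WHERE s.id = t.id)
 WHERE EXISTS (SELECT 1
                 FROM upd_ext s
                WHERE s.id = t.id);
```

For 20 million matches a MERGE with a hash join will usually beat this correlated form, but this version works on older releases and is easy to test.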
Any other ideas?