Incremental Commit on Large Insert
Hello everyone, hope you can help me with this. My team and I are trying to insert about 150 million records into one table, from a query that joins 4 tables. Due to hardware limitations, we need to commit every n records. The best solution we have found so far is a loop with an iteration counter that commits when the counter is divisible by a certain number. However, that inserts rows one at a time, which slows down the insert dramatically. A technical expert has recommended using arrays to insert records in bulk, but we cannot find any information on that. Any help would be greatly appreciated. Thanks.
-
Use BULK COLLECT with a LIMIT clause to fetch the query results in batches, then FORALL to insert each batch as a single bulk operation, committing after each batch. This avoids row-by-row processing and keeps each transaction small enough for your hardware limits.
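A minimal sketch of the BULK COLLECT / FORALL pattern. The cursor query, table names (t1 through t4, target_tab), column names, and the batch size of 10,000 are all placeholders; substitute your own 4-table join and target table.

```sql
DECLARE
  -- Hypothetical 4-table join; replace with your actual query.
  CURSOR src_cur IS
    SELECT t1.id, t2.name, t3.amount, t4.status
      FROM t1
      JOIN t2 ON t2.t1_id = t1.id
      JOIN t3 ON t3.t1_id = t1.id
      JOIN t4 ON t4.t1_id = t1.id;

  TYPE src_tab_t IS TABLE OF src_cur%ROWTYPE;
  l_rows src_tab_t;

  -- Rows fetched, inserted, and committed per batch; tune to your hardware.
  c_batch_size CONSTANT PLS_INTEGER := 10000;
BEGIN
  OPEN src_cur;
  LOOP
    -- BULK COLLECT ... LIMIT fetches up to c_batch_size rows at once.
    FETCH src_cur BULK COLLECT INTO l_rows LIMIT c_batch_size;
    EXIT WHEN l_rows.COUNT = 0;

    -- FORALL binds the whole array into one bulk INSERT, not a row-by-row loop.
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO target_tab VALUES l_rows(i);

    COMMIT;  -- incremental commit after each batch
  END LOOP;
  CLOSE src_cur;
END;
/
```

Note that a FORALL statement stops at the first failing row by default; if you need the batch to continue past bad rows and report them afterwards, look into the SAVE EXCEPTIONS clause.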