I have an enormous 1.9 GB file (more than 650,000 records from the host's database) that I have to import into an Oracle database and then perform calculations on its fields.
I have tried to do it with two methods:
1. Using utl_file to work directly with the file.
2. Using SQL*Loader to import the file into a table and work from that table.
Both work (the second is a bit faster than the first), but they are very slow — about 1,000 policies per minute — so the whole process can take more than 10 hours!
How can I improve the performance of the process?
I am attaching the procedure for the first method and the trigger for the second.
Please post your .ctl file for SQL*Loader. Anyway, since you have a single 2 GB file: use direct path and the skip-index-maintenance option. If it is taking 10 hours, my guess is that you are using the conventional path method and that you have several triggers on the table.
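Since the original .ctl file was not posted, here is a minimal sketch of what a SQL*Loader control file for a delimited host extract might look like. The file name, table name, delimiter, and column names are all placeholders, not taken from the original post:

```
-- Hypothetical control file (load_policies.ctl).
-- Table and column names are placeholders for the poster's actual layout.
LOAD DATA
INFILE 'host_extract.dat'
APPEND
INTO TABLE staging_policies
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
( policy_no,
  holder_name,
  premium      DECIMAL EXTERNAL,
  start_date   DATE "YYYYMMDD"
)
```

Loading into a plain staging table with no triggers and no indexes (or with index maintenance skipped) keeps the load itself close to disk speed.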
It's not the load itself that is slowing you down, it's the per-row processing method you are using. Switching to direct load will not help you at the moment, because a direct-path load will not fire the trigger.
Get rid of the trigger, load the data into your table, then process it in bulk. That is the way that databases run best.
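To illustrate the "process it in bulk" advice: the per-row calculation currently done in the trigger can usually be replaced by one set-based statement run once after the load finishes. The table and column names below are hypothetical stand-ins for the poster's schema:

```sql
-- Hypothetical set-based pass over the loaded staging table:
-- one UPDATE touches all 650,000 rows at once, instead of a
-- trigger firing once per inserted row.
UPDATE staging_policies sp
SET    sp.net_premium = sp.premium * (1 - sp.discount_rate);

-- Or compute while moving the rows into the final table in one statement:
INSERT INTO policies (policy_no, holder_name, net_premium)
SELECT policy_no,
       holder_name,
       premium * (1 - discount_rate)
FROM   staging_policies;

COMMIT;
```

One set-based statement lets Oracle optimize the whole operation (full scans, no context switching per row), which is typically orders of magnitude faster than row-by-row trigger logic.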
Originally posted by balajiyes: Hi, I don't have any clue about your business requirement. When you import, use DIRECT=Y — this is faster than the conventional method.
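For reference, a direct-path SQL*Loader run is requested on the command line. The sketch below uses placeholder credentials and file names; note the caveat raised above that direct path does not fire triggers, so any per-row logic must be moved to a bulk step after the load:

```
# Hypothetical invocation. DIRECT=TRUE bypasses the SQL engine and
# writes formatted blocks directly; SKIP_INDEX_MAINTENANCE=TRUE defers
# index maintenance until after the load.
sqlldr userid=scott/tiger control=load_policies.ctl log=load_policies.log \
       direct=true skip_index_maintenance=true
```

With SKIP_INDEX_MAINTENANCE the affected indexes are left in an unusable state and must be rebuilt after the load completes.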