I have an enormous file of 1.9 GB (more than 650,000 records from a host database) that I have to import into an Oracle database and then perform calculations on its fields.
I have tried to do it with two methods:
1. Using UTL_FILE to work directly with the file.
2. Using SQL*Loader to import the file into a table and work from that table.
Both work (the second one is a bit faster than the first), but they turn out to be very slow (about 1000 policies per minute), so the whole process can take more than 10 hours!
How can I improve the performance of the process?
I attach the procedure used in the first method and the trigger used in the second.
Please post your .ctl file for SQL*Loader. Anyway, you have a single 2 GB file: use a direct path load and the skip-index-maintenance option. If it is taking 10 hours, my guess is that you are using the conventional path method and that you have several triggers and indexes on the table.
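To illustrate, here is a minimal sketch of what the control file and invocation could look like with direct path enabled. The file, table, and column names (policies.dat, POLICIES, policy_id, etc.) and the delimiter are assumptions, since the original .ctl was not posted; adapt them to your actual layout.

```
-- policies.ctl (sketch; table and column names are assumed)
LOAD DATA
INFILE 'policies.dat'
APPEND
INTO TABLE policies
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  policy_id,
  holder_name,
  premium,
  start_date DATE "YYYYMMDD"
)
```

Then run SQL*Loader with direct path and index maintenance deferred (credentials here are placeholders):

```
sqlldr userid=scott/tiger control=policies.ctl log=policies.log \
       direct=true skip_index_maintenance=true errors=1000
```

With skip_index_maintenance=true the affected indexes are left in an UNUSABLE state after the load, so you must rebuild them afterwards; also remember that direct path loads bypass triggers, so any per-row logic in your trigger would have to be run separately after the load.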