Hi - I have a table with 46 million rows in it and I need to transfer that data to two other tables, only my insert statements will need to link to 3 other tables (3.7 mill rows, 3.7 mill rows, 353K rows and 1.7K rows). I'm trying to think of an efficient way to do this (won't take days). Does anyone have any ideas?
I don't know if it will take days or not (that tends to be machine dependent. If you have a Clydesdale or Percheron, it might take hours. If you have a Shetland, it might take weeks.)
Anyway, create a view over the four tables that selects the data to be inserted, then use the SQL*Plus 'Copy' command to create the target table. By using a view, you can encode the differentiation between the two targets and run 'Copy' in parallel with itself. By setting the buffer size appropriately, you won't run into rollback problems, because commits happen in batches.
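A sketch of that approach might look like the following; the view name, table names, and the ARRAYSIZE/COPYCOMMIT values are all placeholders you'd tune for your own schema and machine:

```sql
-- SQL*Plus settings: COPY commits every (COPYCOMMIT x ARRAYSIZE) rows,
-- keeping rollback segment usage bounded (values are assumptions to tune)
SET ARRAYSIZE 5000
SET COPYCOMMIT 10

-- View joining the source to its lookup tables (names are hypothetical);
-- the WHERE clause differentiates the rows bound for this target
CREATE VIEW v_target1_load AS
SELECT s.id, s.amount, l.descr
FROM   source_table s, lookup_table l
WHERE  l.id = s.lookup_id
AND    s.kind = 1;

-- SQL*Plus COPY creates and populates the target from the view
COPY FROM scott/tiger@srcdb -
CREATE target_table1 -
USING SELECT * FROM v_target1_load
```

A second view with `s.kind = 2` feeding a second COPY session would let the two targets load in parallel.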
Joseph R.P. Maloney, CSP,CDP,CCP
'The answer is 42'
Basic steps might look like this:
* Drop all indexes and triggers on the target tables
* Set target tables NOLOGGING
* INSERT /*+ APPEND */ INTO target_table SELECT ... FROM ... (You can use PARALLEL here)
* Create indexes NOLOGGING
* Set table and indexes LOGGING, create triggers
* Backup database
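Put together, the steps above might be sketched as follows; all table, index, and column names are placeholders, and the parallel degree of 8 is an assumption to tune for your hardware:

```sql
-- 1. Drop indexes and triggers, disable redo generation on the target
DROP INDEX target_table_ix1;
DROP TRIGGER target_table_trg;
ALTER TABLE target_table NOLOGGING;

-- 2. Direct-path, parallel insert joining the source to its lookups;
--    COMMIT is required before the table can be queried again
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(t, 8) */ INTO target_table t
SELECT /*+ PARALLEL(s, 8) */ s.id, l1.descr, l2.code
FROM   source_table s, lookup1 l1, lookup2 l2
WHERE  l1.id = s.l1_id
AND    l2.id = s.l2_id;
COMMIT;

-- 3. Rebuild the index without redo, then restore LOGGING everywhere
CREATE INDEX target_table_ix1 ON target_table (id)
  NOLOGGING PARALLEL 8;
ALTER TABLE target_table LOGGING;
ALTER INDEX target_table_ix1 LOGGING NOPARALLEL;
-- 4. Recreate triggers, then take a backup: NOLOGGING operations
--    are not recoverable from the redo stream
```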