inserting large amounts of data Forums - Powered by vBulletin

Thread: inserting large amounts of data

  1. #1
    Join Date
    Apr 2001
    Hi - I have a table with 46 million rows in it, and I need to transfer that data to two other tables, except my insert statements will need to join to four other tables (3.7 million rows, 3.7 million rows, 353K rows, and 1.7K rows). I'm trying to think of an efficient way to do this (one that won't take days). Does anyone have any ideas?

  2. #2
    Join Date
    Apr 2001
    Louisville KY
    I don't know if it will take days or not (that tends to be machine dependent. If you have a Clydesdale or Percheron, it might take hours. If you have a Shetland, it might take weeks.)

    Anyway, create a view over the source and lookup tables with the data to be inserted, then use the 'Copy' command to create the target table. By using a view, you can add the differentiation between the two targets and run the 'Copy' in parallel with itself. By setting the buffer size appropriately, you won't run into rollback problems because of commits.
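
    A rough sketch of that approach in SQL*Plus. All table and column names here (src, dim1-dim3, target_a, v_target_a) are hypothetical, since the posts don't give a schema; SET ARRAYSIZE and SET COPYCOMMIT are the settings that control the buffer/commit behavior mentioned above:

    ```sql
    -- A view joining the source to its lookup tables; the WHERE clause on
    -- s.kind is where you'd put the "differentiation" between the two targets.
    CREATE VIEW v_target_a AS
    SELECT s.id, s.amount, d1.code AS c1, d2.code AS c2, d3.code AS c3
    FROM   src s, dim1 d1, dim2 d2, dim3 d3
    WHERE  s.d1_id = d1.id
    AND    s.d2_id = d2.id
    AND    s.d3_id = d3.id
    AND    s.kind  = 'A';

    -- In SQL*Plus: batch the fetches and commit periodically so the
    -- rollback segments don't blow up on a 46-million-row copy.
    SET ARRAYSIZE 5000       -- rows fetched per batch
    SET COPYCOMMIT 10        -- commit after every 10 batches

    -- COPY continues lines with a trailing hyphen:
    COPY FROM user/password@mydb -
      CREATE target_a -
      USING SELECT * FROM v_target_a
    ```

    Two differently-filtered views would let you run one COPY per target table at the same time.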
    Joseph R.P. Maloney, CSP,CDP,CCP
    'The answer is 42'

  3. #3
    Join Date
    Aug 2001
    You can also use INSERT INTO ... SELECT * FROM ... with NOLOGGING or UNRECOVERABLE, depending on the version of Oracle.
    NOLOGGING (or UNRECOVERABLE on older versions) will not generate redo...
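
    A minimal sketch of that, with hypothetical table names (target_table, source_table); NOLOGGING is set on the target table and takes effect for direct-path operations:

    ```sql
    -- NOLOGGING (UNRECOVERABLE on older Oracle versions) suppresses most
    -- redo generation for direct-path loads into this table.
    ALTER TABLE target_table NOLOGGING;

    -- The APPEND hint requests a direct-path insert.
    INSERT /*+ APPEND */ INTO target_table
    SELECT * FROM source_table;

    COMMIT;
    ```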


  4. #4
    Join Date
    Apr 2001
    Direct-load INSERT will help you.

    Basic steps might look like this:
    * Drop all indexes and triggers on the target tables
    * Set target tables NOLOGGING
    * INSERT /*+ APPEND */ INTO target_table SELECT ... FROM ... (You can use PARALLEL here)
    * Create indexes NOLOGGING
    * Set table and indexes LOGGING, create triggers
    * Backup database
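
    The steps above might be sketched like this. Table, index, and column names (big_src, target_table, target_ix, the degree of parallelism) are all hypothetical placeholders; adjust to your schema and machine:

    ```sql
    -- Step 1: drop indexes and triggers on the target so the load
    -- doesn't maintain them row by row.
    DROP INDEX target_ix;

    -- Step 2: minimal redo for direct-path operations.
    ALTER TABLE target_table NOLOGGING;

    -- Step 3: direct-load insert, optionally in parallel.
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(t, 4) */ INTO target_table t
    SELECT s.id, s.amount FROM big_src s;
    COMMIT;   -- required before the session can query the table again

    -- Step 4: rebuild indexes without redo, also in parallel.
    CREATE INDEX target_ix ON target_table (id) NOLOGGING PARALLEL 4;

    -- Step 5: restore normal logging, then recreate any triggers.
    ALTER TABLE target_table LOGGING;
    ALTER INDEX target_ix LOGGING;

    -- Step 6: take a backup now - NOLOGGING changes cannot be
    -- recovered from the redo/archive logs.
    ```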

    The whole difference between a little boy and an adult man is the price of toys
