
Thread: problems with big table

  1. #11
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    You need to create multiple data files for the tablespace.
    Do not use the autoextend feature; it will slow down the insert process.

    If you use 10g or 9i, do not use AUTO SEGMENT SPACE MANAGEMENT (ASSM). In my test, it gave the worst performance for a large load (20 million rows inserted by 20 threads). There are also some bugs reported with ASSM.

    Use a locally managed tablespace (LMT) with a uniform extent size and MANUAL SEGMENT SPACE MANAGEMENT, as in the sketch below.
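    For example, a minimal sketch (the tablespace name, file paths, and sizes are made up; adjust to your storage layout):

        -- Locally managed tablespace: uniform extents, manual segment
        -- space management, multiple fixed-size datafiles, autoextend off.
        CREATE TABLESPACE big_data
          DATAFILE '/u01/oradata/big_data01.dbf' SIZE 2000M AUTOEXTEND OFF,
                   '/u02/oradata/big_data02.dbf' SIZE 2000M AUTOEXTEND OFF,
                   '/u03/oradata/big_data03.dbf' SIZE 2000M AUTOEXTEND OFF
          EXTENT MANAGEMENT LOCAL UNIFORM SIZE 10M
          SEGMENT SPACE MANAGEMENT MANUAL;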

    Also, increase FREELISTS for the table. Start with 7 and slowly raise it toward 13, depending on the number of concurrent inserts on the table; see the example after this paragraph.
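    Something like the following (the table name is hypothetical):

        -- Raise FREELISTS on the table; increase further as the number
        -- of concurrently inserting sessions grows.
        ALTER TABLE big_tab STORAGE (FREELISTS 7);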

    Tamil

  2. #12
    Join Date
    Mar 2002
    Posts
    534
    Hi,

    If the "old method" doesn't work for any reason, then instead of deleting the 3M rows you can create a new table via CTAS (CREATE TABLE ... AS SELECT) that excludes those 3M rows. It may be faster to create a table with 12 x 3M rows (12, assuming you keep one year of data) than to delete 3M rows. A rough sketch follows.
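    For instance (the table names, date column, and retention condition are made up; adapt to your schema):

        -- Copy only the rows you keep; the 3M old rows are never touched.
        CREATE TABLE big_tab_new
          NOLOGGING
          AS SELECT *
             FROM big_tab
             WHERE load_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -11);

        -- Swap the tables; rebuild indexes, constraints and grants afterwards.
        RENAME big_tab TO big_tab_old;
        RENAME big_tab_new TO big_tab;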

    good luck
    Mike
