DBAsupport.com Forums - Powered by vBulletin
Results 1 to 7 of 7

Thread: Tuning delete

  1. #1
    Join Date
    Nov 2000
    Posts
    48
    Hi Everyone.

    I need urgent help from you guys.

    I am trying to delete from one of our tables, and the job is expected to delete millions of rows every time.
    The SQL I am using is below.

    delete from xyz where time_key
    in (select time_key from dss_dims_time where period_year = 2000) AND rownum < 10001

    I am deleting 10,000 rows at a time in order to avoid blowing up the rollback segment.
    I am also thinking of writing a PL/SQL procedure that opens a cursor over all rowids matching the condition, deletes them one by one in a loop, and checks the count; when the count reaches 10,000 it commits. Please let me know your ideas and suggestions as soon as possible.
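    A simpler variant of the batched delete above, looping the rownum-limited statement itself rather than fetching rowids, might look like the sketch below (table and column names are the ones from the post; the batch size is the same 10,000):

    ```sql
    -- Sketch only: repeat the rownum-limited delete until no rows remain,
    -- committing after each batch to keep rollback usage bounded.
    BEGIN
      LOOP
        DELETE FROM xyz
         WHERE time_key IN (SELECT time_key
                              FROM dss_dims_time
                             WHERE period_year = 2000)
           AND ROWNUM <= 10000;
        EXIT WHEN SQL%ROWCOUNT = 0;  -- nothing left to delete
        COMMIT;
      END LOOP;
      COMMIT;
    END;
    /
    ```

    This avoids managing a rowid cursor and per-row deletes entirely; each pass deletes up to one batch in a single statement.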

    Thanks in advance.

    Rajesh

  2. #2
    Join Date
    Jun 2000
    Posts
    417
    Is there another field you could key your deletes off of? For example a time field where you could delete maybe 1 day at a time instead of a whole month, or something of that nature?

    You have the period year, but I suppose if they're all the same it won't help.

    Actually the rownum idea should work ok either way though.

  3. #3
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    Instead of deleting millions of rows (assuming, say, 70% of the rows are going to be deleted), why don't you copy the required rows (the other 30%) into another table (temp), then truncate the original table, and later insert the rows back from the new table (temp)?

    This could be faster.
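    Using the names from the thread, the copy-and-truncate approach could be sketched as below (the `xyz_keep` table name and the `APPEND` hint for a direct-path insert are illustrative choices, not from the post; note also that `NOT IN` behaves unexpectedly if the subquery can return NULLs):

    ```sql
    -- Keep only the rows that should survive.
    CREATE TABLE xyz_keep AS
      SELECT * FROM xyz
       WHERE time_key NOT IN (SELECT time_key
                                FROM dss_dims_time
                               WHERE period_year = 2000);

    -- Truncate generates almost no undo, unlike a mass delete.
    TRUNCATE TABLE xyz;

    -- Put the kept rows back.
    INSERT /*+ APPEND */ INTO xyz SELECT * FROM xyz_keep;
    COMMIT;

    DROP TABLE xyz_keep;
    ```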

  4. #4
    Join Date
    Nov 2000
    Posts
    48
    Thanks for all the suggestions. We are seriously considering the second option: creating a temp table, inserting the records we don't want to delete into it, truncating the original table, and then doing the insert back.

    I tried the PL/SQL procedure that I mentioned earlier, and it really worked well. We could reduce the delete time from 1.5 hrs to 12 mins. That was an incredible gain in performance.

    Pl let me know if anybody has any more suggestions.

    Thanks once again.

    Rajesh

  5. #5
    Join Date
    Sep 2000
    Posts
    384
    If the table is very big, partition it. Be careful in selecting the key column on which you create the partitioned table.

    That way you can split the job into 1/6th to 1/10th of the work.

    There is also the option of truncating a partition.

    Try them.
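    A range-partitioned layout for this case might look like the sketch below. It assumes `time_key` is a numeric YYYYMMDD-style key, which is an assumption, not something stated in the thread; the table and partition names are placeholders:

    ```sql
    -- Illustrative only: partition by year so a year's data maps to one partition.
    CREATE TABLE xyz_part (
      time_key NUMBER,
      data_col VARCHAR2(100)   -- placeholder for the real columns of xyz
    )
    PARTITION BY RANGE (time_key) (
      PARTITION p_1999 VALUES LESS THAN (20000101),
      PARTITION p_2000 VALUES LESS THAN (20010101),
      PARTITION p_max  VALUES LESS THAN (MAXVALUE)
    );

    -- Removing a year of data then becomes a near-instant dictionary operation:
    ALTER TABLE xyz_part TRUNCATE PARTITION p_2000;
    ```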
    Radhakrishnan.M

  6. #6
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    You can try export with the query option, in which you export only the required rows, using the direct option.

    Truncate the table.

    Import the data back into the table.
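    The export/truncate/import path could be sketched as below. Credentials and file names are placeholders, the shell escaping of the QUERY clause varies by platform, and note that the classic `exp` utility does not allow QUERY together with DIRECT=Y, so the filtered export would run conventional-path:

    ```shell
    # Sketch only: placeholder credentials and file names.
    # Export only the rows to keep (conventional path; QUERY is
    # incompatible with DIRECT=Y in classic exp).
    exp scott/tiger tables=xyz file=xyz_keep.dmp \
        query=\"where time_key not in \(select time_key from dss_dims_time where period_year = 2000\)\"

    # Truncate the original table.
    echo "TRUNCATE TABLE xyz;" | sqlplus -s scott/tiger

    # Load the kept rows back in.
    imp scott/tiger tables=xyz file=xyz_keep.dmp ignore=y
    ```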

  7. #7
    Join Date
    Sep 2000
    Posts
    384
    Tamil, query exports take a long time to export ...

    Any suggestions?
    Radhakrishnan.M
