best way to delete large number of rows - Page 3
DBAsupport.com Forums - Powered by vBulletin

Thread: best way to delete large number of rows

  1. #21
    Join Date
    Nov 2002
    Location
    Geneva Switzerland
    Posts
    3,142
For efficiency, n should be as big as possible without blowing the rollback segment (RBS).
    "The power of instruction is seldom of much efficacy except in those happy dispositions where it is almost superfluous" - Gibbon, quoted by R.P.Feynman
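A minimal sketch of the batched delete being discussed, assuming an illustrative table `my_table`, an illustrative delete condition, and a batch size n of 50,000; each COMMIT frees rollback space before the next pass:

```sql
DECLARE
  n CONSTANT PLS_INTEGER := 50000;  -- batch size: as big as the RBS allows
BEGIN
  LOOP
    DELETE FROM my_table
     WHERE created < SYSDATE - 90   -- illustrative delete condition
       AND ROWNUM <= n;
    EXIT WHEN SQL%ROWCOUNT = 0;     -- nothing left to delete
    COMMIT;                         -- release rollback space per batch
  END LOOP;
  COMMIT;
END;
/
```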

  2. #22
    Join Date
    Sep 2001
    Location
    Düsseldorf, Germany.
    Posts
    588
Or would it be better to transfer the required data to a temporary table and back again?

    Code:
    SQL> SET TIMING ON
    SQL> CREATE TABLE my_table as SELECT * FROM all_objects;
    
    Table created.
    
    Elapsed: 00:00:09.86
    SQL> INSERT INTO my_table SELECT * FROM my_table;
    
    24700 rows created.
    
    Elapsed: 00:00:03.38
    SQL> /
    
    49400 rows created.
    
    Elapsed: 00:00:08.07
    SQL> /
    
    98800 rows created.
    
    Elapsed: 00:00:13.14
    SQL> /
    
    197600 rows created.
    
    Elapsed: 00:00:20.69
    SQL>  SELECT COUNT(1) FROM my_table;
    
      COUNT(1)
    ----------
        395200
    
    Elapsed: 00:00:03.85
    SQL> CREATE TABLE my_table2 AS SELECT * FROM my_table WHERE ROWNUM < 30000;
    
    Table created.
    
    Elapsed: 00:00:02.84
    SQL> 
    SQL> TRUNCATE TABLE my_table;
    
    Table truncated.
    
    Elapsed: 00:00:00.61
    SQL> INSERT INTO my_table SELECT * FROM my_table2;
    
    29999 rows created.
    
    Elapsed: 00:00:02.80
    SQL> DROP TABLE my_table2;
    
    Table dropped.
    
    Elapsed: 00:00:02.64
    SQL> 
    SQL> SELECT COUNT(1) FROM my_table;
    
      COUNT(1)
    ----------
         29999
    
    Elapsed: 00:00:00.85
    SQL>

  3. #23
    Join Date
    Jan 2003
    Location
    hyderabad
    Posts
    1
    Originally posted by abhaysk
SELECT COUNT(*) INTO v_total FROM your_table WHERE <your condition>;

v_loops := ROUND(v_total / <max records you think you can delete>) + 1;

FOR cnt IN 1 .. v_loops LOOP

DELETE with your condition and with ROWNUM < <max records you think you can delete> + 1.

END LOOP;

    Regards
    Abhay.
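The pseudocode quoted above can be written as a runnable PL/SQL block; the table name, condition, and batch size below are illustrative:

```sql
DECLARE
  batch_size CONSTANT PLS_INTEGER := 50000;   -- max records per pass
  v_total    PLS_INTEGER;
  v_loops    PLS_INTEGER;
BEGIN
  SELECT COUNT(*) INTO v_total
    FROM my_table
   WHERE status = 'OLD';                      -- illustrative condition

  v_loops := ROUND(v_total / batch_size) + 1;

  FOR cnt IN 1 .. v_loops LOOP
    DELETE FROM my_table
     WHERE status = 'OLD'
       AND ROWNUM < batch_size + 1;
    COMMIT;                                   -- free rollback per batch
  END LOOP;
END;
/
```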

  4. #24
    Join Date
    Dec 2002
    Location
    Bangalore ( India )
    Posts
    2,434
    Dapi ;

Using cursors for deletion is a bad idea; it degrades performance even more. Moreover, you want to commit the transaction after 50,000 records, and that is prone to "snapshot too old", as JMODIC suggested.

I think deletion using ROWNUM in a loop is, I would say, better than using cursors.

If the number of records to delete is greater than the number the table has to retain, it is better to insert the data you want to keep into a TEMP table, truncate the current table, and insert the records back from the TEMP table.

    Abhay.
    funky...

    "I Dont Want To Follow A Path, I would Rather Go Where There Is No Path And Leave A Trail."

    "Ego is the worst thing many have, try to overcome it & you will be the best, if not good, person on this earth"

  5. #25
    Join Date
    Nov 2002
    Location
    Geneva Switzerland
    Posts
    3,142
    Abhay,

    My method with a cursor was born of ignorance - I did not know that I shouldn't do it!
    BUT experience shows: (a) it is faster (b) it works.
    "The power of instruction is seldom of much efficacy except in those happy dispositions where it is almost superfluous" - Gibbon, quoted by R.P.Feynman

  6. #26
    Join Date
    Dec 2002
    Location
    Bangalore ( India )
    Posts
    2,434
    Dapi:

Be logical:
a plain SQL statement will be much faster than a PL/SQL block.

To give you an example: in our data-warehouse project we delete about 10 million records a day, and previously the deletion was done with cursors. It took a long time, I would say far too long, to delete.

We then changed the script to what I suggested in my post, and it ran much faster than it did with cursors. Moreover, with cursors and a commit every 5,000 records we hit "snapshot too old" errors.

    Abhay.
    funky...

    "I Dont Want To Follow A Path, I would Rather Go Where There Is No Path And Leave A Trail."

    "Ego is the worst thing many have, try to overcome it & you will be the best, if not good, person on this earth"

  7. #27
    Join Date
    Nov 2002
    Location
    Geneva Switzerland
    Posts
    3,142
    Originally posted by abhaysk
    Be logical......
    SQL Statement will be much faster than PL/SQL block execution...
    I guess my watch must be broken . . . .
    "The power of instruction is seldom of much efficacy except in those happy dispositions where it is almost superfluous" - Gibbon, quoted by R.P.Feynman

  8. #28
    Join Date
    Jun 2000
    Location
    Madrid, Spain
    Posts
    7,448
In general SQL will be faster; however, there are of course exceptions.

  9. #29
    Join Date
    Jan 2001
    Posts
    59
    Originally posted by stmontgo
    create a new table,
    insert as select into the new table from the old,
    backup the original table,
    drop the original table
    rename the new table to the old table,
    rebuild indexes grants etc on the new table

RBS is the least of your concerns; mass deletes
can be a serious performance hit, as such an action
does not reset the high-water mark, and subsequent
scans will scan up to the high-water mark even though
those blocks may be empty.

    steve

    How do I backup the original table? Can you give me a sample code for the above tasks?

    Thanks!
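The steps quoted above can be sketched as follows; all table, index, column, and user names are illustrative, and the "backup" here is done by renaming the original rather than dropping it (exporting with exp/expdp is another option):

```sql
-- 1. Create the new table with only the rows you want to keep.
CREATE TABLE my_table_new AS
  SELECT * FROM my_table
   WHERE status <> 'OLD';              -- illustrative keep condition

-- 2. "Backup" the original by renaming it instead of dropping it outright.
ALTER TABLE my_table RENAME TO my_table_backup;

-- 3. Put the new table in place of the old one.
ALTER TABLE my_table_new RENAME TO my_table;

-- 4. Re-create indexes, grants, constraints, etc. on the new table.
CREATE INDEX my_table_ix1 ON my_table (status);
GRANT SELECT ON my_table TO report_user;

-- 5. Once everything is verified, drop the backup copy.
DROP TABLE my_table_backup;
```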
