Deleting Large No. of Rows
DBAsupport.com Forums - Powered by vBulletin
Results 1 to 10 of 17

Thread: Deleting Large No. of Rows

  1. #1
    Join Date
    Jan 2001
    Location
    St. Louis, MO
    Posts
    150


    Guys,

    I need to delete about 1M+ rows of data every day from a single table. Using a 'delete from table where.....' statement will take some time, and the TRUNCATE command will delete all the rows in the table.

    Can you give me a suggestion on how to implement this in the least possible time? Is there a 'truncate from table where.....' command?


    Ed

  2. #2
    Join Date
    Nov 2000
    Posts
    205
    To my knowledge, there is no way to use TRUNCATE with a WHERE clause.

    Good luck,
    Nirasha

  3. #3
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,092
    If this is an ongoing issue, you may want to look at partitioned tables. You can truncate a single partition of a table.
    Jeff Hunter
    marist89@yahoo.com
    http://marist89.blogspot.com/
    Get Firefox!
    "I pledge to stop eating sharks fin soup and will not do so under any circumstances."
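    A minimal sketch of what Jeff is suggesting, assuming Oracle 8i range partitioning; the table, column, and partition names here are made up for illustration:

    ```sql
    -- Hypothetical table partitioned by day (names are illustrative only).
    CREATE TABLE daily_feed (
        feed_date  DATE NOT NULL,
        payload    VARCHAR2(200)
    )
    PARTITION BY RANGE (feed_date) (
        PARTITION p20010101 VALUES LESS THAN (TO_DATE('2001-01-02','YYYY-MM-DD')),
        PARTITION p20010102 VALUES LESS THAN (TO_DATE('2001-01-03','YYYY-MM-DD'))
    );

    -- Removing a day's worth of rows is then a fast DDL operation, not a DELETE:
    ALTER TABLE daily_feed TRUNCATE PARTITION p20010101;
    -- or, to remove the partition definition as well:
    ALTER TABLE daily_feed DROP PARTITION p20010101;
    ```

    Truncating or dropping a partition only touches the data dictionary and the partition's segment, so it avoids the per-row undo and redo that makes a large DELETE slow.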

  4. #4
    Join Date
    Feb 2001
    Posts
    389
    Partitioning by day means a lot of maintenance, and if you have dependent tables on this table, a lot more work.
    Why not write a procedure and delete?


  5. #5
    Join Date
    Jan 2001
    Location
    St. Louis, MO
    Posts
    150
    Deleting 1M+ rows whether by procedure or through any other application will consume a considerable amount of time.

    Using truncate will delete all rows in the table, and I don't want to do that.

    Here is the scenario.
    Daily I am receiving 1M+ inserts into the table. I just need to keep one month's worth of data, so I need to do a daily delete on that table.

    Got any other idea?

    Thanks,

    Ed

  6. #6
    Join Date
    Aug 2000
    Posts
    29
    Well, if you have 8i and are not worried about recovery,
    you can use the NOLOGGING option. I use this all the
    time and it cuts delete time substantially.
    scotta
    J.Scott Adams

  7. #7
    Join Date
    Apr 2000
    Location
    roma
    Posts
    131
    You can go for exp/imp (if your version is 8i) using the QUERY option (a SELECT WHERE clause used to export a subset of a table). You can then import the table with just the subset of records you want to keep (or import into a test table and, at the end, rename it to the old name).
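    For reference, the 8i export QUERY parameter looks something like this. The user, table, column, and file names are hypothetical, and putting QUERY in a parameter file avoids the shell-quoting problems of passing it on the command line:

    ```
    # keep.par -- parameter file for exp (names are illustrative only)
    tables=big_table
    query="where load_date > sysdate - 30"
    file=keep.dmp

    # then run:
    #   exp scott/tiger parfile=keep.par
    ```

    After exporting the rows to keep, you would truncate (or drop and recreate) the table and import the subset back in.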

  8. #8
    Join Date
    Dec 2000
    Location
    Ljubljana, Slovenia
    Posts
    4,439
    Originally posted by edsperez
    Here is the scenario.
    Daily I am receiving 1M+ inserts into the table. I just need to keep one month's worth of data, so I need to do a daily delete on that table.
    This is the perfect scenario for using partitions, as Jeff already mentioned.

    NOLOGGING doesn't help with deletes at all.
    Jurij Modic
    ASCII a stupid question, get a stupid ANSI
    24 hours in a day .... 24 beer in a case .... coincidence?

  9. #9
    Join Date
    Nov 2000
    Location
    London
    Posts
    83
    As well as NOLOGGING, I use DELETE_COMMIT (while logged in as SYS) with the COMMIT interval set at 100,000 rows or so. Here's an example:

    EXEC DELETE_COMMIT('delete /*+ INDEX(tablename indexname) */ from tablename where column=constraint', 100000);

    This will then perform a COMMIT at the set interval, which prevents the rollback segments from going loopy, as well as saving some time.

    Fiona
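    DELETE_COMMIT appears to be a locally written procedure rather than anything Oracle supplies, but the batched-delete idea behind it can be sketched in plain PL/SQL. This is a minimal sketch with hypothetical table and column names:

    ```sql
    -- Delete old rows in batches, committing between batches so no single
    -- transaction holds a huge amount of rollback. Names are illustrative.
    DECLARE
        v_batch CONSTANT PLS_INTEGER := 100000;  -- rows per transaction
        v_rows  PLS_INTEGER;
    BEGIN
        LOOP
            DELETE FROM big_table
             WHERE load_date < SYSDATE - 30
               AND ROWNUM <= v_batch;
            v_rows := SQL%ROWCOUNT;   -- rows deleted by this batch
            COMMIT;
            EXIT WHEN v_rows = 0;     -- nothing left to delete
        END LOOP;
    END;
    /
    ```

    Capturing SQL%ROWCOUNT into a variable before the COMMIT keeps the loop's exit test independent of whatever the COMMIT does to the cursor attributes.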

  10. #10
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    It is better to change the design. Add a column, deleted_flag CHAR(1), to the table. Daily, update this column to 'Y' for the rows you want to delete. On Sunday, or whenever there is low activity on the database, delete the 7M rows after changing the table to NOLOGGING. Please also remember that the indexes associated with the table will need to be rebuilt.
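    A minimal sketch of the flag-and-purge design described above, again with made-up table, column, and index names:

    ```sql
    -- One-time change: add the flag column (names are illustrative only).
    ALTER TABLE big_table ADD (deleted_flag CHAR(1) DEFAULT 'N');

    -- Daily: mark rows that have aged out instead of deleting them.
    UPDATE big_table
       SET deleted_flag = 'Y'
     WHERE load_date < SYSDATE - 30;

    -- Weekly, during low activity: purge the flagged rows in one pass,
    -- then rebuild the table's indexes.
    DELETE FROM big_table WHERE deleted_flag = 'Y';
    ALTER INDEX big_table_ix REBUILD;
    ```

    The trade-off is that the daily UPDATE still touches every aging row, but the expensive DELETE and index rebuild are deferred to a quiet window.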
