DBAsupport.com Forums

Thread: Very big Update

  1. #1
    Join Date
    Aug 2000
    Location
    Ny
    Posts
    105
    I have a table with about 2,000,000 records in it, and I need to update the column SEC_FLAG to 'Y' for all the duplicate values of IP, NAME, DATE_CLICK. I tried the query below, and I also tried using collections in PL/SQL, but it takes a very, very long time. Any ideas or suggestions?

    Thanks.

    UPDATE MY_TABLE a
       SET a.sec_flag = 'Y'
     WHERE a.ROWID > (SELECT MIN(b.ROWID)
                        FROM MY_TABLE b
                       WHERE a.ip = b.ip
                         AND a.name = b.name
                         AND a.date_click = b.date_click);


  2. #2
    Join Date
    Jun 2000
    Location
    Conway,AR,USA
    Posts
    29
    You can do that in two ways.
    1. Set a counter, run the update row by row, and increment the counter after each update, committing every 2000 rows. Run the entire process in a loop inside a PL/SQL block (see the sketch after this list).

    2. Ask the DBA to create a large rollback segment (say big_rbs) specifically for this update, then set that rollback segment (big_rbs) for this particular transaction (also sketched below).
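
    A minimal sketch of option 1, assuming the table and key columns from the original post (MY_TABLE, ip, name, date_click); the cursor, the variable names, and the 2000-row batch size are illustrative only:

    DECLARE
        v_count PLS_INTEGER := 0;
    BEGIN
        -- drive off the same duplicate-finding logic as the original query
        FOR r IN (SELECT a.ROWID rid
                    FROM MY_TABLE a
                   WHERE a.ROWID > (SELECT MIN(b.ROWID)
                                      FROM MY_TABLE b
                                     WHERE a.ip = b.ip
                                       AND a.name = b.name
                                       AND a.date_click = b.date_click))
        LOOP
            UPDATE MY_TABLE SET sec_flag = 'Y' WHERE ROWID = r.rid;
            v_count := v_count + 1;
            IF MOD(v_count, 2000) = 0 THEN
                COMMIT;  -- keep each transaction (and its undo) small
            END IF;
        END LOOP;
        COMMIT;  -- commit the final partial batch
    END;
    /

    Note that committing while the driving cursor is still open can itself raise ORA-01555 (snapshot too old) on a busy table, so the rollback segments still need to be reasonably sized.

    For option 2, the rollback segment is chosen at the start of the transaction (big_rbs is just the name used above):

    SET TRANSACTION USE ROLLBACK SEGMENT big_rbs;

    UPDATE MY_TABLE a
       SET a.sec_flag = 'Y'
     WHERE a.ROWID > (SELECT MIN(b.ROWID)
                        FROM MY_TABLE b
                       WHERE a.ip = b.ip
                         AND a.name = b.name
                         AND a.date_click = b.date_click);
    COMMIT;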
    Soumya
    still learning

  3. #3
    Join Date
    Oct 2000
    Posts
    123
    Try this:

    Create an index on the columns in the WHERE clause (ip, name, date_click); it will speed things up a lot. I have done that before.
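
    For example, a composite index on the three key columns (the index name my_table_dup_idx is just an illustration, not from the thread):

    CREATE INDEX my_table_dup_idx ON MY_TABLE (ip, name, date_click);

    This lets the correlated subquery find MIN(ROWID) for each (ip, name, date_click) group without scanning the whole table for every row.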

    Take care

  4. #4
    Join Date
    Feb 2001
    Posts
    103
    Write a PL/SQL block. Count the number of records per key and update the records whose count is greater than 1. I have tried it; it works.
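
    A sketch of that idea, assuming the same table and columns as post #1; note that, unlike the MIN(ROWID) query above, this flags every row of a duplicate group rather than leaving the first occurrence unflagged:

    BEGIN
        UPDATE MY_TABLE a
           SET a.sec_flag = 'Y'
         WHERE (a.ip, a.name, a.date_click) IN
               (SELECT ip, name, date_click
                  FROM MY_TABLE
                 GROUP BY ip, name, date_click
                HAVING COUNT(*) > 1);
        COMMIT;
    END;
    /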
    When the going gets tough, the tough gets going
