Need your expertise to avoid heavy rollback. Also, I don't want to impact online users.
We have a table with more than 10 million rows, and every month we need to scan all of them to find the records that qualify on certain criteria. I will not be updating this table, only writing the qualified IDs into a different table to be processed further down the line. Apart from the primary key (NUMBER(8)) I don't have any other field to separate the data into groups. The processing can take 6-10 hours (rough guess). What is the best way to read and process all the records? One way (maybe not efficient) I can think of is to break the PK IDs into ranges, store those ranges in another table created for that purpose, and then process each range one after another, so that I am not holding all 10M records in one cursor. A rough sketch of what I mean is below.
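
To make it concrete, here is a minimal sketch of the range-by-range idea, assuming Oracle PL/SQL. All the names (big_table, its PK column id, the target table qualified_ids, and the status = 'ELIGIBLE' predicate) are made-up placeholders, not our real schema:

DECLARE
  l_lo    big_table.id%TYPE;
  l_max   big_table.id%TYPE;
  c_chunk CONSTANT PLS_INTEGER := 100000;  -- IDs per range; would need tuning
BEGIN
  SELECT MIN(id), MAX(id) INTO l_lo, l_max FROM big_table;

  WHILE l_lo <= l_max LOOP
    -- Handle one PK range per transaction, so undo/rollback stays small
    -- and online users only ever see short-lived transactions.
    INSERT INTO qualified_ids (id)
      SELECT id
        FROM big_table
       WHERE id BETWEEN l_lo AND l_lo + c_chunk - 1
         AND status = 'ELIGIBLE';  -- placeholder for the real criteria
    COMMIT;
    l_lo := l_lo + c_chunk;
  END LOOP;
END;
/

(If the IDs are sparse, each range would return fewer than 100,000 rows, but the loop still covers the whole key space.)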

Is this the only way, or is there a more efficient way to handle it?

Thanks a lot in advance; I appreciate your help.