-
Best approach for table refreshes
I need to refresh a couple of tables (not all of them) in a schema into a different schema on a different database, say on a weekly basis.
These tables have referential integrity constraints.
What would be the best approach to do this?
1. Write a shell script to export the 40 tables, and another on the target machine to ftp the file and import it.
2. Write the data to flat files, ftp them, and write a program to read the files and import them.
3. Write a PL/SQL program using database links to refresh the tables periodically.
Can somebody suggest the best approach for this?
Last edited by bang_dba; 06-30-2006 at 07:00 AM.
-
What are your requirements for speed etc.? They will all work.
You can add MVs (materialized views) into the mix as well if you wish.
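For what it's worth, the MV route would look roughly like this: a complete-refresh materialized view on the target that repopulates itself weekly over a database link. The link name (src_link), table (emp), and connect string are placeholders, not anything from this thread; the script just writes the DDL out so you can eyeball it before running it.

```shell
#!/bin/sh
# Sketch only: generate the DDL for a weekly complete-refresh MV
# pulling from the source database over a db link.
cat > mv_weekly.sql <<'EOF'
CREATE MATERIALIZED VIEW emp_mv
  BUILD IMMEDIATE
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 7   -- re-run every 7 days
  AS SELECT * FROM emp@src_link;
EOF
echo "Run as: sqlplus tgt_user/***@TGTDB @mv_weekly.sql"
```

One MV per table; with REFRESH COMPLETE you don't need MV logs on the source, at the cost of re-copying everything each week.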
-
Some things to keep in mind:
How big are the tables? Are there time constraints for the refresh?
Option 1:
If the tables are relatively small, export/import would probably work well. This has the advantage of being quick and easy to implement.
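A dry-run sketch of that cycle, assuming the classic exp/imp utilities; the usernames, TNS names, and two-table list are placeholders (the real job would list all ~40 tables):

```shell
#!/bin/sh
# Sketch of the weekly export -> ftp -> import route (commands are
# echoed, not executed, so you can review them first).
TABLES="DEPT,EMP"                 # the ~40 real tables would go here

# Parameter file for the source-side export; consistent=y makes the
# dump a single read-consistent snapshot across all the tables.
cat > exp_weekly.par <<EOF
tables=($TABLES)
file=weekly_refresh.dmp
consistent=y
EOF

echo "Source : exp src_user/***@SRCDB parfile=exp_weekly.par"
echo "Then   : ftp weekly_refresh.dmp to the target host (binary mode)"
# ignore=y reuses the pre-created tables (with their FK constraints);
# disable/defer the FKs during import, or load parents before children.
echo "Target : imp tgt_user/***@TGTDB fromuser=src_user touser=tgt_user file=weekly_refresh.dmp ignore=y"
```

The consistent=y flag matters here because of the referential integrity: without it each table is exported at a different point in time.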
Option 2:
You'll either need to write custom code to extract the data, or buy a tool; IxUnload can be purchased reasonably cheaply to do this. You might wish to consider this option, especially for large objects: you'll be able to do a direct-path load of the data, which is extremely quick.
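On the load side, the direct path piece would typically be SQL*Loader. A minimal sketch, with a made-up table and pipe-delimited flat file (emp.dat is whatever your extract step produced):

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for one table and show
# the direct-path invocation. Table and columns are illustrative.
cat > emp.ctl <<'EOF'
LOAD DATA
INFILE 'emp.dat'
TRUNCATE
INTO TABLE emp
FIELDS TERMINATED BY '|'
(empno, ename, deptno)
EOF

# direct=true bypasses conventional inserts, which is where the speed
# comes from; note direct path leaves FK constraints to be re-validated.
echo "sqlldr tgt_user/***@TGTDB control=emp.ctl direct=true"
```

You'd have one control file per table, driven from the same script.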
Option 3:
Doable, but you'll need some mechanism to identify the modified data, and that smells a lot like replication. Or, if you're thinking of a complete refresh, the source data needs to be read-consistent across tables, especially if you're planning on committing between tables.
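For the complete-refresh flavour of option 3, the PL/SQL would be along these lines (db link name src_link and the dept/emp pair are placeholders; the script just writes the block to a file). Doing it all in one local transaction with a single commit keeps the target consistent, but note each remote SELECT still gets its own snapshot, so strict cross-table consistency on the source needs the source quiet during the refresh:

```shell
#!/bin/sh
# Sketch: generate a PL/SQL block that fully refreshes two related
# tables over a database link in one transaction.
cat > refresh.sql <<'EOF'
BEGIN
  -- delete children before parents, insert parents before children,
  -- so the FK between emp and dept is never violated
  DELETE FROM emp;
  DELETE FROM dept;
  INSERT INTO dept SELECT * FROM dept@src_link;
  INSERT INTO emp  SELECT * FROM emp@src_link;
  COMMIT;  -- single commit: target is never seen half-refreshed
END;
/
EOF
echo "Schedule weekly via DBMS_JOB or cron: sqlplus tgt_user/***@TGTDB @refresh.sql"
```
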
Ken