Bulk data update
A bulk data update ran against one dev database. It had 20 million rows to update, so it generated a very large volume of archive logs. We use SQL BackTrack to back up the archive logs to Tivoli and then remove them,
but the backup process could not keep pace with the rate of archive log generation.
It ended up filling the archive destination, and all the databases on that server hung.
What is a good solution for this scenario?
Tune the code to produce less redo.
Get a bigger filesystem to hold the archive logs.
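One common way to cut redo for a big one-off change on a dev copy is to rebuild the table with a direct-path CTAS instead of a fully logged UPDATE. This is only a sketch: the table and column names are made up, NOLOGGING skips redo only for direct-path operations (and only if the database/tablespace allows it), and the new rows would not be recoverable from the archive logs.

```sql
-- Hypothetical example: instead of updating 20M rows in place (fully logged),
-- build a new copy of the table with the change applied, using a
-- direct-path CREATE TABLE ... AS SELECT with NOLOGGING to minimize redo.
CREATE TABLE big_table_new NOLOGGING AS
SELECT col1,
       UPPER(col2) AS col2   -- the bulk "update" applied during the copy
FROM   big_table;

-- Then swap the tables. Indexes, grants, and constraints on the old
-- table must be recreated on the new one.
DROP TABLE big_table;
RENAME big_table_new TO big_table;
```

After a NOLOGGING operation the affected blocks are unrecoverable from backups taken before it, which is usually acceptable on a dev database but worth stating explicitly.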
As it's a dev database, you could take it out of archive log mode. That kind of defeats the object of testing the code, though.
I am thinking of taking this dev database out of archivelog mode.
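For reference, switching to NOARCHIVELOG mode requires a clean shutdown and a mount-state restart. A minimal SQL*Plus sketch (assuming you have SYSDBA access and have taken any backup you care about first, since media recovery is no longer possible afterwards):

```sql
-- Sketch: take the database out of archivelog mode.
SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE NOARCHIVELOG;
ALTER DATABASE OPEN;

-- Confirm the change; should report "No Archive Mode".
ARCHIVE LOG LIST;
```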