DBAsupport.com Forums

Thread: Archive log files growing

  1. #1
    Join Date
    Apr 2001
    Posts
    126

    Archive log files growing

    I enabled archive logging in my database and it keeps adding files.
    Soon I will run out of space.

    Can I delete some of those files (i.e. the older, dated ones)?



    Thanks in advance.

  2. #2
    Join Date
    Sep 2000
    Location
    Chennai, India
    Posts
    865
    Back up your archived log files to tape/drive, verify the backup, and then go ahead and delete them.
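    A rough sketch of that backup/verify/delete cycle might look something like this (the archive directory, file suffix and tape device are only examples - adjust them for your site):

        #!/bin/sh
        # copy archived redo logs to tape, read the tape back as a crude
        # verification, and only then remove the logs from disk
        ARCH_DIR=/u01/oradata/ARCH            # your log_archive_dest
        TAPE=/dev/rmt/0                       # your tape device

        cd $ARCH_DIR || exit 1
        tar cvf $TAPE *.arc || exit 1         # back up to tape
        tar tvf $TAPE > /dev/null || exit 1   # check the tape reads back
        rm -f *.arc                           # safe to delete now

    (A real script would also need to skip the log Oracle is still writing - see the later posts in this thread.)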

    Alternatively:
    - shutdown
    - take offline backup
    - verify backup
    - delete archive log files
    - startup

    HTH.

  3. #3
    Join Date
    Apr 2001
    Posts
    126
    1. Do you mean I should delete the archive log files directly after backing up / shutting down the DB?


    2. If the DB crashes, how do I recover the DB using the archive log files?

  4. #4
    Join Date
    Sep 2000
    Location
    Chennai, India
    Posts
    865
    Originally posted by fordikon
    1. Do you mean I should delete the archive log files directly after backing up / shutting down the DB?
    What I meant is to
    - first back up the archived log files to tape
    - verify the validity of the backup
    - delete the archive log files

    The above can be done while the database is online. No need to shut down.

    2. If the DB crashes, how do I recover the DB using the archive log files?
    Restore the archived log files from tape to the DB server.
    Then perform the recovery.
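    As a minimal sketch (assuming a complete recovery of a mounted database after restoring the lost datafiles and the archived logs from tape - names and paths will differ on your system):

        $ sqlplus /nolog
        SQL> connect / as sysdba
        SQL> startup mount
        SQL> recover database
        SQL> alter database open

    During "recover database", Oracle prompts for each archived log it needs; with the files back in the archive destination you can normally just answer AUTO.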

    HTH.

  5. #5
    Join Date
    May 2001
    Posts
    736
    Let your automated scripts delete the archive logs once they have been successfully backed up to tape. As far as I'm concerned, I keep at least 4 days of archive logs on the hard disk, because at recovery time reading from disk is faster than reading from tape.
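    For example, the cleanup part of such a script might be no more than this (the path, suffix and retention period are placeholders, and it assumes the tape backup job has already run successfully):

        # remove archived logs older than 4 days; newer ones stay on disk
        # so a recovery can read them locally instead of from tape
        find /u01/oradata/ARCH -name '*.arc' -mtime +4 -exec rm -f {} \;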

  6. #6
    Join Date
    Nov 2002
    Location
    Geneva Switzerland
    Posts
    3,142
    You will need the archive logs if ever your server is totally destroyed (the G8 hooligan fringe or genetically modified rabbits might pose a threat). To minimise loss, think about copying them, as they are created, to another machine - preferably in a different location.

    LOG_ARCHIVE_DEST_n in init.ora can do this automatically.
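    For example (a sketch only - the second path is a hypothetical NFS mount from the other machine, and the exact syntax depends on your Oracle version):

        # init.ora - write every archived log to two destinations
        log_archive_dest_1 = 'LOCATION=/u01/oradata/ARCH MANDATORY'
        log_archive_dest_2 = 'LOCATION=/remote_arch OPTIONAL'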

    http://www.dbasupport.com/forums/sho...threadid=36357 discusses what to do if you want to compress/zip them before transmitting across the WAN.

  7. #7
    Join Date
    Nov 2002
    Location
    Geneva Switzerland
    Posts
    3,142
    Originally posted by akhadar
    Let your automated scripts delete the archive logs once they have been successfully backed up to tape.
    OK, yes, but... you might want to think through the scenario of finding that you can't read one of your tapes (been there, done that, they don't do T-shirts).

  8. #8
    Join Date
    Apr 2003
    Location
    Gourock, Scotland
    Posts
    102
    As DaPi hints at above, you will save yourself time and space (uh-oh, Star Trek mode fast approaching...) if you are able to compress your archive logs as they are created, or at least on a regular basis. For Unix systems I use cron, every half-hour, to gzip any newly created files. I guess for Windoze you could set up an 'at' job to do pretty much the same thing?
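    A crontab entry along these lines would do it (the directory and suffix are only examples; note that this naive form can also catch the log Oracle is still writing - see the posts below):

        # every half-hour, compress any uncompressed archive logs
        0,30 * * * * find /u01/oradata/ARCH -name '*.arc' -exec gzip {} \;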
    If I have to choose between two evils, I always like to choose the one I haven't tried yet.

  9. #9
    Join Date
    Nov 2002
    Location
    Geneva Switzerland
    Posts
    3,142
    Originally posted by KenCunningham
    For Unix systems I use cron, every half-hour, to gzip any newly created files.
    Hi Ken, how do you make sure you don't gzip an archive log while it's still being written? Gopi, in the link I gave above, has some ideas (I'm not into UNIX so I can't comment).

    (as for saving time & space - we are a long way from the Planck limit...)

  10. #10
    Join Date
    Apr 2003
    Location
    Gourock, Scotland
    Posts
    102
    DaPi, thanks for the response. Gopi's contribution looks the best bet to me: exclude the most recent file from the compression process, so that it won't be touched if it is incomplete. It might also be possible to use a utility such as the *nix 'fuser' to check whether any process is currently accessing the most recent file and, if so, exclude it from the list of files to be compressed. This, I have to say, is untested, but now that the idea's there... where's that test box? Best regards.
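    Untested, but the fuser idea might look something like this (the path and suffix are examples):

        #!/bin/sh
        # compress archived logs, skipping any file a process still has open
        for f in /u01/oradata/ARCH/*.arc
        do
            # fuser exits 0 if some process has the file open
            if fuser "$f" > /dev/null 2>&1
            then
                continue        # probably still being written - leave it
            fi
            gzip "$f"
        done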
    If I have to choose between two evils, I always like to choose the one I haven't tried yet.
