
Thread: using multiple dump files in export while compressing

  1. #1
    Join Date
    Oct 2000
    Posts
    76
    We compress our export dump file while the export is running (not afterwards) so that it stays below the 2G limit. Now even the compressed file exceeds 2G. I know that in 8i you can use more than one dump file by specifying file=exp.dmp1,exp.dmp2..., but how do I do this while compressing at the same time?

    The part of script we use now is:

    mknod $HOME/$DAY$1 p
    compress -f < $HOME/$DAY$1 > $HOME/$DAY$1.Z &
    (
    exp parfile=$HOME/genexp$1.par file=$HOME/$DAY$1 log=$HOME/exp$1.log$XDAY
    )

    So basically

    mknod file.dmp p
    compress -f < file.dmp > file.dmp.Z &
    (
    exp parfile=parfile file=file.dmp log=logfile
    )

    So how do I add a second dmp file to this script so I can have a second compressed dmp file when the first one exceeds 2G?
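
    (For reference, reading such a compressed dump back in uses the same trick in reverse; a rough sketch, where genimp$1.par and the imp log name are placeholders rather than real file names:)

    Code:
    # decompress through a named pipe and let imp read the pipe as its dump file
    mknod $HOME/imp_pipe p
    uncompress < $HOME/$DAY$1.Z > $HOME/imp_pipe &
    (
    imp parfile=$HOME/genimp$1.par file=$HOME/imp_pipe log=$HOME/imp$1.log
    )
    rm -f $HOME/imp_pipe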




    J.T.

  2. #2
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,092
    Ah, time for some smoke and mirrors.

    We got around this problem by starting several named pipes and then passing those named pipes in the file= clause.

    For example:
    Code:
    #setup the pipes and file list
    NUM_NP="1 2 3 4 5"
    for i in $NUM_NP; do
       rm -f /tmp/exp_${i}
       /etc/mknod /tmp/exp_${i} p
       DMP_FILE=${i}.Z
    
       compress < /tmp/exp_${i} > $DMP_FILE &
    
       if (( $i == 1 )); then
          FILE_LIST=/tmp/exp_${i}
       else
          FILE_LIST=$FILE_LIST","/tmp/exp_${i}
       fi
    done
    
    #do the export
    exp / file=$FILE_LIST filesize=8000M  ...
    
    # kill any pipes that didn't get used
    for i in $NUM_NP; do
       kill %$i
    done
    We had to futz with the 8000M to get our .Z files somewhere around 2G, but we got pretty close.
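
    If you ever need to load those dumps back in, the import side mirrors the export setup. A sketch under the same assumptions (same 1.Z ... 5.Z names as above; imp should probably be given the same filesize= that the export used):

    Code:
    # recreate one pipe per compressed dump and feed each one with uncompress
    NUM_NP="1 2 3 4 5"
    for i in $NUM_NP; do
       rm -f /tmp/imp_${i}
       /etc/mknod /tmp/imp_${i} p

       uncompress < ${i}.Z > /tmp/imp_${i} &

       if (( $i == 1 )); then
          FILE_LIST=/tmp/imp_${i}
       else
          FILE_LIST=$FILE_LIST","/tmp/imp_${i}
       fi
    done

    # do the import; filesize should match what the export used
    imp / file=$FILE_LIST filesize=8000M ...

    # clean up any uncompress jobs still waiting on unused pipes
    for i in $NUM_NP; do
       kill %$i 2>/dev/null
    done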

    Jeff Hunter

  3. #3
    Join Date
    Feb 2001
    Posts
    123
    Why not use the split command to break the compressed dmp file into manageable chunks?

    For example (this needs a lot of tidying up, path names etc.)...

    mknod exp_pipe p
    mknod compress_pipe p

    # start a compress process, listening on the named pipe
    # that the export process will write to, output the compressed
    # export to another pipe for 'split' to pick up. Note, start
    # process in background, so it can sit and wait for the export
    compress < exp_pipe > compress_pipe &

    # start a split process, listening on the named pipe that the
    # compress process writes to. This process will split the
    # incoming data into 2047MB chunks, called xaa, xab, xac etc.
    # see the man page on split for how to change the output file
    # names.
    split -b2047m < compress_pipe &

    # perform the export, writing to the first named pipe
    exp / file=exp_pipe ...
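
    Reading it back in is then just a matter of gluing the chunks together again; a rough sketch, assuming the default xaa, xab, ... names that split produces:

    Code:
    mknod imp_pipe p

    # concatenate the chunks in order, uncompress the stream,
    # and let imp read it through the pipe
    cat x?? | uncompress > imp_pipe &

    imp / file=imp_pipe ...

    rm -f imp_pipe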


    HTH

    David.
