important oracle import slow
DBAsupport.com Forums - Powered by vBulletin

Thread: important oracle import slow

  1. #1
    Join Date
    Feb 2001
    Location
    alexandria
    Posts
    100

    Exclamation

    Trying to import into a table in the database, but the import contains 42 million rows and is taking 60 hours to complete! The dump is 4 GB.
    What can I do to rectify this or speed things up?
    Here is the script I ran to do the import:

    mknod /tmp/imp_pipe_ccd p
    uncompress < expabc.dmp.Z > /tmp/imp_pipe_ccd &
    imp file=/tmp/imp_pipe_ccd userid= fromuser=abc touser=abc log=imp_abc.log commit=y ignore=y \
        indexes=N buffer=64000


  2. #2
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,095
    Could be any number of things:
    1. How many CPUs do you have? The uncompress running in the background could be sucking CPU.
    2. Is your I/O distributed?
    3. Are you in archivelog mode?
    4. Was the export done with direct=y?
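    For points 1 and 3, a quick sketch of how you might check from SQL*Plus (run as a privileged user; these are standard dictionary views, but your privileges may vary):

    ```sql
    -- Is the database in ARCHIVELOG mode?
    SELECT log_mode FROM v$database;

    -- How many CPUs does Oracle see?
    SELECT value FROM v$parameter WHERE name = 'cpu_count';
    ```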
    Jeff Hunter
    marist89@yahoo.com
    http://marist89.blogspot.com/
    Get Firefox!
    "I pledge to stop eating sharks fin soup and will not do so under any circumstances."

  3. #3
    Join Date
    Feb 2001
    Location
    alexandria
    Posts
    100

    Exclamation re

    Did all that and I'm still having a problem with the slow load.

  4. #4
    Join Date
    Oct 2000
    Location
    Saskatoon, SK, Canada
    Posts
    3,925
    Was the table analyzed before export?

    Thanx
    Sam



    Life is a journey, not a destination!


  5. #5
    Join Date
    Feb 2001
    Location
    alexandria
    Posts
    100

    Thumbs up RE:

    Yes, the tables were analyzed before the export.

  6. #6
    Join Date
    Dec 2000
    Location
    Ljubljana, Slovenia
    Posts
    4,439
    Since you use IGNORE=Y option I presume the table exists in your destination database before you start the import. If there are indexes defined on it, this will considerably slow down the import and the option INDEXES=N will not be of much help. If this is the case drop the indexes prior to import and recreate them afterwards (probably with PARALLEL and NOLOGGING).
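    A sketch of that drop/recreate sequence, with a hypothetical index name and column list (adjust the PARALLEL degree to your CPU count):

    ```sql
    -- Drop the index before starting the import
    DROP INDEX abc.ccd_pk_idx;

    -- ... run the import, then rebuild in parallel without redo logging
    CREATE INDEX abc.ccd_pk_idx ON abc.ccd (id)
      PARALLEL 4 NOLOGGING;

    -- Optionally reset the attributes afterwards so later DML behaves normally
    ALTER INDEX abc.ccd_pk_idx NOPARALLEL LOGGING;
    ```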

    Jurij Modic
    ASCII a stupid question, get a stupid ANSI
    24 hours in a day .... 24 beer in a case .... coincidence?

  7. #7
    Join Date
    Feb 2001
    Location
    alexandria
    Posts
    100

    re

    Dropped the index; still not much help, the table is taking too long to load. By the way, does anyone know how you would extract data from a table to flat files?

    Can Toad do an export if the data is over 2 GB?

    Also, please help: how would I recreate the index in parallel and with no logging? What is the command? Thanks a lot.

  8. #8
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,136
    1. The BUFFER size is too small. Increase it to 100MB.
    2. Use the DIRECT=TRUE option (note: it is set on the export; a direct-path dump then imports faster, since imp itself has no DIRECT parameter).
    3. Set the table to NOLOGGING mode.
    4. If you have multiple CPUs, use the PARALLEL option when creating the table, as well as when importing the data into it.
    5. Copy the dump file onto the server where Oracle is running. If the file is on some other server, you will see a network bottleneck.

    I saw an 80% reduction in import time when I used the DIRECT=TRUE option.
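    Putting those points together with the original pipe script, it might look something like the sketch below (the abc/password credential is a placeholder, and BUFFER is given in bytes: 100 MB = 104857600):

    ```shell
    # Export side: DIRECT=Y bypasses the SQL evaluation layer
    exp userid=abc/password owner=abc direct=y file=expabc.dmp log=exp_abc.log

    # Before the import, put the table into NOLOGGING mode via SQL*Plus:
    #   ALTER TABLE abc.ccd NOLOGGING;

    # Import side: no DIRECT parameter exists for imp; a larger BUFFER
    # is the main knob
    imp userid=abc/password file=expabc.dmp fromuser=abc touser=abc \
        log=imp_abc.log commit=y ignore=y indexes=n buffer=104857600
    ```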


    [Edited by tamilselvan on 07-25-2001 at 12:23 PM]

  9. #9
    Join Date
    Jun 2001
    Location
    Charlotte, NC
    Posts
    27

    Re: re

    Originally posted by imose
    dropped the index still not much help, the table is taking too long to load. by the way does anyone know how you would extract data from a table to flat files?

    can toad do an export if the data is over 2gb

    also please help how would i recreate the index in parallel and no logging what is the command thanks a lot
    You can go into SQL*Plus and spool the information out to a file.
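    A minimal SQL*Plus spool script along those lines, with hypothetical table and column names, writing a pipe-delimited flat file (a single spool file only avoids the 2 GB problem if your OS and filesystem support large files):

    ```sql
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 200 TRIMSPOOL ON
    SPOOL /tmp/ccd.dat
    SELECT id || '|' || col1 || '|' || col2 FROM abc.ccd;
    SPOOL OFF
    ```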

  10. #10
    Join Date
    Feb 2001
    Location
    alexandria
    Posts
    100

    re:

    Hey guys,
    the import sped up and got completed way ahead of schedule.

    Apparently what happened was that two imports were running concurrently. When one ended, the other sped up.

    Thanks for your advice.

    Now, the direct=true: that would be added to the import script for future reference?

    And how do you tell what buffer size to use next time? Is there some form of calculation to use?

    Thanks everyone!
