Hi,
why does importing 150,000 records take so much time? I tried to import a table containing 150,000 records and it took about 6 hours. Is there any way or method to do this faster?
Thanks for your help.
6 hours is strange. It should not take even an hour, let alone half an hour.
Could you please show me the IMP command you ran? In particular, I am interested in BUFFER, and in the output of select * from v$rollstat;
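For anyone following along: v$rollstat keys on the undo segment number rather than a name, so it is usually joined to v$rollname to get readable output. A sketch (column list assumes 8i, and you need SELECT privilege on the V$ views):

```sql
-- Sketch: map rollback segment numbers to names and show basic activity.
SELECT n.name, s.extents, s.rssize, s.gets, s.waits, s.wraps, s.extends
FROM   v$rollname n,
       v$rollstat s
WHERE  n.usn = s.usn;
```

High WAITS, WRAPS, or EXTENDS values during the import would point at undersized rollback segments.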
Well, 150,000 records doesn't say much about the work to be done to import that table. It could be that the space required for those 150,000 records is something like 200 GB, or it could be only 200 KB.
Not to mention that we don't know anything about your system configuration and your hardware horsepower.
So 6 hours might be a ridiculously long time, or it could be an excellent time score...
Well, I'm not using a batch file. I'm just typing imp at the command prompt and leaving all the defaults as they are, providing only the username and password. I'm home now, so I can't query the v$rollstat view.
The Database is Oracle 8.1.7
Setting the buffer and commit=n parameter could improve performance.
I'm talking about an HP 3000 server with 20 GB of hard disk, but I'm not sure about the size of the datafile. As for the operating system, it is Windows 2000 Server.
As Jurij said, the data might be a lot: there are rows and rows. Has it taken that long before?
Did you first disable the table constraints and the triggers? Try to run:
Code:
imp userid=user_name/password buffer=30000 file=name.dmp tables=table_name grants=N ignore=Y commit=Y fromuser=user_name touser=user_name
30000 might be too much or too little, I don't know; I would have preferred to see your rollback segments...
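As a side note, the same switches can be kept in a parameter file instead of one long command line, which makes it easier to tweak BUFFER between attempts. A sketch using the placeholder names from Julian's example (the file name imp_table.par and the values are only illustrative):

```text
userid=user_name/password
buffer=30000
file=name.dmp
tables=table_name
grants=N
ignore=Y
commit=Y
fromuser=user_name
touser=user_name
```

Saved as imp_table.par, it would be run as: imp parfile=imp_table.par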
Thank you for your help, Julian. I will surely try that one. By the way, what happened to your visit to Egypt? I thought you wanted to come. Be sure that you're always welcome.
If the number of database buffers is low and the shared pool size is small, importing may take forever!
Try increasing those parameters and you will see the result.
If your table does not exist before the import operation, table constraints are created after the data is imported.
The next country I will visit is definitely Egypt!
Quote:
Originally posted by hany
Thank you for your help, Julian. I will surely try that one. By the way, what happened to your visit to Egypt? I thought you wanted to come. Be sure that you're always welcome.
Use KGB's advice too :-)
I have found that first dropping, then building indexes manually after an import helps if you have a bunch of indexes on that table.
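For example, imp can write the index DDL from the dump file to a script with the INDEXFILE parameter, so you can drop the indexes, import with INDEXES=N, and rebuild afterwards. A sketch with hypothetical object names throughout:

```sql
-- First, outside SQL*Plus, extract the index DDL from the export file:
--   imp user_name/password file=name.dmp indexfile=idx.sql
-- Then drop the indexes before the real import (run with INDEXES=N):
DROP INDEX emp_name_idx;
-- ...import the table data...
-- Finally rebuild; NOLOGGING keeps redo generation down during the build:
CREATE INDEX emp_name_idx ON emp_table (last_name)
  TABLESPACE indx NOLOGGING;
```

Building each index in one pass after the data is loaded is generally much cheaper than maintaining it row by row during the import.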
I noticed that if you set BUFFER and COMMIT you slow down the import. I'd rather have one big rollback segment to speed things up and bypass those two parameters...
Hi,
If you want to monitor what your import is doing:
You can query the dba_segments view to see if the table is still growing. (select bytes/1024/1024 from dba_segments where segment_name = 'IMP_TABLE';)
You can query the dba_indexes view to see if all your indexes are there. (select index_name from dba_indexes where table_name = 'IMP_TABLE';)
Or you can see if all your constraints are enforced. (select constraint_name from dba_constraints where table_name = 'IMP_TABLE';)
Note that the dictionary stores segment, table, and index names in uppercase unless they were created with quoted identifiers.
I just did an import of an 87-million-row table with 8 indexes and 13 constraints; it took 22 hours: seven hours to get the table data back (7.5 GB), 13 hours to build the indexes (12 GB), and 2 hours to enforce the constraints.
HTH
Tycho
Hey Hany, did you solve your problem? In your case it should ideally take 10-15 minutes.
As mentioned in earlier posts, try to import only the data, skipping constraints and indexes, and see the performance. If it is still the same, then look at the rollback segments and the log buffer size, and check that enough memory is allocated. Actually, many issues have to be considered to get good performance, both at the OS level and at the database level.
hi all,
I still need help. I'm trying to import again with the parameters Julian provided me:
Code:
imp userid=user_name/password buffer=30000 file=name.dmp tables=table_name grants=N ignore=Y commit=Y fromuser=user_name touser=user_name
But I'm afraid it's been importing for 30 minutes now; I queried dba_segments and the table that I need to import doesn't seem to be growing.
Please help.
Did you disable the triggers and the table constraints before the import?
Quote:
Originally posted by hany
Please help.
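In case it helps, disabling and re-enabling is just a pair of ALTER statements per object; the names below are placeholders, and you would repeat the statements for each constraint and trigger on the table:

```sql
-- Before the import (placeholder object names):
ALTER TABLE imp_table DISABLE CONSTRAINT imp_table_fk;
ALTER TRIGGER imp_table_trg DISABLE;

-- After the import completes:
ALTER TABLE imp_table ENABLE CONSTRAINT imp_table_fk;
ALTER TRIGGER imp_table_trg ENABLE;
```

Enabling a constraint afterwards validates all rows in one pass instead of checking them one at a time during the load.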
The table didn't exist, Julian. I'm creating a new user and importing the whole schema, which contains that table.
Could it be that there are actually many more than 150,000 rows? Could it be that those rows are "big", as Jurij suggested? Could it be that your network connection is slow? You mentioned that you are doing the import from home.
Quote:
Originally posted by hany
The table didn't exist, Julian. I'm creating a new user and importing the whole schema, which contains that table.
Try Pando's suggestion: forget COMMIT and BUFFER and use one big rollback segment. Do you know how to do that?
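For reference, the usual 8i recipe for this was roughly the following. Names and storage sizes are only illustrative, and since imp runs in its own session, the common trick is to take the other rollback segments offline so the import is forced onto the big one:

```sql
-- Create one large rollback segment (illustrative storage sizes):
CREATE ROLLBACK SEGMENT rbs_big
  TABLESPACE rbs
  STORAGE (INITIAL 10M NEXT 10M MINEXTENTS 2 MAXEXTENTS UNLIMITED);
ALTER ROLLBACK SEGMENT rbs_big ONLINE;

-- Take the regular rollback segments offline so imp (run with COMMIT=N)
-- has to use rbs_big for its single large transaction:
ALTER ROLLBACK SEGMENT rbs01 OFFLINE;
ALTER ROLLBACK SEGMENT rbs02 OFFLINE;
```

Afterwards you would bring the regular segments back online and drop or offline rbs_big.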
Yes I do, and I will try that too. Thanks, Julian; I'll inform you if it works (or if it doesn't).
You may want to not import the indexes but build them after the import. So disable the constraints and the triggers, use INDEXES=N, IGNORE=Y, COMMIT=N and one big rollback segment, and let's hope for the best :-)
Quote:
Originally posted by hany
Yes I do, and I will try that too. Thanks, Julian; I'll inform you if it works (or if it doesn't).
Take a look also at Metalink Note:93763.1.