-
SQL*Loader help, please
Hi
How do I increase the default commit point strength (64 rows) so that more rows are loaded per commit?
I tried setting the BINDSIZE and READSIZE parameters high, but a 64-row bind array is still being used.
I also tried this, with no luck: I analyzed the table, took the average row length, and multiplied it by the number of rows I need to insert in one go, using the result as the BINDSIZE value.
Please shed some light on this.
Thanks in Advance.
Nagesh
-
ROWS=%No_Of_Recs_you_want_to_reach_commit_point%.
Abhay.
funky...
"I Dont Want To Follow A Path, I would Rather Go Where There Is No Path And Leave A Trail."
"Ego is the worst thing many have, try to overcome it & you will be the best, if not good, person on this earth"
-
Originally posted by abhaysk
ROWS=%No_Of_Recs_you_want_to_reach_commit_point%.
Abhay.
I tried it this way, still no luck:
Sqlldr UserID=xxx/xxx@aaa rows=100000 DATA=.\abc.txt BAD=.\abc.bad LOG=.\abc.log CONTROL=.\abc.ctl
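For what it's worth, ROWS on its own is capped by the bind array: SQL*Loader reduces ROWS to however many rows actually fit in BINDSIZE. A sketch of the same command with BINDSIZE and READSIZE raised as well (the byte count is an illustrative value, not a tested recommendation):

```shell
rem Illustrative sketch only -- same command as above, but with BINDSIZE
rem and READSIZE raised so the bind array can actually hold 100000 rows.
rem 20000000 is an arbitrary example; size it from
rem ROWS * (maximum row length) for your table. READSIZE should be at
rem least as large as BINDSIZE.
sqlldr UserID=xxx/xxx@aaa CONTROL=.\abc.ctl DATA=.\abc.txt ^
       BAD=.\abc.bad LOG=.\abc.log ^
       rows=100000 bindsize=20000000 readsize=20000000
```

The log file reports the bind array size and the ROWS value actually used, so checking it after a run shows whether SQL*Loader silently scaled ROWS down.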
any help..
Nagesh
-
Originally posted by gnagesh
I tried it this way, still no luck:
Sqlldr UserID=xxx/xxx@aaa rows=100000 DATA=.\abc.txt BAD=.\abc.bad LOG=.\abc.log CONTROL=.\abc.ctl
any help..
Do you know what the max array size is that you can have?
I think it's 5000 -- not sure.
Abhay.
-
Originally posted by abhaysk
Do you know what the max array size is that you can have?
I think it's 5000 -- not sure.
Abhay.
With these settings I am able to load 1483 records at a time:
rows=10000000 bindsize=10000000
Thanks for your help. Now I will do some R&D on this.
Regards
Nagesh
-
Originally posted by gnagesh
With these settings I am able to load 1483 records at a time:
rows=10000000 bindsize=10000000
Thanks for your help. Now I will do some R&D on this.
Regards
Check this quote:
The bind array's size is equivalent to the number of rows it contains times the maximum length of each row. The maximum length of a row is equal to the sum of the maximum field lengths, plus overhead.
bind array size = (number of rows) * (maximum row length)
where:
(maximum row length) = SUM(fixed field lengths) +
SUM(maximum varying field lengths) +
SUM(overhead for varying length fields)
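As a rough sketch of that formula (the field lengths below are made-up example values, not taken from this thread), the BINDSIZE needed for a target ROWS value can be estimated like this:

```shell
# Hypothetical example: estimate the BINDSIZE needed for a target ROWS
# value using the formula above. The field lengths are invented for
# illustration -- substitute your table's actual maximums.
fixed_len=20        # SUM(fixed field lengths)
varying_len=100     # SUM(maximum varying field lengths)
overhead=5          # SUM(overhead for varying length fields)

max_row_len=$((fixed_len + varying_len + overhead))

rows=100000
bindsize=$((rows * max_row_len))
echo "max row length: $max_row_len bytes"
echo "BINDSIZE for $rows rows: $bindsize bytes"
```

Note that the formula uses the maximum row length, not the average, so a BINDSIZE computed from the average row length will generally come out too small to hold the intended number of rows.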
-
Yes, I have gone through this too, but I applied it in a different way:
I analyzed the table, took the average row length, and multiplied it by the number of rows I need to insert in one go, using the result as the BINDSIZE value. (The formula uses the maximum row length, not the average, which would make a BINDSIZE computed this way too small.)
Thanks for your contribution.
Nagesh