DBAsupport.com Forums - Powered by vBulletin

Thread: Maximum errors allowed in sql loader.

  1. #1
    Join Date
    Sep 2000
    Posts
    362
    Hi,
    The maximum number of errors allowed in SQL*Loader is 50 by default. How do I increase this to unlimited?

    If you look at the sample log file below, it has "Allow all discards" set, yet the load stops once it reaches 50 records in the bad file. I want it to continue and keep logging every error record to the bad file.

    Thanks
    Anurag

    SQL*Loader: Release 8.0.5.0.0 - Production on Tue May 29 17:30:28 2001

    (c) Copyright 1998 Oracle Corporation. All rights reserved.

    Control File: loaddata.ctl
    Data File: ofasweb.txt
    Bad File: ofasweb.bad
    Discard File: ofasweb.dcs
    (Allow all discards)


    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 65536 bytes
    Continuation: none specified
    Path used: Conventional

  2. #2
    Join Date
    Dec 2000
    Location
    Ljubljana, Slovenia
    Posts
    4,439
    Use ERRORS=N on your sqlldr command line or in a parameter file, where N is an arbitrarily large number. I don't know if there is any upper limit for this parameter. As you've already found out, the default is 50.
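    For example (scott/tiger is just a placeholder userid; the other file names are taken from your log):

        sqlldr userid=scott/tiger control=loaddata.ctl data=ofasweb.txt bad=ofasweb.bad errors=1000000

    Or put the same keyword=value lines in a parameter file, say loaddata.par, and run:

        sqlldr parfile=loaddata.par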
    Jurij Modic
    ASCII a stupid question, get a stupid ANSI
    24 hours in a day .... 24 beer in a case .... coincidence?

  3. #3
    Join Date
    May 2001
    Posts
    70
    I agree with jmodic.

    I have set ERRORS as high as 25,000.

  4. #4
    Join Date
    Jul 2000
    Posts
    243
    Hi anuragmin,

    I did a lot of data conversion from legacy systems into Oracle Apps, and that is why I ask: why do you need to allow so many errors? You should look at your data and clean it before you start to load; a large number of errors at the SQL*Loader stage is usually a sign of bad data. Better to take care of it before you start inserting with SQL*Loader, and to look at the logic of how the data is created. And even if I'm wrong about that, why not use the power of SQL*Loader itself to clean the data first, so that you end up with only the errors that are important and that teach you something about potential problems in the data?
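    To illustrate that last point, a WHEN clause in the control file lets SQL*Loader throw out records that fail a simple test before they ever reach the table (the table and column names below are made up for the example; only the structure matters):

        LOAD DATA
        INFILE 'ofasweb.txt'
        BADFILE 'ofasweb.bad'
        DISCARDFILE 'ofasweb.dcs'
        INTO TABLE ofasweb_stage
        WHEN (record_type = 'A')
        FIELDS TERMINATED BY ','
        TRAILING NULLCOLS
        (record_type, cust_id, amount)

    Records that fail the WHEN test go to the discard file, not the bad file, so they don't count against the ERRORS limit, and the bad file is left for the genuinely interesting failures.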

  5. #5
    Join Date
    May 2001
    Location
    Chennai
    Posts
    57
    Hi anuragmin,

    Jmodic is right. When you issue your sqlldr command, also give ERRORS=n on the command line to specify the number you need.

    Well...shawish_sababa,

    I too did a lot of data transfer from Unisys mainframe datasets to Oracle tables after importing them from Unisys. There were instances of loading millions of records where a few corrupt records got in among them, and to sort them out we had to redo the entire process right from the beginning: import from the dataset, check it manually to confirm there were no corrupt records, and meanwhile the data feed had to be stopped for the time being.

    In this case, at least we get to a point where we know exactly which records contain corrupt data, so we can minimise the effort of the required clean-up and the rest of the workflow isn't disrupted either.


  6. #6
    Join Date
    Sep 2000
    Posts
    362
    Thanks a lot for the help.

    shawish_sababa, the reason I need to allow a high number of errors is that this will be a weekly batch process loading 6-7 million records. Manually cleaning those huge files every week is virtually impossible; cleaning up just the rejected records and feeding them back in is a better idea.
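    Roughly, each weekly cycle could look something like this (the userid and the "fixed"/"retry" file names are placeholders for the sketch):

        sqlldr userid=scott/tiger control=loaddata.ctl data=ofasweb.txt bad=ofasweb.bad errors=1000000

    then, once the rejected rows in ofasweb.bad have been corrected into a new file:

        sqlldr userid=scott/tiger control=loaddata.ctl data=ofasweb_fixed.txt bad=ofasweb_retry.bad errors=1000000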

    Thanks
    Anurag
