sql loader - how to ignore inserting duplicate records
DBAsupport.com Forums - Powered by vBulletin

Thread: sql loader - how to ignore inserting duplicate records

  1. #1
    Join Date
    Jan 2010
    Posts
    20

    sql loader - how to ignore inserting duplicate records

    Hi,

    I'm calling SQL*Loader repeatedly to load data from CSV files, each of which has thousands of records. If there are any duplicate records, SQL*Loader terminates with ORA-00001. My question is how to ignore the duplicate records and continue with the load.

    Most of the posts in forums suggest using the SKIP option, but I do not think that is a wise option in my case, as we cannot predict the maximum error count. Moreover, I have set ERRORS=0 in my code so that the load terminates if there is a data error.

    Please let me know if there is any other way to ignore duplicate records when inserting into the tables.

    Thanks in advance.
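    For reference, a minimal sketch of the setup being described; the table, column, and file names are hypothetical, and this needs an Oracle instance to actually run:

    ```
    -- load_emp.ctl: hypothetical SQL*Loader control file
    LOAD DATA
    INFILE 'emp.csv'
    BADFILE 'emp.bad'        -- rows rejected by the database (ORA-00001, conversion errors)
    DISCARDFILE 'emp.dsc'    -- rows filtered out by a WHEN clause, if any
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    (empno, ename, hiredate DATE 'YYYY-MM-DD')
    ```

    Invoked with `sqlldr userid=scott/tiger control=load_emp.ctl errors=0`, the first rejected row, including an ORA-00001 duplicate, aborts the load, which matches the behaviour the original poster describes.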

  2. #2
    Join Date
    Sep 2002
    Location
    England
    Posts
    7,333
    why not clean up your data so it is correct

  3. #3
    Join Date
    Jan 2010
    Posts
    20
    As per the requirement we are not supposed to do that.

  4. #4
    Join Date
    Sep 2002
    Location
    England
    Posts
    7,333
    why are you not supposed to make sure your data is correct?

  5. #5
    Join Date
    May 2002
    Posts
    2,645
    Then just load all the data and de-dupe the table. No brainer.
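    If the unique constraint is disabled, or the load goes into a constraint-free staging table so every row gets in, the de-dupe step might look like the sketch below. EMP and EMPNO are hypothetical names standing in for the real table and key columns:

    ```
    -- Keep one row per key value, delete the rest.
    DELETE FROM emp e
     WHERE e.rowid > (SELECT MIN(e2.rowid)
                        FROM emp e2
                       WHERE e2.empno = e.empno);
    COMMIT;
    ```

    Re-enable or validate the unique constraint afterwards so future loads are protected again.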

  6. #6
    Join Date
    Mar 2007
    Location
    Ft. Lauderdale, FL
    Posts
    3,554
    Quote Originally Posted by anijan:
    My query is how to ignore inserting duplicate records and continue with the load.

    Most of the posts in forums suggests to use skip command. But i do not think that is a wise option in my case as we cannot predict the maximum error count.
    That's the conundrum... you cannot have it both ways.

    Either you accept dups or you don't
    Pablo (Paul) Berzukov

    Author of Understanding Database Administration available at amazon and other bookstores.

    Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are given. Since the advisor explicitly encourages testing any and all suggestions in a non-production test environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.

  7. #7
    Join Date
    Jan 2010
    Posts
    20
    Though the ERRORS parameter can be used to tolerate such errors, is there a way to separate data errors (like an invalid date or character) from duplicate records?
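    One Oracle-side approach (10g and later) that does separate the two: load with SQL*Loader into a staging table that has no unique constraint, then insert into the real table with DML error logging. Table names below are hypothetical, and this is a sketch to adapt and test, not a drop-in solution:

    ```
    -- Create the error table ERR$_EMP once.
    BEGIN
      DBMS_ERRLOG.create_error_log(dml_table_name => 'EMP');
    END;
    /

    -- Conventional-path insert; rejected rows go to ERR$_EMP
    -- instead of aborting the statement.
    INSERT INTO emp
    SELECT * FROM emp_stage
    LOG ERRORS INTO err$_emp ('batch1') REJECT LIMIT UNLIMITED;

    -- ORA_ERR_NUMBER$ holds the error code, so duplicates
    -- (1 = ORA-00001) can be counted separately from conversion errors.
    SELECT ora_err_number$, COUNT(*)
      FROM err$_emp
     GROUP BY ora_err_number$;
    ```

    Note that error logging captures unique-constraint violations only for conventional-path inserts, not direct-path (APPEND) loads.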

  8. #8
    Join Date
    May 2002
    Posts
    2,645
    Like you said, there is no way to predict the maximum number of errors. Read the documentation and use the bad and discard files. Your overall choices are:
    1. Fix the data ahead of time
    2. Load all the data and de-dupe the table
    3. Pick a large enough value for errors
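    Option 3 might look like this on the command line; the names are illustrative, and the rejected rows can then be inspected in the bad and discard files:

    ```
    sqlldr userid=scott/tiger control=load_emp.ctl \
           errors=999999 bad=emp.bad discard=emp.dsc log=emp.log

    wc -l emp.bad emp.dsc    # how many rows were rejected / discarded
    ```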

    How many times does this need to be explained?
