Performance issue when importing data from a text file to Oracle
I encountered the following problem when trying to import data from a text file into Oracle via a web interface.
-- I use a Java program to read the text file and dynamically generate an "insert" statement for each line to import the data into an Oracle table. There are about 12,000 records in the text file, and it takes about 1 hour to import them all.
Is the performance issue caused by dynamically generating the statements, or by the configuration of my Oracle database? And how can I resolve it?
The best solution is to use the SQL*Loader utility in direct path mode. This mode bypasses most of the Oracle database processing: it reads the datafile, converts the data into data blocks, and writes them directly to disk. It saves a lot of time and is good for bulk data loading.
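As a rough sketch of what that looks like (the table name, column names, and credentials below are illustrative assumptions, not taken from this thread), a minimal SQL*Loader control file might be:

```
-- load_emp.ctl (hypothetical control file for an example table)
LOAD DATA
INFILE 'emp_data.txt'
INTO TABLE my_emp_table
FIELDS TERMINATED BY ','
(employee_id, first_name, last_name, salary)
```

invoked in direct path mode with:

```shell
sqlldr userid=scott/tiger control=load_emp.ctl direct=true
```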
It depends on what your Java program does with the text file.
Measure separately the time taken by the insert operations and the time taken to build the SQL statements. By default, JDBC auto-commits after each statement, and since you read the text file line by line, the problem is most likely in I/O operations (and the per-statement commits).
I wrote a SQL package that generates 'insert' statements for an entire schema. The schema is around 600 MB, and it has already been running for 3-4 weeks; the weak spot in my case is I/O, but I am not in a hurry.
If you are on 9i you can use an external table. It's very fast (seconds), but you cannot create indexes on it or write to it. If you want to write the data to a regular table, you can create one with CREATE TABLE ... AS SELECT from the external table. If this is a continuous process, I think an external table will work out best.
agasimani: Yes, I agree with you. I think SQL*Loader should be the most efficient way of loading the data. However, I need to manipulate the records during the import, e.g. add some more columns to each record. Can SQL*Loader do this? Thanks!
Any more advice? Also, how can I check whether my Oracle server is configured efficiently? What are the important parameters? Thanks very much!
Use the SQLLDR utility to load the data into a temporary staging table.
Then implement the logic you currently have in Java as PL/SQL to manipulate the data and put it into your permanent working table.
Loading 12,000 records from the text file should take about 5 minutes, and the PL/SQL should take at most 10 minutes to apply whatever manipulation logic you need.
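A sketch of that staging approach (the table and column names here are made up for illustration; your real schema will differ):

```sql
-- Hypothetical staging table populated by SQL*Loader
CREATE TABLE emp_stage (
  employee_id NUMBER,
  first_name  VARCHAR2(20),
  last_name   VARCHAR2(25),
  salary      NUMBER
);

-- Manipulate the rows while moving them into the permanent table,
-- deriving the extra columns on the way
INSERT INTO emp_final (employee_id, full_name, salary, load_date)
SELECT employee_id,
       first_name || ' ' || last_name,  -- derived column
       salary,
       SYSDATE                          -- extra column added at load time
FROM emp_stage;
COMMIT;
```

A single set-based INSERT ... SELECT like this is usually far faster than row-by-row processing.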
Originally posted by slimdave: You are using bind variables in the insert statement, aren't you?
I use a prepared statement, and the insert statement I use is "insert into ... values (v1, v2, v3, ...)", where v1, v2, v3, ... are variables that receive the data read from the text file. Is this what you mean by "bind variables"? Thanks!
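For reference, a minimal sketch of bind variables plus batching in JDBC (the table name, column layout, file name, and connection details are assumptions for illustration, not taken from this thread; it also requires the Oracle JDBC driver on the classpath): the `?` placeholders are the bind variables, auto-commit is turned off, and rows are sent to the server in batches instead of one round trip per insert.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchLoader {
    public static void main(String[] args) throws Exception {
        // Connection URL and credentials are placeholders
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCL", "hr", "hr");
             BufferedReader in = new BufferedReader(new FileReader("emp_data.txt"))) {

            conn.setAutoCommit(false);  // avoid a commit after every insert

            // The ? placeholders are the bind variables: the statement is
            // parsed once and reused for every row
            String sql = "insert into my_emp_table (employee_id, name, salary)"
                       + " values (?, ?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                String line;
                int count = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split(",");
                    ps.setInt(1, Integer.parseInt(f[0].trim()));
                    ps.setString(2, f[1].trim());
                    ps.setDouble(3, Double.parseDouble(f[2].trim()));
                    ps.addBatch();
                    if (++count % 500 == 0) {
                        ps.executeBatch();  // send 500 rows in one round trip
                    }
                }
                ps.executeBatch();  // flush the remaining rows
            }
            conn.commit();  // one commit at the end instead of 12,000
        }
    }
}
```

With genuine bind variables the statement is hard-parsed once rather than once per row, and batching plus a single commit removes most of the per-row overhead, so 12,000 rows should typically load in seconds rather than an hour.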
Oracle allows read-only access to data in external tables. External tables are defined as tables that do not reside in the database, and they can be in any format for which an access driver is provided. By providing the server with metadata describing an external table, Oracle is able to expose the data in the external table as if it were data residing in a regular database table. The external data can be queried directly, and in parallel, using SQL.
You can, for example, select, join, or sort external table data. You can also create views and synonyms for external tables. However, no DML operations (UPDATE, INSERT, or DELETE) are possible, and no indexes can be created, on external tables.
The DBMS_STATS package can be used for gathering statistics for external tables. The ANALYZE statement is not supported for gathering statistics for external tables.
You create external tables using the ORGANIZATION EXTERNAL clause of the CREATE TABLE statement. You are not in fact creating a table; that is, an external table does not have any extents associated with it. Rather, you are creating metadata in the data dictionary that enables you to access external data.
The following example creates an external table, then uploads the data to a database table.
EXAMPLE: Creating an External Table and Loading Data
The files empxt1.dat and empxt2.dat contain comma-delimited sample employee records.
The following SQL statements create an external table in the hr schema named admin_ext_employees and load its data into the hr.employees table.
CONNECT / AS SYSDBA;
-- Set up directories and grant access to hr
-- (the directory paths are examples; use your own locations)
CREATE OR REPLACE DIRECTORY admin_dat_dir AS '/flatfiles/data';
CREATE OR REPLACE DIRECTORY admin_log_dir AS '/flatfiles/log';
CREATE OR REPLACE DIRECTORY admin_bad_dir AS '/flatfiles/bad';
GRANT READ ON DIRECTORY admin_dat_dir TO hr;
GRANT WRITE ON DIRECTORY admin_log_dir TO hr;
GRANT WRITE ON DIRECTORY admin_bad_dir TO hr;
-- hr connects
CONNECT hr/hr
-- create the external table
CREATE TABLE admin_ext_employees
  ( employee_id NUMBER(4), first_name VARCHAR2(20), last_name VARCHAR2(25),
    job_id VARCHAR2(10), manager_id NUMBER(4), hire_date DATE,
    salary NUMBER(8,2), commission_pct NUMBER(2,2),
    department_id NUMBER(4), email VARCHAR2(25)
  )
  ORGANIZATION EXTERNAL
  (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY admin_dat_dir
    ACCESS PARAMETERS
    (
      records delimited by newline
      fields terminated by ','
      missing field values are null
      ( employee_id, first_name, last_name, job_id, manager_id,
        hire_date char date_format date mask "dd-mon-yyyy",
        salary, commission_pct, department_id, email
      )
    )
    LOCATION ('empxt1.dat', 'empxt2.dat')
  )
  REJECT LIMIT UNLIMITED;
-- enable parallel for loading (good if lots of data to load)
ALTER SESSION ENABLE PARALLEL DML;
-- load the data in hr employees table
INSERT INTO employees (employee_id, first_name, last_name, job_id, manager_id,
hire_date, salary, commission_pct, department_id, email)
SELECT * FROM admin_ext_employees;
Hope it helps you.