-
FORALL bulk collect insert
CREATE TABLE DOCS (DOCUMENT_ID NUMBER, FOLDER_ID NUMBER, OWNER_ID NUMBER, CREATE_DT DATE);
-- 5 million rows
LDOCS is a view exposing DOCUMENT_ID and FOLDER_ID.
CREATE TABLE PAGE (DOCUMENT_ID NUMBER, FOLDER_ID NUMBER);
-- 5 million rows
CREATE OR REPLACE PROCEDURE Fastestway IS
  TYPE doc_tab IS TABLE OF LDOCS%ROWTYPE;
  DOCS_tab   doc_tab := doc_tab();
  start_time NUMBER;
  end_time   NUMBER;
  x          NUMBER := 0;
BEGIN
  SELECT * BULK COLLECT INTO DOCS_tab FROM LDOCS;
  start_time := DBMS_UTILITY.get_time;
  FORALL i IN DOCS_tab.FIRST .. DOCS_tab.LAST
    INSERT INTO PAGE VALUES DOCS_tab(i);
  x := x + 1;
  IF x = 50000 THEN
    x := 0;
    COMMIT;
  END IF;
  end_time := DBMS_UTILITY.get_time;
  DBMS_OUTPUT.PUT_LINE('BULK INSERT: ' || TO_CHAR(end_time - start_time));
  COMMIT;
END;
/
I need to do a bulk insert and commit every 50,000 rows.
-
One thing: it is not committing every 50,000 rows.
It keeps on inserting, and after some time it gives me an end-of-file on communication channel error ...
-
When you use BULK COLLECT and FORALL, only the single DML statement immediately following the FORALL is executed "for all" rows; the lines after it run just once, after every insert has been done.
That is why it is not committing: x never reaches 50000.
Solution:
Alter your logic: declare a cursor with your query, open it, and loop: fetch the cursor into DOCS_tab using the LIMIT clause, do your insert with FORALL, commit, and exit when the fetch returns no more rows; then close the cursor.
50000 is probably too big a limit. If you are still getting the end-of-file error, use a smaller limit; try 10000, then 1000, then 100 until your process finishes OK.
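The steps above might look something like this (a sketch only, using the LDOCS and PAGE objects from the original post; the batch size is a constant you can tune down):

```sql
CREATE OR REPLACE PROCEDURE Fastestway IS
  CURSOR c_docs IS SELECT * FROM LDOCS;
  TYPE doc_tab IS TABLE OF LDOCS%ROWTYPE;
  DOCS_tab   doc_tab;
  c_limit    CONSTANT PLS_INTEGER := 50000;  -- lower this if the error persists
  start_time NUMBER;
  end_time   NUMBER;
BEGIN
  start_time := DBMS_UTILITY.get_time;
  OPEN c_docs;
  LOOP
    -- fetch at most c_limit rows per round trip
    FETCH c_docs BULK COLLECT INTO DOCS_tab LIMIT c_limit;
    EXIT WHEN DOCS_tab.COUNT = 0;            -- nothing fetched: we are done
    FORALL i IN 1 .. DOCS_tab.COUNT
      INSERT INTO PAGE VALUES DOCS_tab(i);
    COMMIT;                                  -- one commit per batch
  END LOOP;
  CLOSE c_docs;
  end_time := DBMS_UTILITY.get_time;
  DBMS_OUTPUT.PUT_LINE('BULK INSERT: ' || TO_CHAR(end_time - start_time));
END;
/
```

Note the exit test checks DOCS_tab.COUNT rather than c_docs%NOTFOUND before processing; with LIMIT, %NOTFOUND becomes TRUE on the last (partial) batch, which would otherwise skip inserting those rows.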
Last edited by ViniciusPacheco; 02-14-2008 at 09:23 AM.