-
ORA-04030:
Hi Friends,
I am using BULK COLLECT and a bulk update in a stored procedure that selects 1.5 million records and updates them. When I run this stored procedure I get the error below:
ORA-04030: out of process memory when trying to allocate 283468
bytes (callheap,DARWIN).
I am running Oracle 8.1.7.4 on Solaris 2.8 with 22 GB of physical RAM.
I went to Metalink and found some suggested fixes, such as increasing the ulimit UNIX resource limits for the process and reducing SORT_AREA_SIZE.
I tried these, but it is still failing.
ulimit -a on my UNIX box shows:
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 119200 ----this was earlier 8KB
coredump(blocks) 0
nofiles(descriptors) 1024
memory(kbytes) unlimited
and SORT_AREA_SIZE for the session is set to 1 MB.
Has anyone encountered and fixed this problem? Any suggestions would be appreciated.
-
When you use BULK COLLECT you should limit your array size; i.e. instead of collecting 1.5 million rows and processing them all at once, do array fetches of 100,
something like
FETCH ... BULK COLLECT INTO ... [LIMIT rows];
If you try to store 1.5 million rows in an array you will probably get worse performance and an oversized PGA, or exactly the problem you are having: out of memory.
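A minimal sketch of the batched fetch-and-update pattern described above (the driving query, table, and column names are made up for illustration; only the FETCH ... LIMIT / FORALL shape is the point):

```sql
DECLARE
  CURSOR c IS SELECT rowid rid FROM cust_data_temp;   -- hypothetical driving query
  TYPE rid_tab IS TABLE OF ROWID INDEX BY BINARY_INTEGER;
  l_rids rid_tab;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_rids LIMIT 100;  -- at most 100 rows per fetch
    EXIT WHEN l_rids.COUNT = 0;                  -- nothing left to process
    FORALL i IN 1 .. l_rids.COUNT
      UPDATE cust_data_temp
         SET no_occurence = no_occurence + 1     -- hypothetical update
       WHERE rowid = l_rids(i);
  END LOOP;
  CLOSE c;
  COMMIT;
END;
/
```

With LIMIT 100, the PGA only ever holds one 100-row batch instead of the full 1.5 million rows.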
-
Hi Pando,
That was really awesome: 1.5 million rows got updated in 12 minutes.
But I have one question: if I set LIMIT, and the final fetch cannot find a full LIMIT's worth of rows, the loop exits and those remaining records are not updated. How do I take care of those records?
Thanks a lot once again.
Regards
anandkl
-
Hmm, are you sure? I remember when I used it last year it fetched everything, even when the number of rows in the last fetch was less than 100.
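The behaviour being debated here comes down to where the exit test sits. A fetch that returns fewer than LIMIT rows still populates the collection, but it also sets %NOTFOUND, so a loop that exits on %NOTFOUND before processing drops the final partial batch. A minimal sketch of the safe ordering (cursor and collection names are made up):

```sql
LOOP
  FETCH c BULK COLLECT INTO l_tab LIMIT 100;
  -- Process first: a partial final fetch still fills l_tab,
  -- even though it also sets c%NOTFOUND.
  FOR i IN 1 .. l_tab.COUNT LOOP
    NULL;  -- hypothetical per-row work goes here
  END LOOP;
  EXIT WHEN c%NOTFOUND;  -- exit only after the last batch has been processed
END LOOP;
```

Testing l_tab.COUNT = 0 immediately after the fetch is an equivalent alternative; either way, no rows are lost.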
-
Hi Pando,
Yes, if the last fetch returns fewer than 100 rows it exits the loop. Also, if I am just using arrays and not a cursor (SELECT statement), how do I set the LIMIT clause? As you can see in the code below, I am not using any FETCH.
For example, I have code like this.
Regards
anandkl
PROCEDURE add_cust
(
  p_cust_id_array         IN  NUM_VARRAY,
  p_words_array           IN  CHAR_VARRAY,
  p_cat_id_array          IN  NUM_VARRAY,
  p_no_of_occurence_array IN  NUM_VARRAY,
  p_error_code            OUT NUMBER,
  p_error_text            OUT VARCHAR2
)
IS
BEGIN
  p_error_code := 0;
  if p_cust_id_array is null OR p_words_array is null
     OR p_cat_id_array is null OR p_no_of_occurence_array is null then
    p_error_code := 1;
    -- no error has been raised here, so SQLERRM would be meaningless
    p_error_text := 'Could not insert KR_DATA: null input array';
    return;
  end if;
  -- 1. loop through varrays and insert into kr_data table..
  if p_cust_id_array.FIRST is not null then
    -- FORALL implicitly declares its own index; no separate declaration needed
    FORALL l_idx IN p_cust_id_array.FIRST..p_cust_id_array.LAST
      insert into cust_data_temp
        (cust_ID,
         WORDS,
         CATEG_ID,
         NO_OCCURENCE)
      VALUES
        (p_cust_id_array(l_idx),
         p_words_array(l_idx),
         p_cat_id_array(l_idx),
         p_no_of_occurence_array(l_idx));
  else
    p_error_code := 2;
    p_error_text := 'input size zero';
  end if;
EXCEPTION
  WHEN OTHERS THEN
    p_error_code := -1;
    p_error_text := 'Could not add cust data: ' || SQLERRM;
    ROLLBACK;
    RETURN;
END add_cust;
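When the data arrives as collection parameters rather than through a cursor, there is no FETCH and therefore no LIMIT clause; the equivalent is to drive FORALL over sub-ranges of the collection so that only one chunk is bound per statement execution. A minimal sketch of that idea (table, column, and variable names are made up; a real version would bind the procedure's parameter arrays):

```sql
DECLARE
  TYPE num_tab IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
  l_ids   num_tab;                -- hypothetical input collection
  l_chunk PLS_INTEGER := 100;     -- rows per FORALL execution
  l_lo    PLS_INTEGER;
  l_hi    PLS_INTEGER;
BEGIN
  FOR i IN 1 .. 250 LOOP          -- fabricate some sample data
    l_ids(i) := i;
  END LOOP;

  l_lo := 1;
  WHILE l_lo <= l_ids.COUNT LOOP
    l_hi := LEAST(l_lo + l_chunk - 1, l_ids.COUNT);
    -- FORALL over a sub-range binds at most l_chunk rows at a time,
    -- mimicking what LIMIT does for a cursor fetch.
    FORALL i IN l_lo .. l_hi
      UPDATE cust_data_temp
         SET no_occurence = no_occurence + 1
       WHERE cust_id = l_ids(i);
    l_lo := l_hi + 1;
  END LOOP;
END;
/
```

Note, though, that memory pressure from parameter arrays is set by the caller: chunking the FORALL bounds the per-statement bind size, but the whole collection is already in the PGA once it is passed in.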
-
-
Hi Pando,
Is it necessary to commit inside the bulk fetch loop to avoid the PGA error?
regards
anandkl
-
No, you don't have to commit.
-
Hi Pando,
I have a FORALL bulk update, but the updates were not committed.
Earlier I used COMMIT and was able to see the updated data.
Now I am trying to run the same stored procedure again without the COMMIT.
regards
anandkl
-
Hi Pando,
I am able to see the updated data in the session where I ran the bulk update procedure without a COMMIT in the stored procedure, but when I open a new session and query for the updated data, it shows null values.
So a COMMIT must be issued, inside or outside the bulk update stored procedure, for the changes to be visible across the entire database.
Please correct me if I am wrong.
regards
anandkl
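That reading matches Oracle's transaction model: uncommitted changes are visible only to the session that made them, and other sessions see them only after a COMMIT. A minimal sketch of committing once after the whole bulk update (collection, table, and column names are made up):

```sql
DECLARE
  TYPE num_tab IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
  l_ids num_tab;   -- hypothetical collection, assumed already populated
BEGIN
  l_ids(1) := 1;   -- sample data so the block is self-contained
  l_ids(2) := 2;
  FORALL i IN 1 .. l_ids.COUNT
    UPDATE cust_data_temp
       SET no_occurence = 0
     WHERE cust_id = l_ids(i);
  COMMIT;  -- one commit after the bulk update makes it visible to other sessions
END;
/
```

A single COMMIT at the end also keeps the update atomic, whereas committing every batch leaves the table half-updated if the procedure fails midway.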