DBAsupport.com Forums - Powered by vBulletin

Thread: Ora-04030:

  1. #1
    Join Date
    Aug 2001
    Location
    chennai,bangalore
    Posts
    840

    Ora-04030:

    Hi Friends,
    I am using BULK COLLECT and BULK UPDATE in a stored proc that selects 1.5 million records and updates them. When I run this stored proc I get the error below:

    ORA-04030: out of process memory when trying to allocate 283468
    bytes (callheap,DARWIN).

    I am running Oracle 8.1.7.4 on Solaris 2.8 and have 22 GB of physical RAM.

    I went to MetaLink and found some suggested solutions, such as increasing the ulimit UNIX resource limits for the process and reducing SORT_AREA_SIZE.

    I tried these, but it is still failing.

    ulimit -a on my UNIX box shows:
    time(seconds) unlimited
    file(blocks) unlimited
    data(kbytes) unlimited
    stack(kbytes) 119200 -- this was 8 KB earlier
    coredump(blocks) 0
    nofiles(descriptors) 1024
    memory(kbytes) unlimited

    and sort_area_size for the session is set to 1 MB.
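
    (For what it's worth, session PGA growth can be watched while the proc runs with something like the following query against the standard v$ views; the statistic names are the ones 8i uses:)

    select s.sid, n.name, st.value
      from v$session s, v$sesstat st, v$statname n
     where st.sid = s.sid
       and st.statistic# = n.statistic#
       and n.name in ('session pga memory', 'session pga memory max')
     order by s.sid, n.name;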

    Can any of you suggest a solution if you have encountered the same problem and fixed it?
    anandkl

  2. #2
    Join Date
    Jun 2000
    Location
    Madrid, Spain
    Posts
    7,447
    When you use BULK COLLECT you should limit your array size, i.e. instead of collecting 1.5 million rows and processing them all at once, you should do array fetches of 100 rows.

    something like

    FETCH ... BULK COLLECT INTO ... [LIMIT rows];

    If you try to store 1.5 million rows in an array you will probably get worse performance, an oversized PGA, or exactly the problem you are having: out of memory.
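
    A minimal sketch of that pattern (the cursor query, table and update here are placeholders, not the actual statements):

    DECLARE
       CURSOR c IS
          SELECT cust_id FROM cust_data_temp;         -- placeholder query
       TYPE num_tab IS TABLE OF NUMBER INDEX BY BINARY_INTEGER;
       l_ids num_tab;
    BEGIN
       OPEN c;
       LOOP
          FETCH c BULK COLLECT INTO l_ids LIMIT 100;  -- at most 100 rows per fetch
          EXIT WHEN l_ids.COUNT = 0;                  -- nothing fetched, all done
          FORALL i IN 1 .. l_ids.COUNT                -- bulk update this batch
             UPDATE cust_data_temp
                SET no_occurence = 0                  -- placeholder update
              WHERE cust_id = l_ids(i);
       END LOOP;
       CLOSE c;
       COMMIT;
    END;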

  3. #3
    Join Date
    Aug 2001
    Location
    chennai,bangalore
    Posts
    840
    Hi Pando,
    That was really awesome; the 1.5 million rows got updated in 12 minutes.
    But I have one question: if I set the LIMIT and the cursor cannot fetch a full LIMIT's worth of rows inside the loop, it exits the loop and those records do not get updated. How do I take care of those records?

    Thanks a lot once again.

    Regards
    anandkl

  4. #4
    Join Date
    Jun 2000
    Location
    Madrid, Spain
    Posts
    7,447
    Hmm, are you sure? I remember when I used it last year it fetched everything, even when the number of rows in the last fetch was not 100.
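
    (Whether the last, partial batch gets processed depends on where the exit test sits; a sketch, where c and l_rows stand in for the cursor and collection:)

    LOOP
       FETCH c BULK COLLECT INTO l_rows LIMIT 100;
       EXIT WHEN l_rows.COUNT = 0;  -- exits only when nothing was fetched,
                                    -- so a final batch of < 100 rows is still processed
       -- process l_rows here (FORALL update, etc.)
    END LOOP;
    -- By contrast, "EXIT WHEN c%NOTFOUND;" placed straight after the FETCH
    -- skips processing the final batch of fewer than 100 rows.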

  5. #5
    Join Date
    Aug 2001
    Location
    chennai,bangalore
    Posts
    840
    Hi Pando,
    Yes, if the last fetch returns fewer than 100 rows it exits the loop. Also, if I am just using arrays and not using any cursor (SELECT statement), how do I set the LIMIT clause? As you can see in the code below, I am not using any FETCH at all.
    For example, I have code like this.

    Regards
    anandkl


    PROCEDURE add_cust
    (
       p_cust_id_array         IN  NUM_VARRAY,
       p_words_array           IN  CHAR_VARRAY,
       p_cat_id_array          IN  NUM_VARRAY,
       p_no_of_occurence_array IN  NUM_VARRAY,

       p_error_code            OUT NUMBER,
       p_error_text            OUT VARCHAR2
    )
    IS
       l_idx number;
    BEGIN
       p_error_code := 0;

       if p_cust_id_array is null OR p_words_array is null
          OR p_cat_id_array is null OR p_no_of_occurence_array is null then
          p_error_code := 1;
          p_error_text := 'Could not insert KR_DATA ' || SQLERRM;
          return;
       end if;

       -- 1. loop through varrays and insert into kr_data table..
       if p_cust_id_array.FIRST is not null then
          FORALL l_idx IN p_cust_id_array.FIRST..p_cust_id_array.LAST
             insert into cust_data_temp
                (cust_ID,
                 WORDS,
                 CATEG_ID,
                 NO_OCCURENCE)
             VALUES
                (p_cust_id_array(l_idx),
                 p_words_array(l_idx),
                 p_cat_id_array(l_idx),
                 p_no_of_occurence_array(l_idx));
       else
          p_error_code := 2;
          p_error_text := 'input size zero ' || SQLERRM;
       end if;

    EXCEPTION
       WHEN OTHERS THEN
          p_error_code := -1;
          p_error_text := 'could Not Add cust Datas ' || SQLERRM;
          ROLLBACK;
          RETURN;

    END add_cust;
    anandkl
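
    (LIMIT only applies to a BULK COLLECT fetch; with collections that arrive already populated, one option is to bound the FORALL range and walk the arrays in slices. The slices do not shrink the memory the caller already used to build the arrays, but they do cap the size of each bulk bind. A sketch, as a hypothetical add_cust_in_chunks reusing the same NUM_VARRAY / CHAR_VARRAY types and cust_data_temp table, with a made-up chunk size of 100:)

    PROCEDURE add_cust_in_chunks
    (
       p_cust_id_array         IN NUM_VARRAY,
       p_words_array           IN CHAR_VARRAY,
       p_cat_id_array          IN NUM_VARRAY,
       p_no_of_occurence_array IN NUM_VARRAY
    )
    IS
       c_chunk CONSTANT PLS_INTEGER := 100;            -- made-up slice size
       l_lo    PLS_INTEGER;
       l_hi    PLS_INTEGER;
    BEGIN
       IF p_cust_id_array IS NULL THEN
          RETURN;                                      -- nothing to insert
       END IF;
       l_lo := p_cust_id_array.FIRST;                  -- varrays are dense, so this is 1
       WHILE l_lo IS NOT NULL AND l_lo <= p_cust_id_array.LAST LOOP
          l_hi := LEAST(l_lo + c_chunk - 1, p_cust_id_array.LAST);
          FORALL i IN l_lo .. l_hi                     -- bind only this slice
             INSERT INTO cust_data_temp
                (cust_ID, WORDS, CATEG_ID, NO_OCCURENCE)
             VALUES
                (p_cust_id_array(i), p_words_array(i),
                 p_cat_id_array(i), p_no_of_occurence_array(i));
          l_lo := l_hi + 1;
       END LOOP;
    END add_cust_in_chunks;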

  6. #6
    Join Date
    Jun 2000
    Location
    Madrid, Spain
    Posts
    7,447

  7. #7
    Join Date
    Aug 2001
    Location
    chennai,bangalore
    Posts
    840
    Hi Pando,
    Is it necessary to commit inside the bulk fetch loop to avoid the PGA error?

    regards
    anandkl

  8. #8
    Join Date
    Jun 2000
    Location
    Madrid, Spain
    Posts
    7,447
    No, you don't have to commit.

  9. #9
    Join Date
    Aug 2001
    Location
    chennai,bangalore
    Posts
    840
    Hi Pando,
    I have a FORALL bulk update, but the updates were not committed.

    Earlier I used a COMMIT and was able to see the updated info.

    Now I am trying to run the same stored proc again without a commit.



    regards
    anandkl

  10. #10
    Join Date
    Aug 2001
    Location
    chennai,bangalore
    Posts
    840
    Hi Pando,
    I am able to see the updated info in the session where I run the BULK update procedure without a COMMIT in the stored proc, but when I open a new session and query for the updated info, it shows null values.

    So a COMMIT must be issued, either inside or outside the BULK update stored proc, for the changes to be visible across the entire database.

    Please correct me if I am wrong.
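
    (That is standard behaviour: uncommitted changes are visible only in the session that made them. A minimal sketch of committing from the calling block instead of inside the proc; the varray constructor values below are just made-up sample data:)

    DECLARE
       l_code NUMBER;
       l_text VARCHAR2(4000);
    BEGIN
       add_cust(NUM_VARRAY(101, 102),
                CHAR_VARRAY('foo', 'bar'),
                NUM_VARRAY(1, 2),
                NUM_VARRAY(5, 7),
                l_code, l_text);
       COMMIT;   -- without this, other sessions never see the new rows
    END;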

    regards
    anandkl
