-
Oracle Version: 8.1.7.0.0
Tables: statistics (7.5M records), employees (2000 records)
We have a PL/SQL procedure that is running very slowly, and we have determined that it is essentially carrying out the following (pseudo) code:
FOR each employee LOOP
    SELECT all stats WHERE stat_id IN (a, b, c) AND statistics.emp_id = employees.emp_id
    SELECT all stats WHERE stat_id IN (d, e, f) AND statistics.emp_id = employees.emp_id
END LOOP
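In real PL/SQL that loop presumably looks something like the following sketch (the table and column names, and the literal stat_id values, are assumptions based on the description):

```sql
DECLARE
    CURSOR emp_cur IS SELECT emp_id FROM employees;
BEGIN
    FOR emp IN emp_cur LOOP
        -- first group of statistics for this employee
        FOR s IN (SELECT * FROM statistics
                   WHERE stat_id IN ('a', 'b', 'c')
                     AND emp_id = emp.emp_id) LOOP
            NULL;  -- process row
        END LOOP;
        -- second group of statistics for the same employee
        FOR s IN (SELECT * FROM statistics
                   WHERE stat_id IN ('d', 'e', 'f')
                     AND emp_id = emp.emp_id) LOOP
            NULL;  -- process row
        END LOOP;
    END LOOP;
END;
/
```

With no index on emp_id, each of those inner queries scans the 7.5M-row table once per employee, which would explain the slowness.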
What we did since was to create two temporary tables. The first holds all statistics with a stat_id of a, b, or c (this takes the record count down to 100,000). The second holds all statistics with a stat_id of d, e, or f (a record count of 10!).
The script now runs noticeably faster, but still takes a long time (having to loop through 2000 employees is not nice!). I've since heard about Global Temporary Tables and PL/SQL tables. Would these provide any benefits in this situation, or are there any other features of 8i that would speed this procedure up?
Thanks in advance,
Matt.
-
GTTs are a solution only when the session that fills the table is the one that reads from it: Session 1 sees only its own rows in the GTT, and Session 2 cannot see what Session 1 has inserted, and vice versa. I've used GTTs for over a year; they have their benefits and drawbacks.
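For reference, a GTT is defined once (as a permanent object) and its rows are private to the inserting session. A minimal sketch, with table and column names assumed:

```sql
CREATE GLOBAL TEMPORARY TABLE emp_stats_tmp (
    emp_id  NUMBER,
    stat_id VARCHAR2(10),
    value   NUMBER
) ON COMMIT PRESERVE ROWS;  -- rows persist across COMMITs but disappear at session end
```

With ON COMMIT DELETE ROWS instead, the rows would vanish at each COMMIT, which rarely suits a batch procedure like this one.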
A PL/SQL table (a so-called pseudo-table, held in memory) is probably your solution.
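A minimal sketch of caching one group of statistics in a PL/SQL table so it is read from disk only once (column names and stat_id values are assumptions):

```sql
DECLARE
    CURSOR c IS
        SELECT emp_id, stat_id, value      -- column names assumed
          FROM statistics
         WHERE stat_id IN ('a', 'b', 'c');
    TYPE stat_tab IS TABLE OF c%ROWTYPE INDEX BY BINARY_INTEGER;
    abc_stats stat_tab;
    i BINARY_INTEGER := 0;
BEGIN
    FOR r IN c LOOP
        i := i + 1;
        abc_stats(i) := r;  -- cache the rows once, in memory
    END LOOP;
    -- later per-employee processing reads abc_stats instead of re-querying
END;
/
```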
But I suggest you check your indexes first if the code runs suspiciously slowly.
Oracle Certified Master
Oracle Certified Professional 6i,8i,9i,10g,11g,12c
email: ocp_9i@yahoo.com
-
I knew I'd forgotten something in that last post
There's an index on stat_id (nonunique) but not on emp_id, the argument being that this table is frequently updated (those 7.5M records were added in the last 90 days, approximately 85,000 per day). If we've already got an index on one field, will adding another index really hurt performance that badly? And will PL/SQL tables provide better performance than a properly indexed table?
Regarding GTTs: the sessions aren't important. This table will only ever be used by this procedure (which I'm assuming executes all its statements in the same session).
Thanks again.
Matt.
-
How often do you run this procedure? If you don't run it very often, you could create an index on emp_id and then drop it after the procedure has run. You should also use array processing (FORALL instead of FOR, dbms_sql.define_array, and so on).
As you have exact statistics about the number of rows, and probably the average row size, it's simple to create an index with storage parameters that offer precision, and therefore performance.
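A create-then-drop sketch along those lines (the index name and storage sizes are assumptions, not measured values):

```sql
CREATE INDEX statistics_emp_id_idx ON statistics (emp_id)
    STORAGE (INITIAL 100M NEXT 10M)  -- sized from the known row count and row size
    NOLOGGING;                       -- faster build, minimal redo

-- ... run the slow procedure here ...

DROP INDEX statistics_emp_id_idx;
```

Dropping the index afterwards avoids the per-insert maintenance cost on the 85,000 daily inserts while still giving the procedure an indexed access path.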
F.
-
I suggest you create the index on emp_id, analyze the tables and indexes involved, and run the query again.
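On 8i that can be done with ANALYZE (DBMS_STATS is also available); for example, with the index name assumed:

```sql
ANALYZE TABLE statistics ESTIMATE STATISTICS SAMPLE 10 PERCENT;  -- large table: sample
ANALYZE TABLE employees COMPUTE STATISTICS;                      -- small table: exact
ANALYZE INDEX statistics_emp_id_idx COMPUTE STATISTICS;          -- index name assumed
```

Fresh statistics let the cost-based optimizer choose the new index where it helps.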
Oracle Certified Master
Oracle Certified Professional 6i,8i,9i,10g,11g,12c
email: ocp_9i@yahoo.com
-
Examine BULK COLLECT and FORALL in PL/SQL.
I am in the process of improving the performance of a batch job that currently takes 72 hours. A dramatic performance improvement was achieved using the above features.
I found the BULK COLLECT feature in PL/SQL very useful.
A sample program is given below:
rem
rem bulk_insert.sql
rem
rem Author Tamilselvan
rem
rem Email: tamilselvang@hotmail.com
rem
rem History
rem ------------------------------------------------------------
rem Date Author Description
rem --------------------------------------------------------------
rem Mar 04, 2002 Tamilselvan Testing bulk collect, bulk insert
rem options in PL/SQL procedure
rem You need to create 2 tables, test2
rem and test3 with columns ID NUMBER ,
rem and NAME VARCHAR2(30).
rem --------------------------------------------------------------
set serverout on
declare
    TYPE IdTab   IS TABLE OF test2.id%TYPE;
    TYPE NameTab IS TABLE OF test2.name%TYPE;
    pid   IdTab;
    pname NameTab;
    t1 NUMBER;
    t2 NUMBER;
    t3 NUMBER;
    t4 NUMBER;
    t5 NUMBER;
    -- To restrict the number of rows per fetch, use the LIMIT clause
    -- with a counter such as numrows below.
    -- A test procedure will be developed later; please contact me later.
    numrows NATURAL := 10000;

    PROCEDURE get_time (t OUT NUMBER) IS
    BEGIN
        -- seconds past midnight
        SELECT TO_NUMBER(TO_CHAR(SYSDATE, 'SSSSS')) INTO t FROM dual;
    END;
begin
    execute immediate 'truncate table test2';
    execute immediate 'truncate table test3';
    get_time(t1);
    -- FOR .. LOOP to insert rows into the test2 table
    FOR j IN 1..100000 LOOP
        INSERT INTO test2 VALUES (j, 'AAAAA' || TO_CHAR(j));
    END LOOP;
    COMMIT;
    get_time(t2);
    -- BULK COLLECT all 100000 rows from the test2 table
    SELECT id, name BULK COLLECT INTO pid, pname FROM test2;
    get_time(t3);
    -- Row-by-row insert into test3 from the collected rows
    FOR j IN 1..100000 LOOP
        INSERT INTO test3 VALUES (pid(j), pname(j));
    END LOOP;
    get_time(t4);
    -- Bulk insert into test3 using FORALL (bulk bind)
    FORALL j IN 1..100000
        INSERT INTO test3 VALUES (pid(j), pname(j));
    get_time(t5);
    commit;
    DBMS_OUTPUT.PUT_LINE('FOR LOOP to insert 100000 rows in test2 table    : ' || TO_CHAR(t2 - t1) || ' Sec');
    DBMS_OUTPUT.PUT_LINE('BULK COLLECT to get 100000 rows from test2 table : ' || TO_CHAR(t3 - t2) || ' Sec');
    DBMS_OUTPUT.PUT_LINE('FOR LOOP to insert 100000 rows in test3 table    : ' || TO_CHAR(t4 - t3) || ' Sec');
    DBMS_OUTPUT.PUT_LINE('FORALL to insert 100000 rows in test3 table      : ' || TO_CHAR(t5 - t4) || ' Sec');
end;
/