-
Per-row trigger to access data from other tables in the same session
Hi,
I am trying to set up a trigger on table A that reads data from table B and saves it to table C.
The inserts into tables A and B are done in the same session/commit.
It seems that the trigger on table A does not see the data that is committed to table B at that time.
Here is a sample test case:
create table t1 (
pk number,
t1c1 number,
t1c2 number
);
create table t2 (
pk number,
t2c1 number,
t2c2 number
);
create table t3 (
t3c1 number,
t3c2 number
);
create or replace
TRIGGER T_TEST_T1
after insert or update on t1
for each row
begin
  insert into t3 (t3c1, t3c2) values (:new.t1c1, :new.t1c2);
  insert into t3 (t3c1, t3c2) (select t2c1, t2c2 from t2 where pk = :new.pk);
exception
  -- note: this silently swallows any error; kept only for the test case
  WHEN OTHERS THEN
    null;
end;
/
insert into t1(pk,t1c1,t1c2) values (0,1,2);
insert into t2(pk,t2c1,t2c2) values (0,10,20);
commit;
select * from t3;
drop trigger t_test_t1;
drop table t1;
drop table t2;
drop table t3;
At this point table t3 should have 2 rows, but it only has 1: the trigger does not see the row that was inserted into t2, because when the trigger fires the row matched by
select t2c1, t2c2 from t2 where pk = :new.pk
does not exist yet.
T3C1 T3C2
---------- ----------
1 2
whereas I should see
T3C1 T3C2
---------- ----------
1 2
10 20
Is there any way to get this to work?
-
It seems to me that table "C" should be a view that queries data from tables "A" and "B", rather than storing duplicate data in a separate place. If the tables are properly set up, it should not cost much to query tables "A" and "B" directly, and it should be much cheaper than what you are proposing.
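For the test case above, a minimal sketch of that idea would look like this (assumption: the desired t3 content is the (t1c1, t1c2) pair from t1 plus the matching (t2c1, t2c2) pair from t2, matched on pk):

```sql
-- Sketch only: replaces the physical table t3 with a view over t1 and t2,
-- so no trigger is needed and nothing is stored twice.
create or replace view t3_v (t3c1, t3c2) as
select t1c1, t1c2 from t1
union all
select t2c1, t2c2
from t2
where pk in (select pk from t1);
```

Since the view is evaluated at query time, it always sees the current committed data and the intra-transaction ordering problem disappears.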
-
In fact the whole setup is much more complicated than this test case.
We are talking about a query that uses about 5 tables.
Here are the numbers of relevant rows per "test":
table 1: 1 row
table 2: 8 rows
table 3: 125 rows
table 4: 624 rows
table 5: 2197 rows
table 1 => table 2 => table 3 => table 4 => table 5
The data is not really duplicated, as it is calculated: a mix of avg, sum, max, min, and so on, consolidated into a "flat" table for reporting.
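Purely to illustrate the shape of that consolidation (all table and column names below are made up, since the real schema is not shown):

```sql
-- Hypothetical: roll the 5-level hierarchy up into one flat reporting row
-- per test, mixing several aggregates as described above.
insert into flat_report (test_id, avg_val, sum_val, max_val, min_val)
select t1.test_id,
       avg(t5.val), sum(t5.val), max(t5.val), min(t5.val)
from tab1 t1
join tab2 t2 on t2.t1_id = t1.id
join tab3 t3 on t3.t2_id = t2.id
join tab4 t4 on t4.t3_id = t3.id
join tab5 t5 on t5.t4_id = t4.id
group by t1.test_id;
```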
-
Why don't you use a materialized view? They can be set up to refresh incrementally. Using row-level triggers to update flat reporting tables can cause a lot of overhead. If you create materialized view logs on each of the 5 tables, you should then be able to create a materialized view with incremental (fast) refresh.
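A minimal sketch of that setup, using the t1/t2 tables from the test case above (the exact log options depend on your Oracle version and on the MV query; a fast-refresh join MV generally requires rowid-based logs and the rowids in the select list):

```sql
-- Materialized view logs, recording rowids so the join MV can fast-refresh.
create materialized view log on t1 with rowid;
create materialized view log on t2 with rowid;

-- Join materialized view that refreshes incrementally at commit time.
create materialized view t3_mv
refresh fast on commit
as
select a.rowid t1_rid, b.rowid t2_rid,
       a.t1c1, a.t1c2, b.t2c1, b.t2c2
from t1 a, t2 b
where a.pk = b.pk;
```

Because the refresh happens at commit, the MV never has the mid-transaction visibility problem the row trigger runs into.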
-
A friend of mine pointed out something about the order of events here:
insert 1 -> trigger fires -> insert 2
I have confirmed this by changing the inserts in my sample code as follows:
insert into T2(PK,T2C1,T2C2) values (0,30,40);
insert into t1(pk,t1c1,t1c2) values (0,1,2);
insert into T2(PK,T2C1,T2C2) values (0,10,20);
delete from T2 where T2C1=30;
commit;
and here is what I got:
1 rows inserted.
1 rows inserted.
1 rows inserted.
1 rows deleted.
committed.
T3C1 T3C2
---------- ----------
1 2
30 40
That seems to show that the commit has nothing to do with my trigger at this point. It also shows that any operation done before the triggering statement is visible to the trigger, but operations done after it, even before the commit, are not.
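Given that ordering, the simplest workaround for the original test case is to make sure the t2 row is inserted before the t1 row, so it is already visible when the row trigger on t1 fires:

```sql
-- Reordered inserts: the t2 row now exists before the trigger on t1 runs.
insert into t2(pk, t2c1, t2c2) values (0, 10, 20);
insert into t1(pk, t1c1, t1c2) values (0, 1, 2);
commit;
-- t3 now ends up with both rows: (1, 2) from :new, and (10, 20) from t2.
```

Of course this only helps when you control the order of the inserts; otherwise a view or an on-commit materialized view, as suggested above, sidesteps the problem entirely.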