I have a performance problem with a query. I get the right result, but it is very slow.
I have two tables with 500,000 rows each: hbm_matter and tbh_matter_summ. I wrote the query as follows.
From hbm_matter h, tbh_matter_summ s
Where h.matter_uno = s.matter_uno and
      s.billed_fees > 0 and
      _agent = '08588A'
It returns 134 rows with the correct information, but it takes about 2 minutes on the server, and even longer from the client. When the query runs, the join merges the two tables and then scans for the criteria. How can I make this query run faster? By using a subquery? Can anybody help me with this? Thanks in advance.
Also, what table is _agent in? It has no prefix, so I'll assume it is in the h table.
The previous responses are right: make sure both columns are indexed, then use EXPLAIN PLAN to confirm that both indexes are actually being used.
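For reference, the index creation and plan check might look like this in Oracle. The index names here are made up, and "_agent" is written exactly as posted (substitute the real column name, qualified with its table):

```sql
-- Hypothetical index names; adjust to your schema and naming conventions.
CREATE INDEX hbm_matter_agent_idx ON hbm_matter (_agent);
CREATE INDEX matter_summ_uno_idx  ON tbh_matter_summ (matter_uno);

-- Generate the execution plan for the slow query, then display it.
EXPLAIN PLAN FOR
SELECT *
  FROM hbm_matter h, tbh_matter_summ s
 WHERE h.matter_uno  = s.matter_uno
   AND s.billed_fees > 0
   AND h._agent      = '08588A';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

In the displayed plan, look for an INDEX RANGE SCAN on each index rather than a FULL TABLE SCAN of either table.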
Why are you testing > 0? Is this an actual restriction (some values are less than 0), or were you just trying to force the use of an index on a nullable column?
IF this is an actual restriction, and IF both columns are indexed, and IF the plan actually uses both indexes, THEN check how the tables are joined. The optimizer may have chosen a NESTED LOOPS join where a HASH JOIN would be better.
There are a lot of variables here, so I'm going out on a limb, but here's some background. I have found that when two large result sets are returned individually from the tables but their intersection is small, a hash join is often the fastest way to join them. The hint is USE_HASH(T1,T2), and I think the ORDERED hint is also necessary. Anyway, something to think about. Oh, and we have all assumed that the column you are joining on is the PK of each table and hence already indexed.
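As a sketch, the hinted version of the query above might look like this (assuming _agent belongs to hbm_matter, and with a made-up select list, since the original post omitted one). ORDERED tells the optimizer to join the tables in the order they appear in the FROM clause, and USE_HASH asks it to hash-join to the named alias:

```sql
SELECT /*+ ORDERED USE_HASH(s) */
       h.matter_uno, s.billed_fees
  FROM hbm_matter h, tbh_matter_summ s
 WHERE h.matter_uno  = s.matter_uno
   AND s.billed_fees > 0
   AND h._agent      = '08588A';
```

Re-run EXPLAIN PLAN afterwards to confirm the plan actually switched to a HASH JOIN; hints are requests, not commands, and the optimizer can ignore one it cannot honor.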