I need some assistance with this issue. We are running Oracle 11g.
We have a "working table" that is empty at the beginning of the day.
Then we start adding rows (insert) with a key column called STATE with a value of 100.
At the same time, other apps pick up data in state 100, process it, and change the state to 200 or 300.
Another app picks up data in state 200, processes it, and changes the state to 300 or 400.
So in summary: the table starts the day empty, then all rows are inserted in state 100, they gradually move through the other states (200, 300, etc.), and by the end of the day they are all in state 400.
My question is: what would be the best way to collect stats on this table?
I was thinking of creating an hourly job to collect stats on it:
exec dbms_stats.gather_table_stats (
ownname => 'SCOTT',
tabname => 'WORK_TABLE',
estimate_percent => dbms_stats.auto_sample_size,
degree => 3,
cascade => true );
Any suggestions? Should we also disable the standard Oracle automatic statistics job that runs at 10 PM?
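For what it's worth, if you decide to stop the nightly maintenance run, 11g exposes it through the DBMS_AUTO_TASK_ADMIN package. A sketch (the client name below is the standard one for the optimizer stats task; run as a privileged user):

-- Disable the 11g "auto optimizer stats collection" maintenance task,
-- which runs inside the nightly maintenance windows (typically 10 PM).
BEGIN
  DBMS_AUTO_TASK_ADMIN.DISABLE(
    client_name => 'auto optimizer stats collection',
    operation   => NULL,
    window_name => NULL);
END;
/

Note this disables automatic stats gathering for every table in the database; if the problem is only this one volatile table, locking its stats (so the auto job skips it) is the less drastic option.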
10-10-2013, 12:59 PM
IMHO, the best way to handle stats here is to delete the stats and then lock them on the table.
If the table starts and ends each day with zero rows and the cardinality changes
completely during the day, then gathered stats will never be correct for long.
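The delete-and-lock approach above would look like this (a sketch, reusing the SCOTT.WORK_TABLE names from the question):

-- Remove any existing stats, then lock them so that neither manual
-- gathers nor the nightly auto task will overwrite the (absent) stats.
BEGIN
  DBMS_STATS.DELETE_TABLE_STATS(ownname => 'SCOTT', tabname => 'WORK_TABLE');
  DBMS_STATS.LOCK_TABLE_STATS(ownname => 'SCOTT', tabname => 'WORK_TABLE');
END;
/

With no stats present, the optimizer falls back to dynamic sampling at parse time, which reads a small sample of the table as it looks right now.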
10-11-2013, 01:37 AM
I've faced this before - it happens on Data Warehouse staging tables that are truncated and reloaded every day.
The solution was to work out the right stats over time, just by watching how queries performed. Once we found the "right" set of statistics, we backed it up, locked the stats, and never gathered them again.
In some other cases we created fake statistics on a development database, then exported and imported them into the production table.
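The backup/lock and fake-stats workflows both go through the DBMS_STATS stat-table routines. A sketch, assuming the SCOTT.WORK_TABLE names from the question and a hypothetical stat table called STATS_BACKUP; the NUMROWS/NUMBLKS values are made-up examples of a "table at its busiest" profile:

-- On the source (or development) database: optionally set hand-crafted
-- stats, then export them into a portable stat table.
BEGIN
  DBMS_STATS.SET_TABLE_STATS(ownname => 'SCOTT', tabname => 'WORK_TABLE',
                             numrows => 1000000,   -- assumed peak row count
                             numblks => 20000);    -- assumed block count
  DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT', stattab => 'STATS_BACKUP');
  DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'SCOTT', tabname => 'WORK_TABLE',
                                stattab => 'STATS_BACKUP');
END;
/

-- On the production database (after moving STATS_BACKUP across, e.g.
-- with Data Pump): import the stats and lock them in place.
BEGIN
  DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'SCOTT', tabname => 'WORK_TABLE',
                                stattab => 'STATS_BACKUP');
  DBMS_STATS.LOCK_TABLE_STATS(ownname => 'SCOTT', tabname => 'WORK_TABLE');
END;
/

Once locked, a later import or gather against the table needs FORCE => TRUE to override the lock.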