
Hello DBAs:
Oracle 8.1.7 on IBM AIX (RS/6000)
Database size about 115GB
We have a table of about 14GB, with indexes worth about 16GB, so this table alone accounts for close to 30GB. How should I go about analyzing this table, and how can I find out how long it might take? Would there be performance implications when we analyze such a table? This is a very important table ("Inventory Audit") in the database and is used for logging every transaction in the business (a retail application).
Thanks, ST2000

Yep! This will most definitely affect performance during the process, and it sounds like it could take a long time.
Points to consider:
1) You should definitely use DBMS_STATS, as this allows the operation to run in parallel. Use the DEGREE parameter to specify the degree of parallelism.
2) Use an estimate, since this reduces the number of rows sampled to gather the statistics. I often use 15%, but with an extremely big table even that may be an extremely big sample, so you may want to go lower.
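A minimal sketch of such a DBMS_STATS call (the RETAIL owner and INVENTORY_AUDIT table names are assumptions, not from the thread; adjust the sample size and degree to suit your box):

```sql
-- Sketch only: schema/table names are hypothetical.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'RETAIL',
    tabname          => 'INVENTORY_AUDIT',
    estimate_percent => 5,      -- small sample; the table is ~14GB
    degree           => 4,      -- run the gather in parallel
    cascade          => TRUE);  -- gather stats on the indexes as well
END;
/
```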
An alternative, if you have the server space to do it, is to transfer the stats from another server, i.e.:
1) Build a clone of your database.
2) Calculate the stats on the clone.
3) Save (export) the stats to a stats table.
4) Export stats table.
5) Import stats table to production server.
6) Load (import) stats into data dictionary from stats table.
Disco!
This will require lots of space on a development box but it will mean that the stats gathering will be almost instant as far as the production environment is concerned.
I often use this method in reverse to make sure development boxes have the same stats as production boxes, hence similar execution plans.
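The clone/export/import steps above can be sketched with DBMS_STATS calls like these (the STATTAB stats table and the schema names are hypothetical):

```sql
-- On the clone: create a stats table, gather, then export into it.
EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT', stattab => 'STATTAB');
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'RETAIL', tabname => 'INVENTORY_AUDIT');
EXEC DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'RETAIL', tabname => 'INVENTORY_AUDIT', stattab => 'STATTAB', statown => 'SCOTT');

-- Move SCOTT.STATTAB to production with exp/imp, then on production:
EXEC DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'RETAIL', tabname => 'INVENTORY_AUDIT', stattab => 'STATTAB', statown => 'SCOTT');
```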
Cheers

Thanks for the reply Tim Hall..
Just a quick note, since a few people suggested sampling the statistics of the table.
Say I analyze 10% of the table (about 1.4GB):
ANALYZE TABLE <table> ESTIMATE STATISTICS SAMPLE 10 PERCENT;
If I make this a cron job that runs every night, will that ensure the entire table gets analyzed in 10 days or so, given that the table is dynamic as well? And what about the indexes part?
When I estimate 10% as mentioned, what should I be looking for?
Thanks, ST2000

Can anyone please shed some light here...
Thanks, ST2000

Hi,
I don't think you can analyze the full table this way, since the estimate percentage samples rows at random each time.
That's my understanding, anyway.
To analyze the full table, you have to use COMPUTE STATISTICS.
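For reference, a full computation looks like this (the table name is a placeholder); on a 14GB table it reads every row, so expect it to be slow:

```sql
ANALYZE TABLE inventory_audit COMPUTE STATISTICS;
```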
Vijay


Yep. It's a random selection of rows. The size of the estimate only matters insofar as it dictates the accuracy of the statistics. The larger the number of rows in the table, the smaller the estimate can be and still be representative.
Cheers

The best way to analyze a large table is to specify a sample size in rows, not a percentage: 10% of 1 million rows is very different from 10% of 50 million rows, and the former takes far less time than the latter.
If you just say ESTIMATE STATISTICS with no sample clause, Oracle samples only 1064 rows from the table; the number 1064 comes from a formula devised by a Russian mathematician. In my experience, this does not work out properly. (And you know why the Russian economy is performing poorly.)
I always specify the number of rows to be analyzed, and it has worked very well for me for the last 5 years.
The command is:
ANALYZE TABLE <table> ESTIMATE STATISTICS SAMPLE 50000 ROWS FOR TABLE FOR ALL INDEXED COLUMNS FOR ALL INDEXES;
A 50000-row sample is a fair number for representing a large table, or even a super-large one. My guess is Oracle would pick up at least one row from each block.

Thanks a lot Tamil. The table I am planning to analyze has about 45 million rows. I guess with your command I am analyzing the indexes too, which is what I am looking for as well.
ST2000

