DBAsupport.com Forums - Powered by vBulletin
Page 2 of 3
Results 11 to 20 of 24

Thread: statspack report

  1. #11
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,092
    Originally posted by mike2000

    The main question is why Oracle sometimes picks up the indexes and sometimes does not?
    http://metalink.oracle.com/metalink/...T&p_id=67522.1


    Is rebuilding the indexes going to help?

    No.


    Could you please tell me why, and how to fix these problems?

    Devise a proper index strategy or force the use of indexes with hints. In some severe cases you can nudge the optimizer to favor indexes over a full scan, but in your case that would probably cause more problems than it would help.

    I get the feeling you are looking for a quick fix for your problems. There's no silver bullet for making poorly written queries go faster; they need to be rewritten. I know it's a lot of work to go back and make them better, but unfortunately it's one of those things that has to be done sometimes.
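    If you do go the hint route, it looks something like this (the table and index names here are made up, so substitute your own):

    ```sql
    -- Ask the optimizer to use a specific index instead of a full scan.
    -- The alias in the hint must match the alias in the FROM clause.
    SELECT /*+ INDEX(e emp_name_idx) */ *
      FROM emp e
     WHERE e.ename = 'SMITH';
    ```

    But keep in mind a hint only papers over the problem; the query itself is still the thing to fix.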
    Jeff Hunter

  2. #12
    Join Date
    Sep 2002
    Posts
    411
    Thanks, all, for the valuable advice.

    Is it the good thing for the client to run analyze on the tables every night.

  3. #13
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,092
    Originally posted by mike2000
    Is it the good thing for the client to run analyze on the tables every night.
    Is that a question or a statement?
    Jeff Hunter

  4. #14
    Join Date
    Sep 2002
    Posts
    411
    Jeff,

    It's a question b/c I am not sure if this is a good idea

  5. #15
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,092
    The biggest question you have to ask yourself is "How often does the data change?" If a significant portion of the data changes in one day, then yes it's probably a good thing to analyze every night. If it is a typical OLTP system where maybe 1% gets added every day, then analyzing once a week should be sufficient.
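    If the client does end up analyzing nightly, the usual way to script it is with DBMS_STATS, the supported interface for gathering statistics from 8i onward (the schema name below is only an example):

    ```sql
    -- Gather statistics for everything in one schema; CASCADE => TRUE
    -- also gathers statistics on the indexes. Schema name is illustrative.
    BEGIN
      DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'SCOTT', cascade => TRUE);
    END;
    /
    ```

    Put that in a script and run it off-hours from cron or DBMS_JOB.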
    Jeff Hunter

  6. #16
    Join Date
    Sep 2002
    Posts
    411
    Jeff,

    Again, thanks for the advice.

    Another question, and I am not too clear about this, so if you could advise or give me a link I can read, that would be greatly appreciated.

    When we analyze the tables in the database, should we analyze with COMPUTE STATISTICS or ESTIMATE STATISTICS?

    Thanks a million.

  7. #17
    Join Date
    Mar 2002
    Location
    Manchester, England
    Posts
    202
    Look at it this way: you analyse your tables to give your database an idea of what the data 'looks like'. If some tables don't change, then you shouldn't have to re-analyse them (although if you get the chance it won't hurt), but if some tables have volatile data, they should be done more often (I think SQL Server does it when over 20% of the data in a table has changed, when auto stats is set to on)*. However, I'm not too sure about the pros and cons of ESTIMATE vs COMPUTE, so I'm interested in some insight too :-)

    Chris

    *It's been a while since I touched it, so that could be rubbish :-)

  8. #18
    Join Date
    Nov 2000
    Location
    greenwich.ct.us
    Posts
    9,092
    ESTIMATE is just that: an estimate of the statistics. ESTIMATE looks at a certain portion of your data and calculates your statistics based on that sample. COMPUTE, on the other hand, looks at your entire table for its statistics. I encourage you to read http://download-west.oracle.com/docs..._opt.htm#16946 for more information.
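    For reference, the two forms look like this (the table name is just an example):

    ```sql
    -- COMPUTE reads the whole table: exact statistics, but more time and resources
    ANALYZE TABLE emp COMPUTE STATISTICS;

    -- ESTIMATE samples a portion of the rows: faster, approximate
    ANALYZE TABLE emp ESTIMATE STATISTICS SAMPLE 10 PERCENT;
    ```

    On big tables a reasonable sample is usually good enough and much cheaper than COMPUTE.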
    Jeff Hunter

  9. #19
    Join Date
    Jun 2001
    Location
    Helsinki, Finland
    Posts
    3,938
    Originally posted by hrishy
    No. A higher db_file_multiblock_read_count just causes the optimizer to think that the cost of a full table scan is cheaper, that's it. If you lower it from 32 to 16, the cost-based optimizer will consider full table scans costlier than before. Whether your indexes will be used or not is another question altogether. You will have to alter the parameter, take another statspack report, and post it here.

    regards
    Hrishy
    That is true.

    Something to add: on most OSs (Solaris, for example) the maximum I/O size is 64K. So if your Oracle block size is 8K, even if db_file_multiblock_read_count is set to 16, Oracle will use a value of 8, as at most 64K can be read in a single read.

    The optimal value for db_file_multiblock_read_count is the OS I/O size divided by the Oracle block size.
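    So with a 64K OS I/O limit and an 8K Oracle block size, that works out to 64/8 = 8. You can try it at the session level first before touching the instance:

    ```sql
    -- 64K max OS I/O size / 8K Oracle block size = 8 blocks per multiblock read
    ALTER SESSION SET db_file_multiblock_read_count = 8;
    ```

    Then re-run your queries (or another statspack snapshot) and compare before changing it system-wide.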
    Oracle Certified Master
    Oracle Certified Professional 6i,8i,9i,10g,11g,12c
    email: ocp_9i@yahoo.com

  10. #20
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    Soft Parse %: 63.25
    In general this should be more than 95%. A low value indicates that your application does not use bind variables.
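    For anyone unclear on what that means, compare these two in SQL*Plus (the table is made up):

    ```sql
    -- Literal: every distinct value is a new statement -> hard parse each time
    SELECT * FROM orders WHERE order_id = 1001;

    -- Bind variable: one shared statement -> soft parse after the first execution
    VARIABLE oid NUMBER
    EXEC :oid := 1001
    SELECT * FROM orders WHERE order_id = :oid;
    ```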


    Shared Pool Statistics Begin End
    ------ ------
    Memory Usage %: 72.26 90.46
    This is a good indicator of your shared pool memory. Usage should not drop below 70%; if it is below 70, you are operating with a large amount of free memory in the shared pool. It also should not exceed 90%. Hence, you do not need to change the value.

    % SQL with executions>1: 59.26 48.01

    I do not know your application (it could be OLTP or DSS or DW), but this indicates there are many single-use SQL statements in the library cache; your application does not use bind variables. If your application is DSS/DW, then it is OK. But if it is OLTP, then start looking into your code.

    % Memory for SQL w/exec>1: 42.20 35.04
    This value is questionable, hence I do not want to comment on it.

    I/O
    Regarding SCATTERED READ and SEQUENTIAL READ, others have already made their suggestions.
    It is better to query V$FILESTAT to get more info about the read and write activity on the data files.
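    For example, joining to V$DATAFILE to get readable file names:

    ```sql
    -- Physical reads/writes and timings per data file, busiest files first
    SELECT df.name, fs.phyrds, fs.phywrts, fs.readtim, fs.writetim
      FROM v$filestat fs, v$datafile df
     WHERE fs.file# = df.file#
     ORDER BY fs.phyrds DESC;
    ```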
