
Thread: Considering performance, should I rebuild indexes and execute DBMS_UTILITY.ANALYZE_SCHEMA

  1. #1
    Join Date
    May 2002
    Posts
    34
    Considering performance, should I rebuild indexes and execute DBMS_UTILITY.ANALYZE_SCHEMA periodically?
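    For reference, the two operations I mean look roughly like this (the schema and index names are just placeholders):

    -- Gather statistics for one schema (ESTIMATE on a 20% sample here;
    -- COMPUTE is also possible but slower).
    EXEC DBMS_UTILITY.ANALYZE_SCHEMA('SCOTT', 'ESTIMATE', estimate_percent => 20)

    -- Rebuild one index.
    ALTER INDEX scott.emp_pk REBUILD;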

  2. #2
    Join Date
    Feb 2000
    Location
    Singapore
    Posts
    1,758
    It is advisable, especially if records are being deleted from the table.

    Sanjay

  3. #3
    Join Date
    Aug 2000
    Location
    Shanghai
    Posts
    433
    If you are using the rule-based optimizer, you don't need to analyze the schema. But if you are using the CHOOSE or cost-based optimizer, you should do that. The frequency depends on the transaction volume.
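    Since the right frequency depends on your transaction volume, one way to automate it is a DBMS_JOB. A minimal sketch, assuming the SCOTT schema and a weekly interval are appropriate:

    declare
       l_job binary_integer;
    begin
       dbms_job.submit(
          job       => l_job,
          what      => 'dbms_utility.analyze_schema(''SCOTT'', ''ESTIMATE'', estimate_percent => 20);',
          next_date => sysdate,
          interval  => 'trunc(sysdate) + 7'   -- weekly; tune to your DML volume
       );
       commit;
    end;
    /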
    1. www.dbasupport.com
    2. www.dbforums.com
    3. www.itpub.net
    4. www.csdn.net
    5. www.umlchina.com
    6. www.tek-tips.com
    7. www.cnforyou.com
    8. fm365.federal.com.cn
    9. www.programmersheaven.com
    10.http://msdn.microsoft.com/library/default.asp
    ligang1000@hotmail.com

  4. #4
    Join Date
    Aug 2000
    Location
    Belgium
    Posts
    342
    When you rebuild your indexes, you can add the COMPUTE STATISTICS clause.
    This will calculate the statistics while rebuilding the index.
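    For example (the index name is just a placeholder):

    ALTER INDEX scott.emp_pk REBUILD COMPUTE STATISTICS;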

    Gert

  5. #5
    Join Date
    Feb 2002
    Posts
    13
    Here is a query which can be of assistance in determining whether to rebuild an index. It is based on analyzed statistics:

    column index_name format a59
    select /*+ ordered */
           u.name || '.' || o.name index_name,
           substr(to_char(100 * i.rowcnt * (sum(h.avgcln) + 11) /
                  (i.leafcnt * (p.value - 66 - i.initrans * 24)), '999.00'), 2) || '%' density,
           floor((1 - i.pctfree$/100) * i.leafcnt - i.rowcnt * (sum(h.avgcln) + 11) /
                 (p.value - 66 - i.initrans * 24)) extra_blocks
      from sys.ind$ i,
           sys.icol$ ic,
           sys.hist_head$ h,
           (select kvisval value
              from x$kvis
             where kvistag = 'kcbbkl') p,
           sys.obj$ o,
           sys.user$ u
     where i.leafcnt > 1
       and i.type# in (1, 4, 6)   -- exclude special types
       and ic.obj# = i.obj#
       and h.obj# = i.bo#
       and h.intcol# = ic.intcol#
       and o.obj# = i.obj#
       and o.owner# != 0
       and u.user# = o.owner#
     group by u.name, o.name, i.rowcnt, i.leafcnt, i.initrans, i.pctfree$, p.value
    having 50 * i.rowcnt * (sum(h.avgcln) + 11)
           < (i.leafcnt * (p.value - 66 - i.initrans * 24)) * (50 - i.pctfree$)
       and floor((1 - i.pctfree$/100) * i.leafcnt - i.rowcnt * (sum(h.avgcln) + 11) /
                 (p.value - 66 - i.initrans * 24)) > 0
     order by 3 desc, 2
    /
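    If I read the arithmetic right, DENSITY estimates how tightly packed the index data would be relative to its current leaf blocks, and EXTRA_BLOCKS estimates how many leaf blocks a rebuild could reclaim. Note that it reads SYS-owned base tables and an X$ fixed table, so it generally has to be run as SYS. Indexes it flags can then be rebuilt with the COMPUTE STATISTICS clause mentioned above.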

  6. #6
    Join Date
    Feb 2001
    Posts
    290
    The answer is YES.

    If you are seeing high DML activity, it is always advisable to feed the cost-based optimizer, which uses the statistics on objects like tables and indexes.

    If your DB is running on 8i, Oracle suggests using the
    DBMS_STATS package instead of the ANALYZE command, for various advantages, and also to never analyze the SYS schema.
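    The DBMS_STATS call would look roughly like this (the schema name and sample size are just placeholders; CASCADE => TRUE also gathers statistics for the indexes on each table):

    BEGIN
       DBMS_STATS.GATHER_SCHEMA_STATS(
          ownname          => 'SCOTT',
          estimate_percent => 20,
          cascade          => TRUE
       );
    END;
    /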

    Basically, all the queries executed against the dictionary tables use the rule-based optimizer. I heard Oracle has plans to change this to cost-based; let's see.

    Hope this gives an idea >>>
    Madhu Reddy
    xdollor@yahoo.com
