We have a database with around 500 tables, and they are only in first normal form, which is why performance in the production environment is so poor. Full table scans are common. The biggest table has 264 columns and 15 indexes covering 108 of those columns. As I understand it, the original developers tried to build the database as essentially one table, just adding new columns and indexes as they went. Our product is huge, and rebuilding everything is very difficult because it would all have to be done from scratch. Sort operations run to around 15M, and the number of open cursors exceeds 450 for a single user.
The problem is this: I don't know how to improve performance without rebuilding the whole structure, and I don't know how to explain that to my boss.
Any advice would be very welcome.
[Edited by kgb on 11-07-2001 at 12:44 PM]
Justify the cost of further normalising the database vs. the long-term performance gains.
If staying online is an issue, i.e. availability of 99.999% or similar, most hardware vendors will provide machines on a short-term lease arrangement. The added benefit is that most of the data migration and the rebuilding/testing can then happen in parallel.
If you have a spare machine, go ahead and normalise the application tables, then run some benchmark queries and record the results. If you have time, also run some queries that would be impossible against the current database.
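As a rough sketch of what those benchmarks might look like (table and column names here are hypothetical, not from your schema): time the same report-style query against the original wide table and against its normalised counterpart.

```sql
-- Show elapsed time for each statement in SQL*Plus
SET TIMING ON

-- Against the original wide table (264 columns): likely a full scan
SELECT cust_id, order_total
  FROM big_table
 WHERE order_date > SYSDATE - 30;

-- Against the normalised copy: a narrow orders table,
-- assumed to be indexed on order_date
SELECT cust_id, order_total
  FROM orders
 WHERE order_date > SYSDATE - 30;
```

Recording the two timings side by side for a handful of representative queries is exactly the "hard data" your boss will respond to.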
This will show your boss:
1) You are proactive in tackling issues
2) Hard data on timings = time saved = $$$ saved.
3) The extra information available by being in the new structure.
The problem with the original table structure can be solved by either views or materialised views, depending on your Oracle version, i.e. the application will never know the difference.
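To illustrate the idea (all object names here are made up for the example): rename the wide table, split it into normalised tables, and put a view with the old name on top so the application keeps working unchanged. A materialised view is the same idea with the join results stored physically, which Oracle can refresh on a schedule.

```sql
-- Plain view: the app still selects from CUSTOMER as before,
-- but the data now lives in normalised tables underneath.
CREATE OR REPLACE VIEW customer AS
SELECT c.cust_id,
       c.cust_name,
       a.street,
       a.city,
       p.phone_no
  FROM customer_base    c
  JOIN customer_address a ON a.cust_id = c.cust_id
  JOIN customer_phone   p ON p.cust_id = c.cust_id;

-- Materialised view alternative (Oracle 8i+; earlier versions
-- call these snapshots): the join is precomputed and refreshed
-- on demand, trading freshness for read speed.
CREATE MATERIALIZED VIEW customer_mv
  REFRESH COMPLETE ON DEMAND
  AS
SELECT c.cust_id,
       c.cust_name,
       a.street,
       a.city,
       p.phone_no
  FROM customer_base    c
  JOIN customer_address a ON a.cust_id = c.cust_id
  JOIN customer_phone   p ON p.cust_id = c.cust_id;
```

Note that fast, incremental refresh of a join materialised view needs materialised view logs on the base tables; a complete refresh as shown avoids that setup cost.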
This also gives you more time to recode the app if required.