Query runtime simulation in Test like PROD | Runtime prediction
I have a requirement as below.
We need to build an Oracle data simulator on 11g and 12c databases. This needs to be a framework that can predict query execution timing.
The case study is as below.
The TEST database normally has around 2 GB of data.
The production databases have 200 GB of source data.
We want to build a system that would run a query on the 2 GB of test data but predict its runtime against the 200 GB, simulating the production run statistics from the test database itself, without actually running the queries on production.
I hope I was able to explain the requirement. Can you please advise what should be done to achieve this?
Can this be done by exporting the statistics and simulating the data output/load? Please help me with the approach to build this solution.
If you have the disk space, you are better off using a database and server similar to your production database. How something performs depends on how the data is stored, indexed and queried, so no one will be able to give you exactly what you want. For example, if you have Enterprise Edition and a Data Guard standby database, you can open the standby every morning and use it as a read-only database, then every evening put it back into apply mode so that it can apply archive logs. Be careful that you don't let the database apply logs while it is open read only unless you have a license for Active Data Guard. If you are trying to compare the query timings of a 2 GB database to a 200 GB database, it's unlikely that you are going to get similar results.
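As a rough sketch of that daily standby cycle on an 11g/12c physical standby (run on the standby instance as SYSDBA; your environment and edition licensing may differ):

```sql
-- Morning: stop redo apply, then open the standby read only.
-- Without an Active Data Guard license, do NOT let the database
-- apply logs while it is open read only.
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
ALTER DATABASE OPEN READ ONLY;

-- Evening: put it back into apply mode so it catches up on archive logs.
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
```

With Active Data Guard (12c) the standby can stay open read only while redo apply continues, so the open/close cycle above is only needed without that license.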
You can try exporting the statistics from production and importing them into test, and that may help the optimizer choose production-like plans. You should lock the stats on all of the tables once you do that, so a gather job doesn't overwrite them. But there is no guarantee that it will give you the same results.
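A minimal sketch of that export/import/lock cycle using the DBMS_STATS package — the schema name APP and staging table name PROD_STATS are placeholders for your own names:

```sql
-- On production: create a staging table and export the schema's
-- optimizer statistics into it (APP / PROD_STATS are example names).
EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => 'APP', stattab => 'PROD_STATS');
EXEC DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'APP', stattab => 'PROD_STATS');

-- Move the PROD_STATS table from production to test,
-- e.g. with Data Pump (expdp/impdp).

-- On test: load the production statistics into the dictionary,
-- then lock them so the automatic stats job cannot replace them.
EXEC DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'APP', stattab => 'PROD_STATS');
EXEC DBMS_STATS.LOCK_SCHEMA_STATS(ownname => 'APP');
```

With production stats in place, the test optimizer should generate production-like execution plans, which you can inspect with EXPLAIN PLAN; actual elapsed times will still reflect the 2 GB of test data, not 200 GB.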