DBAsupport.com Forums - Powered by vBulletin
  1. #1
    Join Date
    Apr 2009

    Increase SELECT query performance without using indexing

    Dear All,

    I have a SELECT query that fetches data from four tables, each holding roughly 90,000 to 100,000 (1 lakh) records. It is a long query with multiple inner joins between the tables, and it is taking a very long time to return data. The problem has two parts:
    1. Three of the four tables have no index and no primary key. I know indexing would improve performance, but I cannot index these tables; in particular, I cannot add a new column of unique values to tables of this size to serve as a primary key. I tried a non-unique (since unique is not possible), non-clustered index, which was not of much help.
    So I need some other way to improve the query's performance, without indexing.

    2. The other problem: after waiting at least 30 to 45 minutes for the query to finish in TOAD or SQL Developer, it takes far longer still to get the record count or to export the data to Excel. It has now been 4 hours since I started the export to Excel and it still has not completed. I am sure the database and system have not hung, because the displayed count of fetched rows keeps increasing with time; it is at 1,500 rows after 4 hours, and I expect the total to be between 25,000 and 30,000. Please tell me if there is a faster way to export the data to Excel or another format.

    I will be really grateful for any assistance with these two questions.

  2. #2
    Join Date
    Sep 2002
    You can't magically increase performance by not using indexes.

    You know what the solution is - create indexes.
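    For illustration, a plain non-unique B-tree index on each join column is usually all that is needed - no primary key or unique column is required. The table and column names below are placeholders, not from the original post; substitute the actual join columns:

    ```sql
    -- Non-unique indexes on the columns used in the joins.
    -- Table and column names are placeholders -- use your own.
    CREATE INDEX orders_cust_ix ON orders (customer_id);
    CREATE INDEX items_order_ix ON order_items (order_id);

    -- Refresh optimizer statistics so the new indexes are considered.
    EXEC DBMS_STATS.GATHER_TABLE_STATS(user, 'ORDERS', cascade => TRUE);
    ```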

  3. #3
    Join Date
    Apr 2009
    As I said, I have already tried non-unique, non-clustered indexing and it was not of much help; unique indexing is the last option.

    I want to know other possible solutions, if any.

    And can anyone please respond to the second question, i.e. how to export data to Excel faster?

  4. #4
    Join Date
    Mar 2006
    Charlotte, NC
    I agree with Davey.

    Depending on the CPU count on the server, you can use a parallel hint.
    You can also alter the session and set the db_file_multiblock_read_count parameter to a higher value.

    The above suggestions may slightly improve response times, but creating proper indexes is the correct way.
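    As a rough sketch of the two suggestions above (the table names, columns, and degree of parallelism are placeholders; pick a degree appropriate to your CPU count):

    ```sql
    -- Larger multiblock reads speed up the full table scans this query does.
    ALTER SESSION SET db_file_multiblock_read_count = 128;

    -- PARALLEL hint; the degree (4) is a placeholder -- size it to your CPUs.
    SELECT /*+ PARALLEL(t1, 4) PARALLEL(t2, 4) */
           t1.col_a, t2.col_b
    FROM   big_table_1 t1
    INNER JOIN big_table_2 t2 ON t2.t1_id = t1.id;
    ```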

    I have no idea about your second question because I have never used TOAD.

    Vijay Tummala

    Try hard to get what you like OR you will be forced to like what you get.

  5. #5
    Join Date
    Mar 2007
    Ft. Lauderdale, FL
    Quote Originally Posted by kanika View Post
    As I said, I have already tried non-unique, non-clustered indexing and it was not of much help; unique indexing is the last option.
    Either the indexing strategy is wrong, or the business requirements call for processing a high percentage of the rows in each table.

    Have you traced the query? Do you know what it is doing?

    On the other hand ...

    1- The only option to increase performance in such a scenario is to resort to a parallel hint.

    2- 1 lac is 100,000, isn't it? ... A couple of tables in our shop have populations in excess of 2 billion rows and we have no issues with indexes; I am not sure why it is not doable in your case.
    Last edited by PAVB; 04-30-2010 at 08:06 AM. Reason: typo
    Pablo (Paul) Berzukov

    Author of Understanding Database Administration available at amazon and other bookstores.

    Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are given. Since the advisor explicitly encourages testing any and all suggestions in a non-production test environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
