DBAsupport.com Forums - Powered by vBulletin

Thread: Capacity planning and performance issue


  1. #1
    Join Date
    Sep 2006
    Location
    London
    Posts
    58

    Capacity planning and performance issue

    Hi guys,

    Can you please give your comments, suggestions and recommendations on the following issue?

    We are expecting about 1 GB of data per day to arrive in one of the tables in our database, transferred by FTP from another server. The table would gradually grow to around 30 GB over a period of a few months.

    The data would be transferred by a UNIX script.

    The table structure is enclosed here for your reference

    Column Name Column Type
    col1 Number(15)
    col2 Varchar2(255)
    col3 Varchar2(255)
    col4 Varchar2(255)
    col5 Varchar2(255)
    col6 Number(4,2)
    col7 Number(10)
    col8 Date
    col9 Number(8,2)
    col10 Number(8,2)
    col11 Date
    col12 Varchar2(3)

    What storage parameters can I recommend, at the table or tablespace level, to accommodate this high rate of growth?

    How can I assess the performance impact on the database in advance, i.e. CPU, memory, I/O etc. (perhaps using iostat, vmstat, mpstat, sar)?
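    (Not from the thread; the intervals, filenames and the `iostat -x` flag are just examples and vary by platform.) One way to answer this yourself is to capture an OS baseline now, then rerun the same commands during a trial load and compare:

```shell
# Capture a baseline before the feed goes live; rerun the same
# commands during a test load and compare the two sets of output.
# 12 samples at 5-second intervals is an arbitrary choice.
sar -u 5 12     > cpu_baseline.txt    # CPU utilisation
vmstat 5 12     > mem_baseline.txt    # memory, paging, run queue
iostat -x 5 12  > io_baseline.txt    # per-device I/O (flags differ between Linux and Solaris)
```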

    Put another way, what would the impact be on CPU, I/O and memory?

    What would be the impact on the overall database?

    What precautions are required to keep the database running efficiently?

    What else should be taken into consideration before downloading or transferring data from another server?
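    As a starting point for the storage question, a back-of-the-envelope row count can be derived from the feed size. The 300-byte average row below is an assumption, not a figure from the thread; the VARCHAR2(255) columns are variable-length and will rarely be full, so measure a real sample before committing to anything:

```shell
# Back-of-the-envelope: rows/day implied by a 1 GB daily feed,
# assuming an average stored row of ~300 bytes (a guess).
daily_bytes=$((1024 * 1024 * 1024))   # 1 GB/day from the FTP feed
avg_row_bytes=300                     # assumed average stored row size
echo "approx $((daily_bytes / avg_row_bytes)) rows/day"
```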

  2. #2
    Join Date
    Mar 2002
    Posts
    534
    What do you plan to do with this data once it is loaded?
    or
    What kind of application will be using this data?

  3. #3
    Join Date
    Sep 2002
    Location
    England
    Posts
    7,334
    and how do you plan to load the data

  4. #4
    Join Date
    Aug 2002
    Location
    Colorado Springs
    Posts
    5,253
    Is the data ever going to be changed/deleted?
    David Aldridge,
    "The Oracle Sponge"

    Senior Manager, Business Intelligence Development
    XM Satellite Radio
    Washington, DC

    Oracle ACE

  5. #5
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    30 GB is not a big table today.

    Ultimately, the only thing that matters is what percentage of the rows your SQL will retrieve.

    Are you concerned with loading data into another system?

    Tamil

  6. #6
    Join Date
    Sep 2006
    Location
    London
    Posts
    58
    The data needs to be retrieved for reporting purposes. This is our reporting server.

    A script has been written to transfer the data from one server to the other via FTP.

    The data would be loaded through a script. It would remain static over a period of time.

    Yes, I have concerns about loading the data, but the script for populating it is already written. It would populate one of the tables in my database.

    How do I estimate the size of this object, and what actions are required to keep this job running smoothly over a period of time?

    Thanks in advance.

  7. #7
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    Quote Originally Posted by dbajay
    The data needs to be retrieved for reporting purposes. This is our reporting server.

    A script has been written to transfer the data from one server to the other via FTP.

    The data would be loaded through a script. It would remain static over a period of time.

    Yes, I have concerns about loading the data, but the script for populating it is already written. It would populate one of the tables in my database.

    How do I estimate the size of this object, and what actions are required to keep this job running smoothly over a period of time?

    Thanks in advance.
    You need to collect performance statistics over a period of time and analyze the pattern and growth.

    Another thought ......

    Have you considered transportable tablespace option?


    Tamil

  8. #8
    Join Date
    Sep 2002
    Location
    England
    Posts
    7,334
    and what method does this 'script' use

  9. #9
    Join Date
    Aug 2002
    Location
    Colorado Springs
    Posts
    5,253
    If it's going to be read-only then use PCTFREE 0 and compress the data on insert (use direct path load and set the table to COMPRESS). Do not use ASSM. Consider ordering the data prior to insert by common reporting predicates so as to improve the clustering factor on those columns' indexes and reduce physical I/O on retrieval.

    Estimate the size by loading a representative sample and measuring it.
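    Not from the thread, but a sketch of what that advice could look like in practice. The table name, staging table, login and the choice of col8 as the common reporting predicate are all invented; adjust to your own environment:

```shell
# Hypothetical sketch: read-only table with PCTFREE 0 and COMPRESS,
# a direct-path sample load pre-ordered on the assumed reporting
# predicate (col8), then a size measurement to extrapolate from.
sqlplus -s scott/tiger <<'EOF'
CREATE TABLE daily_feed (
  col1  NUMBER(15),    col2  VARCHAR2(255), col3  VARCHAR2(255),
  col4  VARCHAR2(255), col5  VARCHAR2(255), col6  NUMBER(4,2),
  col7  NUMBER(10),    col8  DATE,          col9  NUMBER(8,2),
  col10 NUMBER(8,2),   col11 DATE,          col12 VARCHAR2(3)
) PCTFREE 0 NOLOGGING COMPRESS;

-- Direct-path insert of a representative sample, pre-ordered:
INSERT /*+ APPEND */ INTO daily_feed
SELECT * FROM daily_feed_stage ORDER BY col8;
COMMIT;

-- Measure the sample, then extrapolate to the full 30 GB volume:
SELECT ROUND(bytes / 1024 / 1024) AS mb
FROM   user_segments
WHERE  segment_name = 'DAILY_FEED';
EOF
```

    Note that this kind of table compression is applied only during direct-path operations; conventional-path inserts into the same table would not be compressed.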
    David Aldridge,
    "The Oracle Sponge"

    Senior Manager, Business Intelligence Development
    XM Satellite Radio
    Washington, DC

    Oracle ACE

  10. #10
    Join Date
    Sep 2006
    Location
    London
    Posts
    58
    Thanks for your valuable input. We will use SQL*Loader within the script, so transportable tablespaces would not help much.

    I agree that PCTFREE should be zero. What is the ideal value for PCTUSED? It is static data.

    I believe the table should also be created with the NOLOGGING option to improve performance.

    Partitioning on the date column would also be useful.

    Can you please comment on that?

    Can you please give me a more technical idea of how to achieve your suggestion below?

    Consider ordering the data prior to insert by common reporting predicates so as to improve the clustering factor on those columns' indexes and reduce physical i/o on retrieval.

    Estimate the size by loading a representative sample and measuring it.
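    (A sketch only; the delimiter, field position, control file and login are invented, and col8 is assumed to be the common reporting key.) Since the load goes through SQL*Loader, one simple way to get the physical ordering David describes is to sort the flat file on that key before a direct-path load:

```shell
# Sort the incoming file on the date column (assumed to be field 8
# of a pipe-delimited file) so rows land physically ordered on it,
# then direct-path load the sorted file.
sort -t'|' -k8,8 daily_feed.dat -o daily_feed_sorted.dat
sqlldr scott/tiger control=daily_feed.ctl data=daily_feed_sorted.dat direct=true
```

    On the partitioning question: for an append-only daily feed like this, RANGE partitioning by month on the date column is the usual choice, and it also lets you measure and manage growth one partition at a time.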
