Hi guys,

Can you please provide your comments, suggestions, and recommendations on the following issue?

We are expecting about 1 GB of data per day to arrive in one of our database tables, transferred via FTP from another server. The table would gradually grow to around 30 GB over a period of a few months.

The data would be transferred by a UNIX script.
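For context, something like the sketch below is what we have in mind. The host name, credentials, paths, and control-file name are placeholders rather than our actual setup:

    #!/bin/sh
    # Pull the daily extract from the source server (placeholder host/paths)
    ftp -n source-host <<EOF
    user ftpuser ftppass
    binary
    get /export/daily_extract.dat /u01/stage/daily_extract.dat
    bye
    EOF

    # Load the file with SQL*Loader; direct path bypasses the buffer cache
    # and generates far less undo than conventional inserts
    sqlldr userid=appuser/apppass \
           control=/u01/stage/daily_load.ctl \
           data=/u01/stage/daily_extract.dat \
           log=/u01/stage/daily_load.log \
           direct=true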

The table structure is included below for your reference:

Column Name   Column Type
-----------   -------------
col1          Number(15)
col2          Varchar2(255)
col3          Varchar2(255)
col4          Varchar2(255)
col5          Varchar2(255)
col6          Number(4,2)
col7          Number(10)
col8          Date
col9          Number(8,2)
col10         Number(8,2)
col11         Date
col12         Varchar2(3)

What storage parameters would you recommend, at the table or tablespace level, to accommodate this rate of growth?
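To make the question concrete, I was thinking along the lines of a locally managed tablespace with large uniform extents and an autoextending datafile; a rough sketch follows (the tablespace/table names, file paths, and sizes are just placeholders):

    #!/bin/sh
    # Create a dedicated tablespace sized for ~1 GB/day growth, then the table
    sqlplus -s "/ as sysdba" <<EOF
    CREATE TABLESPACE daily_data
      DATAFILE '/u02/oradata/ORCL/daily_data01.dbf' SIZE 2000M
      AUTOEXTEND ON NEXT 500M MAXSIZE 16000M
      EXTENT MANAGEMENT LOCAL UNIFORM SIZE 100M;

    CREATE TABLE feed_table (
      col1   NUMBER(15),
      col2   VARCHAR2(255),
      col3   VARCHAR2(255),
      col4   VARCHAR2(255),
      col5   VARCHAR2(255),
      col6   NUMBER(4,2),
      col7   NUMBER(10),
      col8   DATE,
      col9   NUMBER(8,2),
      col10  NUMBER(8,2),
      col11  DATE,
      col12  VARCHAR2(3)
    ) TABLESPACE daily_data;
    EXIT;
    EOF

If I understand correctly, uniform local extents make the old INITIAL/NEXT/PCTINCREASE settings largely irrelevant, so the main choices left seem to be the extent size and datafile sizing. Please correct me if that reasoning is wrong.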

How can I assess the impact on database performance in advance, i.e. CPU, memory, I/O, etc. (perhaps using iostat, vmstat, mpstat, or sar)?
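What I have so far on the OS side is just a script to capture a baseline during the load window so that runs can be compared over time; a minimal sketch, with arbitrary sampling intervals and output paths:

    #!/bin/sh
    # Snapshot CPU, memory, and disk activity: 60-second samples for one hour
    SNAPDIR=/u01/perf/`date +%Y%m%d`
    mkdir -p $SNAPDIR

    vmstat 60 60    > $SNAPDIR/vmstat.log  &   # memory, paging, run queue
    iostat -x 60 60 > $SNAPDIR/iostat.log  &   # per-device I/O and service times
    mpstat 60 60    > $SNAPDIR/mpstat.log  &   # per-CPU utilisation
    sar -u 60 60    > $SNAPDIR/sar_cpu.log &   # overall CPU usage
    wait

Is this a reasonable approach, or is there a better way to predict the impact before the feed goes live?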

Put another way, what would be the impact on CPU, I/O, and memory?

What would be the impact on the overall database?

What precautionary actions are required to keep the database running efficiently?

What else should be taken into consideration before downloading or transferring data from another server?