-
Hi,
We have a table here that has about 800 columns and 4 million records.
What is the best way to set it up in terms of extent sizes, storage parameters, etc. for this table?
The average row size is approximately 700 bytes.
Thanks
Ronnie
ronnie_yours@yahoo.com
You can if you think you can.
-
I would need additional information before sticking my foot in my mouth.
How is the table used? Mostly reports doing full table scans, or transactional lookups going after one or two rows?
How frequently is it updated? And how (row-by-row changes, or just inserts/appends)?
Of the 800 columns, are there any clusters of usage (such as 15 reports that use the first 20 columns, 30 reports that use the last 30, and 2 that use them all)? This number of columns automatically raises a vertical partitioning flag in my mind.
Are any of the fields BLOBs/CLOBs/RAW?
Yada Yada Yada.
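The vertical split hinted at above could look roughly like this (just a sketch; all table and column names here are made up, and the "hot"/"cold" grouping would come from the actual report usage):

```sql
-- Narrow "hot" table: the ~20 columns the frequent reports touch.
CREATE TABLE big_hot (
  id      NUMBER PRIMARY KEY,
  col_001 VARCHAR2(40),
  col_002 NUMBER
  -- ... the rest of the frequently used columns
);

-- Wide "cold" table: the rarely used columns, sharing the same key.
CREATE TABLE big_cold (
  id      NUMBER PRIMARY KEY REFERENCES big_hot (id),
  col_021 VARCHAR2(40)
  -- ... the remaining, rarely used columns
);

-- The few reports that need everything join on the shared key:
SELECT h.col_001, c.col_021
FROM   big_hot h, big_cold c
WHERE  h.id = c.id;
```

The narrow table packs many more rows per block, so the full scans those frequent reports do touch far fewer blocks.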
Joseph R.P. Maloney, CSP, CDP, CCP
'The answer is 42'
-
Originally posted by jrpm
Answer is 42
What's that?
Santosh Jadhav
8i OCP DBA
-
In my opinion (and it's purely intuition), the table should be split into several smaller tables. There will always be performance issues with a table of 800 columns. Unless it is absolutely necessary, I would go with several smaller tables joined through a common key. You could even go with clustering if two or more of the tables are routinely joined.
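For the clustering idea, an Oracle cluster stores rows from the joined tables with the same key value in the same blocks, so the routine join reads fewer blocks. A sketch, with hypothetical table names and a guessed SIZE:

```sql
-- Cluster keyed on the common join column (names are illustrative).
CREATE CLUSTER order_cluster (order_id NUMBER)
  SIZE 1024;  -- estimated bytes per cluster key's rows; tune for your data

-- An indexed cluster requires a cluster index before inserts.
CREATE INDEX order_cluster_idx ON CLUSTER order_cluster;

CREATE TABLE orders (
  order_id   NUMBER PRIMARY KEY,
  order_date DATE
) CLUSTER order_cluster (order_id);

CREATE TABLE order_details (
  order_id NUMBER,
  item     VARCHAR2(40)
) CLUSTER order_cluster (order_id);
```

The trade-off: clusters help the clustered join but can hurt full scans and inserts, so they only pay off when that join really is the dominant access path.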
-
700 bytes for data in 800 columns. Are you sure??
-
Originally posted by Raminder
700 bytes for data in 800 columns. Are you sure??
Obviously the majority of the trailing columns are empty (contain NULLs). Oracle stores trailing NULL columns with no data at all, so an 800-column table can easily average only 700 bytes per row.
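You can check this yourself from the data dictionary after gathering statistics (table name here is a placeholder):

```sql
-- Compute statistics, then compare the average row length with the
-- column count; a small AVG_ROW_LEN confirms most columns are NULL.
ANALYZE TABLE big_table COMPUTE STATISTICS;

SELECT avg_row_len, num_rows
FROM   user_tables
WHERE  table_name = 'BIG_TABLE';
```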
Jurij Modic
ASCII a stupid question, get a stupid ANSI
24 hours in a day .... 24 beer in a case .... coincidence?