I have extracted data from a large table (a 4 GB table) using an Oracle client on NT, with the Oracle server on NT, AIX, and Solaris, without any problem. The scripts run in SQL*Plus and are plain SELECT statements spooling into a pipe-delimited text file. The extraction works fine as long as the Oracle client is on NT.
But if I use the Oracle client on UNIX (AIX or Solaris), the extraction cuts off at 2 GB without any error message.
Is there a way to get around this problem?
Thanks in advance,
There is a 2 GB limit on files in your UNIX environment. If large file support is available for your platform, you would need the SA to modify your file system to handle files larger than 2 GB.
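One workaround that is sometimes used when the filesystem cannot be changed is to spool through a named pipe and let `split` break the stream into chunks below the 2 GB limit. This is only a sketch: the pipe path and chunk prefix are made up, and the `printf` at the end is a stand-in for the real producer (in practice your SQL*Plus script would SPOOL to the pipe path instead).

```shell
# Sketch: route the spool output through a named pipe so that `split`
# writes a series of files, each safely under the 2 GB limit.
mkfifo spool_pipe

# Reader: cut the stream into 1900 MB chunks named extract_part_aa, _ab, ...
split -b 1900m spool_pipe extract_part_ &

# Producer stand-in: in practice this would be
#   sqlplus -s user/pass@db @extract.sql
# with "SPOOL spool_pipe" inside the script. Here we just write two rows.
printf 'row1|a\nrow2|b\n' > spool_pipe

wait            # let split drain the pipe and close its last chunk
rm spool_pipe   # the FIFO itself is no longer needed
```

The two ends of the FIFO block until both are open, so the order of the reader and writer does not matter; `split` sees end-of-file when the producer closes the pipe.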
I made sure that the system is large file enabled, i.e. the filesystem to which I am writing the spool file supports large files.
The Oracle software is installed on a mount point /u01 where large file support is not enabled, but this should not cause a problem, as none of the Oracle files is greater than 2 GB. Again, I made sure that the filesystem to which I am spooling is large file enabled.
Could there be any other reason?
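One more thing worth checking, since truncation at exactly 2 GB can also come from a per-process limit rather than the filesystem: the shell's file size limit and the filesystem's maximum file offset width. A quick diagnostic sketch (the path is illustrative; substitute your spool directory):

```shell
# A finite value here (in 512-byte blocks) can silently cap spool files
# even on a largefile-enabled filesystem; "unlimited" is what you want.
ulimit -f

# Confirm the target filesystem supports >2 GB offsets: 64 means large
# files are possible, 32 means a hard 2 GB ceiling on that mount.
getconf FILESIZEBITS /tmp
```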
I can spool a 4 GB table to a flat file if I use the NT Oracle client (with the Oracle server residing on AIX or Solaris).
Thanks in advance
Just spool half to one file and half to another.
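A sketch of that two-spool approach as a SQL*Plus script, written out to a file here for illustration. The table BIG_TAB, its columns, its numeric key ID, and the connect string are all assumptions; splitting on MOD(id, 2) is just one way to divide the rows roughly in half.

```shell
# Write a SQL*Plus script that spools each half of the data to its own
# file, so neither spool file reaches the 2 GB limit.
cat > extract_halves.sql <<'EOF'
SET HEADING OFF PAGESIZE 0 FEEDBACK OFF TRIMSPOOL ON LINESIZE 4000
SPOOL half1.txt
SELECT col1 || '|' || col2 FROM big_tab WHERE MOD(id, 2) = 0;
SPOOL OFF
SPOOL half2.txt
SELECT col1 || '|' || col2 FROM big_tab WHERE MOD(id, 2) = 1;
SPOOL OFF
EXIT
EOF

# Then run it against your database, e.g.:
#   sqlplus -s user/pass@db @extract_halves.sql
```

If the table has no convenient numeric key, any predicate that partitions the rows (a date range, a ROWNUM-bounded subquery per half, etc.) works the same way.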