DBAsupport.com Forums

Thread: ORA-01114, but datafile size is 2 GB :(

  1. #1
    Join Date
    Feb 2006
    Posts
    9

    ORA-01114, but datafile size is 2 GB :(

    Hi folks,

    I created a table named FACT_OBJ. The table is stored in a tablespace named TS_FACT_OBJ. The tablespace consists of eight datafiles, TS_FACT_OBJ_01 to TS_FACT_OBJ_08. The size of each datafile is 2097176 KB (just over 2 GB).
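
    For reference, the tablespace was created with a statement along these lines (the file names and paths below are placeholders, not the actual ones):

    Code:
    create tablespace TS_FACT_OBJ
    datafile 'D:\oradata\dw\ts_fact_obj_01.dbf' size 2097176K,
             'D:\oradata\dw\ts_fact_obj_02.dbf' size 2097176K,
             -- ts_fact_obj_03.dbf through ts_fact_obj_07.dbf follow the same pattern
             'D:\oradata\dw\ts_fact_obj_08.dbf' size 2097176K;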

    I then populated the table FACT_OBJ using INSERT INTO ... (SELECT ...). The source tables, however, reside on another database, so I created a database link named DATACENT (note the FROM clause). The population script is as follows:

    Code:
    insert into FACT_OBJ
    (
      select
      DAT_OBJ_PJK.CODE_OP_TYPE as OPTYPE_STD_OPTYPE,
      DAT_OBJ_PJK.CODE_PROVINCE ||
      DAT_OBJ_PJK.CODE_CITY ||
      DAT_OBJ_PJK.CODE_DISTRICT ||
      DAT_OBJ_PJK.CODE_SUBDISTRICT as AREA_STD_SUBDISTRICT_CODE,
      DAT_OBJ_PJK.MAP_STATUS_OP as MAPSTATOP_STD_MAPSTAT_CODE,
      DAT_OBJ_PJK.OP_TRANSACT_CODE as TRNTYPE_STD_TRNTYPE_CODECODE,
      DAT_OBJ_BM.LAND_TYPE as JPT_STD_JPT_CODE,
      SPPT.CODE_LAND_CLASS as LANDCLS_STD_LNDCLS_CODE,
      SPPT.CODE_BUILD_CLASS as BUILDCLS_STD_BLDCLS_CODE,
      DAT_OBJ_PJK.OP_ENTRY_DATE as TIME_STD_MON_CODE,
      1 as JML_OP,
      DAT_OBJ_PJK.LAND_SQR_TOT as LAND_SQR_OP,
      DAT_OBJ_PJK.BUILD_SQR_TOT as BUILD_SQR_OP,
      DAT_OBJ_PJK.LAND_VAL as LAND_VAL_OP,
      DAT_OBJ_PJK.BUILD_VAL,
      DAT_OBJ_ANGG.LAND_SQR_WGT as LAND_SQR_WGT_OP,
      DAT_OBJ_ANGG.BUILD_SQR_WGT as BUILD_SQR_WGT,
      DAT_OBJ_PJK.OP_STREET as OP_STREET,
      DAT_OBJ_PJK.OP_BLOCK as OP_BLOCK,
      DAT_OBJ_PJK.OP_RW as OP_RW,
      DAT_OBJ_PJK.OP_RT as OP_RT,
      REF_ADM_OFF_DISTRICT.CODE_DISTRICT_OFF||REF_ADM_OFF_DISTRICT.CODE_BRANCH_OFF as CODE_BRANCH_OFF,
      DAT_SBJ_PJK.ID_SBJ_PJK as ID_SBJ_PJK
    
      from
      DAT_OBJ_PJK@DATACENT,
      DAT_OBJ_BM@DATACENT,
      DAT_OBJ_ANGG@DATACENT,
      SPPT@DATACENT,
      REF_ADM_OFF_DISTRICT@DATACENT,
      DAT_SBJ_PJK@DATACENT
    
      where
      DAT_OBJ_PJK.CODE_PROVINCE=DAT_OBJ_BM.CODE_PROVINCE(+) AND
      DAT_OBJ_PJK.CODE_CITY=DAT_OBJ_BM.CODE_CITY(+) AND
      DAT_OBJ_PJK.CODE_DISTRICT=DAT_OBJ_BM.CODE_DISTRICT(+) AND
      DAT_OBJ_PJK.CODE_SUBDISTRICT=DAT_OBJ_BM.CODE_SUBDISTRICT(+) AND
      DAT_OBJ_PJK.CODE_BLOCK=DAT_OBJ_BM.CODE_BLOCK(+) AND
      DAT_OBJ_PJK.SEQ_NUM=DAT_OBJ_BM.SEQ_NUM(+) AND
      DAT_OBJ_PJK.CODE_OP_TYPE=DAT_OBJ_BM.CODE_OP_TYPE(+) AND
      DAT_OBJ_PJK.CODE_PROVINCE=DAT_OBJ_ANGG.CODE_PROVINCE(+) AND
      DAT_OBJ_PJK.CODE_CITY=DAT_OBJ_ANGG.CODE_CITY(+) AND
      DAT_OBJ_PJK.CODE_DISTRICT=DAT_OBJ_ANGG.CODE_DISTRICT(+) AND
      DAT_OBJ_PJK.CODE_SUBDISTRICT=DAT_OBJ_ANGG.CODE_SUBDISTRICT(+) AND
      DAT_OBJ_PJK.CODE_BLOCK=DAT_OBJ_ANGG.CODE_BLOCK(+) AND
      DAT_OBJ_PJK.SEQ_NUM=DAT_OBJ_ANGG.SEQ_NUM(+) AND
      DAT_OBJ_PJK.CODE_OP_TYPE=DAT_OBJ_ANGG.CODE_OP_TYPE(+) AND
      DAT_OBJ_PJK.CODE_PROVINCE=SPPT.CODE_PROVINCE(+) AND
      DAT_OBJ_PJK.CODE_CITY=SPPT.CODE_CITY(+) AND
      DAT_OBJ_PJK.CODE_DISTRICT=SPPT.CODE_DISTRICT(+) AND
      DAT_OBJ_PJK.CODE_SUBDISTRICT=SPPT.CODE_SUBDISTRICT(+) AND
      DAT_OBJ_PJK.CODE_BLOCK=SPPT.CODE_BLOCK(+) AND
      DAT_OBJ_PJK.SEQ_NUM=SPPT.SEQ_NUM(+) AND
      DAT_OBJ_PJK.CODE_OP_TYPE=SPPT.CODE_OP_TYPE(+) AND
      DAT_OBJ_PJK.CODE_PROVINCE=REF_ADM_OFF_DISTRICT.CODE_PROVINCE AND
      DAT_OBJ_PJK.CODE_CITY=REF_ADM_OFF_DISTRICT.CODE_CITY AND
      DAT_OBJ_PJK.CODE_DISTRICT=REF_ADM_OFF_DISTRICT.CODE_DISTRICT AND
      DAT_OBJ_PJK.ID_SBJ_PJK=DAT_SBJ_PJK.ID_SBJ_PJK AND
      DAT_OBJ_PJK.CODE_PROVINCE='30' AND
      TO_CHAR(DAT_OBJ_PJK.OP_ENTRY_DATE,'YYYY')='2004'
    );
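
    For completeness, the DATACENT link referenced in the FROM clause was created with a statement of roughly this form (the user, password and TNS alias below are placeholders):

    Code:
    create database link DATACENT
      connect to remote_user identified by remote_password
      using 'DATACENT_TNS';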

    But when running the script, I got an ORA-01114 error message:

    ERROR at line 60:
    ORA-01114: IO error writing block to file 201 (block # 524140)
    ORA-27069: skgfdisp: attempt to do I/O beyond the range of the file
    OSD-04026: Invalid parameter passed. (OS 524142)
    ORA-01114: IO error writing block to file (block # )

    Well, I was totally baffled. As far as I know, ORA-01114 and ORA-27069 usually happen when you have a very large datafile (4 GB or above). None of my datafiles exceeds 4 GB. In fact, the datafiles in the tablespace of the populated table (FACT_OBJ) are merely 2097176 KB each (just over 2 GB).

    What's wrong? Why did the ORA-01114 happen? I'm using Oracle 9.2.0.2 64-bit, running on Windows 2003 Server on an Intel Itanium 64-bit processor.

    Oh, and here's line 60. I'm not sure whether it actually relates to the error, though; it's merely a join clause:
    Code:
     60  DAT_OBJ_PJK.ID_SBJ_PJK=DAT_SBJ_PJK.ID_SBJ_PJK AND
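
    For reference, the sizes Oracle reports for those files can be cross-checked against the data dictionary with a query like this:

    Code:
    select file_id, file_name, bytes/1024/1024 as size_mb
    from   dba_data_files
    where  tablespace_name = 'TS_FACT_OBJ'
    order  by file_id;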

  2. #2
    Join Date
    Aug 2002
    Location
    Colorado Springs
    Posts
    5,253
    I don't really have an answer about the error you're getting, but I do notice that you're not using direct path inserts and therefore you're not using data segment compression ... you might want to consider that.
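
    Something along these lines, as a sketch (it assumes FACT_OBJ can be rebuilt while it is still empty, and the SELECT is the one from the first post):

    Code:
    -- rebuild the (still empty) table with segment compression enabled
    alter table FACT_OBJ move compress;

    -- direct-path load: the APPEND hint inserts above the high-water mark
    -- and is what lets the segment compression take effect during the load
    insert /*+ APPEND */ into FACT_OBJ
    (
      select ...   -- the same SELECT list, remote tables and joins as in the first post
    );

    commit;
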
    David Aldridge,
    "The Oracle Sponge"

    Senior Manager, Business Intelligence Development
    XM Satellite Radio
    Washington, DC

    Oracle ACE

  3. #3
    Join Date
    May 2000
    Location
    ATLANTA, GA, USA
    Posts
    3,135
    You have sized the data files at more than 2 GB. I suspect Oracle may not be able to read beyond a 2 GB file size unless the system is configured with the large file option.
    See my calculation below:

    SQL> select 2*1024*1024*1024 from dual;

    2*1024*1024*1024
    ----------------
    2147483648 ----> 2 GB

    SQL> select 2097176*1024 from dual;

    2097176*1024
    ------------
      2147508224 ----> just over 2 GB

    My suggestion is to resize the data files to 2048 MB or an even smaller value.
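
    For example (the file path below is a placeholder, and note that a datafile cannot be shrunk below the space it already uses):

    Code:
    alter database datafile 'D:\oradata\dw\ts_fact_obj_01.dbf' resize 2000M;
    -- repeat for ts_fact_obj_02.dbf through ts_fact_obj_08.dbf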

    Tamil

  4. #4
    Join Date
    Feb 2006
    Posts
    9
    Quote Originally Posted by tamilselvan
    You have sized the data files at more than 2 GB. I suspect Oracle may not be able to read beyond a 2 GB file size unless the system is configured with the large file option.
    Thanks! I did indeed size each datafile at 2097176 KB (just over 2 GB). I never thought it would cause the ORA-01114 error; I thought the limit was 4 GB, not 2 GB.

    By the way, this problem does not happen on 32-bit Windows. I used Oracle 9.2.0.1 on Windows 2000 with a 32-bit Xeon processor, and I had no trouble with 4 GB datafiles even though I did not configure the system with a large file option.

    Does this problem only happen on 64-bit Windows?

  5. #5
    Join Date
    Feb 2006
    Posts
    9
    Well, I have reduced the datafiles to 768 MB, but the problem persists. What's wrong? :(
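
    One thing I still need to check is which file the error actually points to. File 201 seems higher than my DB_FILES setting, and from what I've read, a file number above DB_FILES in ORA-01114 may refer to a temp file rather than a permanent datafile (just a guess on my part). Something like:

    Code:
    -- is file 201 one of the permanent datafiles at all?
    select file#, name, bytes/1024/1024 as size_mb
    from   v$datafile
    where  file# = 201;

    -- if not, check the temp files and the DB_FILES setting;
    -- a temp file may be reported as DB_FILES + its own file#
    select file#, name, bytes/1024/1024 as size_mb
    from   v$tempfile;

    show parameter db_files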

  6. #6
    Join Date
    Mar 2002
    Posts
    534
    Hi,

    I just had a look at Metalink. It seems that you should upgrade from 9.2.0.2 to at least 9.2.0.3 to have a certified OS/Oracle combination.


    Certify - Additional Info: Oracle Database - Enterprise Edition Version 9.2 (64-bit) on Microsoft Windows 2003 (Itanium)
    Operating System: Microsoft Windows 2003 (Itanium), Version 2003
    Oracle Database - Enterprise Edition Version 9.2 (64-bit)
    Status: Certified

    Product Version Note:
    None available for this product.

    Certification Note:
    Patch Set 9.2.0.3.0 or higher is required.
    Existing patch sets:
    9.2.0.4.0
    9.2.0.5.0
    9.2.0.6.0
    9.2.0.7.0
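
    After applying a patch set, the installed version (and the current level before you upgrade) can be confirmed with:

    Code:
    select banner from v$version;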
