Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


HELP: Oracle JDBC v8.1.5 to server v7.3.X - problems w/ insert of LONG columns

From: <coopman_at_ix.netcom.com>
Date: Mon, 10 May 1999 09:04:28 -0400
Message-ID: <7h6lcn$eke@sjx-ixn6.ix.netcom.com>


Your help is appreciated.

I have an Oracle 7.3.3 server (WinNT) and a Java application that connects to it with
the Oracle Thin JDBC driver v8.1.5.

The application tries to insert a very long string into a LONG column of a particular
table, using PreparedStatement.setAsciiStream(...) to bind the column parameter.
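For reference, here is a minimal sketch of the kind of insert being attempted. The table name (long_test), column names, and the makeLongText() helper are my own illustrations, not from the original post; the point is that the payload is streamed with setAsciiStream() rather than bound as a string literal:

```java
import java.io.ByteArrayInputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class LongColumnInsert {

    // Insert a long ASCII string into a LONG column via a streamed bind.
    // (Hypothetical table long_test with columns id, long_col.)
    static void insertLong(Connection conn, int id, String text) throws Exception {
        byte[] ascii = text.getBytes("US-ASCII");
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO long_test (id, long_col) VALUES (?, ?)");
        try {
            ps.setInt(1, id);
            // Stream the bytes so the driver should bind them as LONG data,
            // not as a (2000-byte-limited) character literal.
            ps.setAsciiStream(2, new ByteArrayInputStream(ascii), ascii.length);
            ps.executeUpdate();
        } finally {
            ps.close();
        }
    }

    // Build a test payload well past the 2000-byte limit described below.
    static String makeLongText(int length) {
        StringBuilder sb = new StringBuilder(length);
        while (sb.length() < length) {
            sb.append("0123456789");
        }
        return sb.substring(0, length);
    }
}
```

With a payload from makeLongText(5000), this is the pattern that succeeds under the 7.3.4 driver but fails or truncates under the newer drivers as described below.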

With the 8.1.5 driver, if the string is > 2000 bytes I get (approximate message): SQLException: Data too large for data type.

With the 8.0.4.0.5 JDBC thin driver, no exception is thrown, but the data appears to have been truncated when I SELECT it to check the insert.

With the 7.3.4 JDBC thin driver, the INSERT and SELECT appear to work fine.

Additionally, I have tried to work around this (with the 8.1.5 driver) by using:

(1) setBinaryStream() - results in SQLException: can only insert a LONG into a LONG column, so the driver seems confused about the column's data type.

(2) setUnicodeStream() - works, but I have to do additional work to expand the 1-byte codes to 2-byte fixed codes in a byte array.
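The expansion step in workaround (2) can be sketched as follows. This is my own illustration of what "expand the 1-byte codes to 2-byte fixed codes" presumably means: each single-byte ASCII code becomes a 2-byte big-endian UTF-16 unit with a zero high byte, which is the stream format setUnicodeStream() expects; the method name expandToUtf16 is hypothetical:

```java
public class UnicodeExpand {

    // Expand single-byte ASCII codes into 2-byte big-endian units,
    // the fixed-width format expected by setUnicodeStream().
    static byte[] expandToUtf16(byte[] ascii) {
        byte[] out = new byte[ascii.length * 2];
        for (int i = 0; i < ascii.length; i++) {
            out[2 * i] = 0;            // high byte: always 0 for ASCII
            out[2 * i + 1] = ascii[i]; // low byte: the original code
        }
        return out;
    }
}
```

The expanded array would then be bound with something like ps.setUnicodeStream(2, new ByteArrayInputStream(expanded), expanded.length) - doubling the data just to dodge the ASCII path, which is the "additional work" objected to above.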

This seems like a bug in the driver's notion of the table column's data type: the 2000-byte limit is specific to CHAR or VARCHAR types, not LONG.

Any ideas?

Marc

Received on Mon May 10 1999 - 08:04:28 CDT

