Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Home -> Community -> Usenet -> c.d.o.server -> character string buffer problem
Hi,
I'm running Oracle 8.1.5 on NT. I'm trying to write a procedure that takes some data and writes it out to a text file using UTL_FILE. To complicate matters, the data I want to write out is in a column of type CLOB. I understand that UTL_FILE can only handle lines of up to 32767 characters, so I'm substringing the CLOB column.
Here's the code:
declare
  v_file_handle  UTL_FILE.FILE_TYPE;
  big_field      varchar2(32767);
begin
  select dbms_lob.substr(contents, 32767, 1)
    into big_field
    from document
   where document_id = 4063;

  v_file_handle := UTL_FILE.FOPEN('i:\clusterfiles', 'test.txt', 'w', 32767);
  UTL_FILE.PUT_LINE(v_file_handle, big_field);
  UTL_FILE.FCLOSE(v_file_handle);
end;
This results in an error: "character string buffer too small"
The problem does not appear to be in any of the UTL_FILE calls, but rather in my SELECT or in my big_field variable declaration.
If I change my SELECT as follows:
select dbms_lob.substr(contents,4000,1) into big_field from document where document_id=4063;
everything works.
If I change it as follows:
select dbms_lob.substr(contents,4001,1) into big_field from document where document_id=4063;
it fails.
So it appears that in this case I am bound by the database datatype constraint of VARCHAR2(4000), even though the PL/SQL VARCHAR2 datatype supposedly can handle up to 32767.
If I make the variable a CLOB, it won't work either: the call to UTL_FILE.PUT_LINE complains about wrong types of arguments.
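One approach I have not yet tried (a sketch only, untested on 8.1.5): my understanding is that DBMS_LOB.SUBSTR called from inside a SQL statement is bound by SQL's 4000-character VARCHAR2 limit, whereas the same call made directly from PL/SQL can return up to 32767. So the idea would be to SELECT only the CLOB locator into a local variable, and do the substring in PL/SQL:

declare
  v_file_handle  UTL_FILE.FILE_TYPE;
  v_lob          CLOB;
  big_field      varchar2(32767);
begin
  -- Fetch only the LOB locator; SQL's 4000-character limit should not apply here.
  select contents into v_lob from document where document_id = 4063;

  -- Substring in PL/SQL, where VARCHAR2 can hold up to 32767.
  big_field := dbms_lob.substr(v_lob, 32767, 1);

  v_file_handle := UTL_FILE.FOPEN('i:\clusterfiles', 'test.txt', 'w', 32767);
  UTL_FILE.PUT_LINE(v_file_handle, big_field);
  UTL_FILE.FCLOSE(v_file_handle);
end;

For documents longer than 32767 characters this would presumably need a loop with DBMS_LOB.READ, writing one chunk at a time.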
Does anyone have any ideas on how to work around this?
Thanks,
Gavin
Received on Tue Jul 11 2000 - 00:00:00 CDT