Special character handling in PLSQL (Oracle 9i) [message #286039] Thu, 06 December 2007 06:16
Messages: 1
Registered: December 2007
Location: India
Junior Member
There is a .CSV file stored as a BLOB in a table column. This BLOB needs to be read and converted into a CLOB. The file contains special characters from a European language (Latin characters). These characters get corrupted during the call to UTL_RAW.CAST_TO_VARCHAR2. Does anyone know how to handle this?
I have checked NLS_CHARACTERSET and it is correctly set to UTF8.
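Roughly what the conversion looks like (table and column names here are simplified placeholders; the real code walks through the BLOB in chunks):

-- Simplified version of the failing conversion: the raw bytes are cast
-- straight to VARCHAR2, and the Latin special characters come out wrong.
DECLARE
  l_blob  BLOB;
  l_raw   RAW(32767);
  l_amt   BINARY_INTEGER := 8000;   -- bytes to read
  l_text  VARCHAR2(32767);
BEGIN
  SELECT csv_file INTO l_blob FROM csv_docs WHERE doc_id = 1;
  DBMS_LOB.READ(l_blob, l_amt, 1, l_raw);
  l_text := UTL_RAW.CAST_TO_VARCHAR2(l_raw);  -- <-- characters get mangled here
END;
/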
Re: Special character handling in PLSQL [message #286184 is a reply to message #286039] Thu, 06 December 2007 15:17
Messages: 109
Registered: November 2005
Location: Idaho
Senior Member
How are you loading the data from the CSV file... SQL*Loader? Just for clarification: are you loading the binary data into a BLOB column and then copying/converting the BLOB data into a second CLOB column, or loading the data directly into a CLOB column?
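If it is SQL*Loader, the control file is where the file's character set can be declared. A rough sketch, assuming the CSV files are listed in a driver file and the target column is a CLOB (table, column and file names are only placeholders):

-- Hypothetical control file: loads each CSV file straight into a CLOB column
-- and tells SQL*Loader which character set the files were written in.
LOAD DATA
INFILE 'file_list.dat'
INTO TABLE csv_docs
FIELDS TERMINATED BY ','
( doc_id   CHAR(10),
  fname    FILLER CHAR(255),
  csv_doc  LOBFILE(fname) CHARACTERSET WE8ISO8859P1 TERMINATED BY EOF
)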
Re: Special character handling in PLSQL [message #286493 is a reply to message #286184] Fri, 07 December 2007 10:40
andrew again
Messages: 2577
Registered: March 2000
Senior Member
If you are correctly loading into a BLOB, no character set conversion takes place and the bytes in the BLOB will exactly match your source. Unless the source you are loading into the BLOB is already encoded in UTF8, the database character set is irrelevant. You probably need to specify the character set of the original data when converting to VARCHAR2, along the lines of the sketch below.
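A minimal sketch, assuming the CSV was written in WE8ISO8859P1 (substitute whatever character set your file really uses; the table and column names are made up):

DECLARE
  l_blob   BLOB;
  l_clob   CLOB;
  l_raw    RAW(32767);
  l_chunk  VARCHAR2(32767);
  l_amount BINARY_INTEGER;
  l_offset INTEGER := 1;
BEGIN
  SELECT csv_file INTO l_blob FROM csv_docs WHERE doc_id = 1;  -- placeholder query
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  LOOP
    l_amount := 8000;                      -- bytes to read on this pass
    BEGIN
      DBMS_LOB.READ(l_blob, l_amount, l_offset, l_raw);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;        -- past the end of the BLOB
    END;
    -- Re-encode the raw bytes from the file's character set to the database
    -- character set BEFORE casting them to VARCHAR2.
    l_chunk := UTL_RAW.CAST_TO_VARCHAR2(
                 UTL_RAW.CONVERT(l_raw,
                                 'AMERICAN_AMERICA.UTF8',            -- to: database
                                 'AMERICAN_AMERICA.WE8ISO8859P1'));  -- from: source file
    DBMS_LOB.WRITEAPPEND(l_clob, LENGTH(l_chunk), l_chunk);
    l_offset := l_offset + l_amount;
  END LOOP;
  -- l_clob now holds the text in the database character set
END;
/

This assumes a single-byte source character set, so reading the BLOB in fixed-size byte chunks cannot split a character in half; for a multi-byte source you would have to be careful about chunk boundaries.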
