Re: Questions about SQL*Developer CLI for exporting data

From: Tim Gorman <>
Date: Thu, 10 Dec 2020 19:27:42 -0800
Message-ID: <>


I'm guessing that it may not work for wide tables with lots of columns, and I'm curious: how do you handle the RAW, LONG RAW, and BLOB data types?
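(For what it's worth, one common way to keep binary columns CSV-safe is to hex-encode them before writing. A minimal Python sketch of that idea; the helper name and sample values are hypothetical, not from either of our scripts:)

```python
import binascii
import csv
import io

def csv_safe(value):
    """Hex-encode bytes (e.g. RAW/BLOB values fetched as Python bytes)
    so the CSV file stays pure text; pass everything else through."""
    if isinstance(value, (bytes, bytearray)):
        return binascii.hexlify(value).decode("ascii").upper()
    return value

# demo row: a number, a string, and a binary value
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow([csv_safe(v) for v in (1, "name", b"\xde\xad\xbe\xef")])
print(buf.getvalue().strip())  # 1,name,DEADBEEF
```

LONG RAW is the awkward one, since many drivers restrict how it can be fetched; the hex-encoding step itself is the same once you have the bytes in hand.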

I also enjoy scripting. I posted a similar script HERE <> for my PL/SQL package DUMP_DATA, which defines a pipelined table function named CSV; there is a SQL*Plus script intended for testing it HERE <>, a shell script (also using SQL*Plus) HERE <>, and a Perl script for serious use that calls it HERE <>. I don't have a Python version yet.

DUMP_DATA works really well and has several nice bells and whistles, like configurable array processing, but it has a fatal flaw with wide tables.  I wrote DUMP_DATA using native PL/SQL dynamic SQL (method 1), which is limited to a 4000-byte string for the entire SQL statement; exceeding the limit raises ORA-01489 (i.e. "result of string concatenation is too long").  When I realized I'd have to rewrite DUMP_DATA to use the DBMS_SQL package with dynamic SQL method 4 (which I know very well from my Pro*C days), I switched over to recommending SQL Developer to the customer.  This way, the customer can go chasing after Jeff's team at Oracle for support, not me.  :)
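(The arithmetic behind that flaw is easy to demonstrate: with method 1 the whole concatenation expression is one string, so a few hundred columns blow past the ceiling even with short names. A back-of-the-envelope check, hypothetical column names, Python just doing the counting:)

```python
# Rough size of a method-1 style concatenation expression,
#   col_001||','||col_002||','|| ... , for a 300-column table.
n_cols = 300
cols = [f"col_{i:03d}" for i in range(1, n_cols + 1)]
expr = "||','||".join(cols)            # the single string method 1 must build
stmt = f"SELECT {expr} FROM wide_table"
print(len(stmt))                       # already past the 4000-byte ceiling
```

Real column names run longer than seven characters, and any per-column TO_CHAR wrapping only makes it worse, so the limit arrives much sooner in practice.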

Thanks so much for the Python code!


On 12/10/2020 6:12 PM, Mladen Gogala wrote:
> Hi Tim,
> I like scripting. I have the script that dumps Oracle tables or
> queries to CSV:
> It even allows you to specify the number of rows for the array fetch.
> It's not as fancy as SQL*Developer, but does the job.
> Regards
> On 12/10/20 10:00 AM, Tim Gorman wrote:
>> That was what I needed, thanks so much!  I had read the article, but
>> hadn't realized what the screenshot was showing; I'm still one of
>> those who looks for words; gotta adapt.

Received on Fri Dec 11 2020 - 04:27:42 CET
