Re: Extracting 2.5 million rows

From: Helio Dias <heliovitorio_at_gmail.com>
Date: Thu, 7 Jan 2010 04:19:48 -0800 (PST)
Message-ID: <ba589001-3a83-403a-8b98-0f06cbcc046f_at_o28g2000yqh.googlegroups.com>



On Jan 6, 10:46 pm, Cliff wrote:
> Hey all,
>
> I was looking for some guidance.  I need to do some stat analysis on a
> production audit table.  One problem is that there are no indexes on
> the audit table and spooling via sqlplus would take years.
>
> Using Export on the server, the disk space required for the dmp file is
> about 40 G ( per the DBA ).  This space is not available on the UNIX
> box.
>
> Would running EXP locally be a valid option?
>
> I realize that any solution is going to take a while of processing
> time; it's just that I don't want to get 100-hours in and then find a
> fatal flaw.
>
> Thoughts?
>
> TIA,
> Cliff

Hi Cliff,
You also have the option on Unix of exporting directly to a named pipe, e.g.:

    mknod exp.pipe p
    gzip < exp.pipe > compactdmp.exp.gz &

Then you just point EXP at the pipe, e.g.:

    exp file=exp.pipe
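A minimal sketch of the pipe trick, runnable without Oracle -- mkfifo is the portable equivalent of "mknod ... p", and a plain printf stands in for the real exp producer:

```shell
#!/bin/sh
# Named-pipe export sketch: the producer writes to the pipe while
# gzip compresses on the fly, so the uncompressed dump never hits
# disk. In real life the producer would be: exp userid=... file=exp.pipe
set -e
dir=$(mktemp -d)
mkfifo "$dir/exp.pipe"                        # same effect as: mknod exp.pipe p

# Start the compressor in the background, reading from the pipe
gzip < "$dir/exp.pipe" > "$dir/compactdmp.exp.gz" &

# Stand-in producer (placeholder for exp writing its dump stream)
printf 'row1\nrow2\nrow3\n' > "$dir/exp.pipe"

wait                                          # let gzip drain the pipe and exit
gunzip -c "$dir/compactdmp.exp.gz"
```

The writer blocks until gzip opens the read end of the pipe, so start the gzip side first, exactly as in the commands above.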

Another good solution is to import without doing an export at all, using Data Pump over a network link.

I believe the best solution will be an import through a network link, using the WHERE clause as Jonathan suggested.
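If Data Pump is available (10g and up), the network-link import with a restricting predicate might look like the sketch below. The db link name, schema, table, column and date are all placeholders for your environment; a parameter file sidesteps the shell-quoting headaches of QUERY:

```shell
# Sketch only -- source_db_link, audit_owner.audit_table and the
# predicate are hypothetical names. NETWORK_LINK pulls the rows
# straight from the source database over the db link, so no dump
# file is written on either server.
cat > impdp_audit.par <<'EOF'
NETWORK_LINK=source_db_link
TABLES=audit_owner.audit_table
QUERY=audit_owner.audit_table:"WHERE audit_date >= DATE '2009-01-01'"
EOF

impdp stats_user/password parfile=impdp_audit.par
```

Since the audit table has no indexes, the QUERY predicate will still drive a full scan on the source, but only the filtered rows cross the network.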

Regards,
Hélio Dias
