Re: Extracting 2.5 million rows

From: Jonathan Lewis <>
Date: Thu, 7 Jan 2010 10:37:08 -0000
Message-ID: <>

<Cliff> wrote in message
> Hey all,
> I was looking for some guidance. I need to do some stat analysis on a
> production audit table. One problem is that there are no indexes on the
> audit table and spooling via sqlplus would take years.
> Using Export on the server, the disk space required for the dmp file is
> about 40 G ( per the DBA ). This space is not available on the UNIX box.
> Would running EXP locally be a valid option?
> I realize that any solution is going to take a while of processing time;
> it's just that I don't want to get 100-hours in and then find a fatal
> flaw.
> Thoughts?
> TIA,
> Cliff

How are you planning to do the analysis ?

If you're looking at exp then presumably you're thinking of using imp on another database. Have you considered the possibility of using transportable tablespaces, or do the restrictions make that option impossible ?
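As a rough sketch of the transportable tablespace route (assuming the audit table sits in its own tablespace, here called audit_ts, and skipping the self-containment check you'd run first with dbms_tts.transport_set_check):

```
-- on the source database: make the tablespace read only
alter tablespace audit_ts read only;

-- metadata-only export from the OS prompt (file stays small;
-- the bulk of the data travels as copied datafiles):
--   exp userid='sys/password as sysdba' transport_tablespace=y
--       tablespaces=audit_ts file=audit_tts.dmp

-- copy the datafile(s) to the target server, then plug in:
--   imp userid='sys/password as sysdba' transport_tablespace=y
--       datafiles='/u02/oradata/audit01.dbf' file=audit_tts.dmp
```

The attraction is that you never write a 40 GB dump file - the dump holds only metadata - but you do still need room for the copied datafiles on the target, and the usual restrictions (matching block size, self-contained tablespace, compatible platforms on older versions) apply.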

If you're going to use exp "locally" - which in your case means at a client machine - remember that you're going to have to pump all that data across a network, and that may introduce a different set of problems, and need some adjustments to your SQL*Net configuration before you begin.
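One of the adjustments I have in mind is the session data unit - for a bulk pull like this a larger SDU in the client's tnsnames.ora can cut the number of round trips. A sketch, with purely illustrative host and service names:

```
# tnsnames.ora entry (names are illustrative)
PROD_AUDIT =
  (DESCRIPTION =
    (SDU = 32767)    # larger session data unit for bulk transfer
    (ADDRESS = (PROTOCOL = TCP)(HOST = prodhost)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = prod))
  )
```

The SDU has to be negotiated, so the listener side may need a matching setting before the larger value takes effect; check the Net Services documentation for your version.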

If SQL*Net issues don't worry you, and you're going to be moving the data into another Oracle database, then how about:

    create table audit_clone nologging
    as select (required columns)
    from audit_table@remote ?

If you're sticking with exp/imp remember that you can add a "where clause" to exp, so if there's some easy way to break the table into discrete data sets you could export it in several pieces - you'd have to scan the entire table to generate each file, but if your critical limit is the amount of filesystem space you can acquire at any one time, it gives you an option for doing the job in pieces.
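For example, if the audit rows carry a timestamp you could split on date ranges with a parameter file per piece - the names and dates here are purely illustrative:

```
# parfile for piece 1 of N (illustrative names and dates)
userid=scott/tiger
tables=audit_table
query="where audit_ts <  to_date('01-JUL-2009','DD-MON-YYYY')"
file=audit_part1.dmp
direct=n
```

Then run each piece in turn with something like exp parfile=audit_part1.par, freeing the filesystem space between runs. Note that the QUERY option isn't allowed with a direct path export (direct=y), so each piece goes through the conventional path - which is one reason every piece costs you a full tablescan.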


Jonathan Lewis
Received on Thu Jan 07 2010 - 04:37:08 CST
