Re: Extracting 2.5 million rows

From: Robert Klemme <shortcutter_at_googlemail.com>
Date: Thu, 07 Jan 2010 10:57:37 +0100
Message-ID: <7qlpgiFj7cU1_at_mid.individual.net>



On 01/07/2010 01:46 AM, Cliff wrote:
> Hey all,
>
> I was looking for some guidance. I need to do some stat analysis on a
> production audit table. One problem is that there are no indexes on the
> audit table and spooling via sqlplus would take years.
>
> Using Export on the server, the disk space required for the dmp file is
> about 40 GB (per the DBA). This space is not available on the UNIX box.
>
> Would running EXP locally be a valid option?
>
> I realize that any solution is going to take a while of processing time;
> it's just that I don't want to get 100 hours in and then find a fatal flaw.
>
> Thoughts?

Another option besides the ones already mentioned is to temporarily mount a file system of appropriate size on the _server_ and use Data Pump to export the table there, on the server. Then unmount the file system, or do whatever you need to do with the dump file as part of your analysis.
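
Roughly, and with all names made up for illustration (mount point, directory object, schema and table), a server-side Data Pump export could look like this:

  -- as a DBA, point a directory object at the temporarily mounted space
  CREATE DIRECTORY dp_export AS '/mnt/dpexport';
  GRANT READ, WRITE ON DIRECTORY dp_export TO app;

  -- then, from a shell on the server (names and parallelism are only a sketch):
  expdp app/secret TABLES=audit_tbl DIRECTORY=dp_export \
        DUMPFILE=audit_tbl_%U.dmp PARALLEL=4 LOGFILE=audit_tbl_exp.log

The %U in DUMPFILE lets Data Pump write multiple smaller files when running in parallel, which also makes it easier to move the pieces off the box afterwards.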

You can also use Data Pump to copy the table directly from one instance to another, which avoids the large export file altogether. Then you can create indexes and do whatever analysis you want on the other DB server without impacting the production system. There are, however, some caveats; off the top of my head, the default character sets of the two databases should be identical.
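
A rough sketch of that network-mode copy, run on the analysis database and assuming a database link (here called prodlink) pointing at production plus the same made-up names as above:

  -- on the analysis database
  CREATE DATABASE LINK prodlink
    CONNECT TO app IDENTIFIED BY secret USING 'PRODDB';

  -- from a shell on the analysis host: impdp pulls the table across the
  -- link directly, so no dump file is written on either side
  impdp app/secret TABLES=audit_tbl NETWORK_LINK=prodlink \
        DIRECTORY=dp_export LOGFILE=audit_tbl_copy.log

The DIRECTORY here is only needed for the log file; the data itself travels over the link.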

If you do need to do this analysis repeatedly, then maybe a materialized view on the analysis database, refreshed manually via a DB link, could help.
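
Again only a sketch, reusing the hypothetical prodlink database link: a complete-refresh MV that you refresh by hand whenever you want a fresh snapshot of the audit table.

  CREATE MATERIALIZED VIEW audit_tbl_mv
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS SELECT * FROM audit_tbl@prodlink;

  -- later, whenever a new snapshot is needed:
  EXEC DBMS_MVIEW.REFRESH('AUDIT_TBL_MV', 'C');

You can then index audit_tbl_mv on the analysis side however you like without touching production.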

Kind regards

        robert

-- 
remember.guy do |as, often| as.you_can - without end
http://blog.rubybestpractices.com/
