Re: Very Large Database, dont know what my options are

From: R. Schierbeek <byteNospamLife_at_gmail.com>
Date: Thu, 23 Oct 2008 20:25:56 +0200
Message-ID: <b15f4$4900c1df$5ed16949$1325@cache6.tilbu1.nb.home.nl>


<victor.s.email_at_gmail.com> wrote
> Hi,
> I'm a newbie DBA and a little bit stuck with a problem with an
> Oracle database, and I would appreciate your help. I have a
> database of billions of records, and I need to dump a set of
> data for investigation. I was using SELECT * OUTFILE... Because
> of the size of the DB and the lack of indexes (the data model
> was poorly designed from the get-go), the query ran for 4 hours
> and I was cut off, as the Oracle profile for every user has been
> set up to only allow a query to run for 4 hours. So I was
> wondering:
>
> a) Without creating indexes, is there a way I can do a plaintext
> file dump of a set of data from a table more efficiently? If not,
> other than changing the user profile to allow unlimited query cpu/
> session resource, are there any other means?
>
> b) I'm anticipating that the data the analyst needs to
> investigate will have to be stored in a table of some sort for
> querying. Aside from creating indexes on that table, what other
> efficiency improvements would you advise to make sure this new
> table won't take forever to query? I'm thinking about
> denormalizing the table.
> Thanks,
> Victor

Hello Victor,
If you're spooling that SELECT * output from SQL*Plus, try:

   SET trimspool on termout off

before your SELECT.
You might be surprised at the speed improvement.
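
For example, a minimal spool script could look something like
this (the schema, table, filter column and output path below are
only placeholders, so adjust them to your own):

   -- trimspool strips trailing blanks; termout off stops echoing
   -- every row to the screen
   SET trimspool ON termout OFF
   -- drop the "rows selected" message, headings and page breaks,
   -- and widen the output lines
   SET feedback OFF pagesize 0 linesize 1000
   -- write the rows straight to a flat file
   SPOOL /tmp/extract.txt
   SELECT * FROM some_schema.big_table
    WHERE some_date_col >= DATE '2008-01-01';
   SPOOL OFF

One thing to watch: TERMOUT OFF only takes effect when the
commands come from a script file (run with @extract.sql), not
when you type them in interactively, and it is usually where most
of the gain comes from, since SQL*Plus no longer has to paint
every row on the terminal while it spools.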

cheers

   Roelof Schierbeek
