Re: Very Large Database, dont know what my options are

From: gazzag <>
Date: Wed, 22 Oct 2008 04:11:03 -0700 (PDT)
Message-ID: <>

On 21 Oct, 01:57, "" <> wrote:
> Hi,
>      I'm a newbie DBA, and I'm a little stuck with a problem on an
> Oracle Database and I would appreciate your help.  I have a database
> of billions of records.  I need to dump a set of data for
> investigation.  I was using SELECT * OUTFILE... Because of the size of
> the DB and lack of indexes (data model was poorly designed from the
> get-go), the query ran for 4 hours, and I was cut off, as Oracle has
> set up the profile of every user to only allow a query to run for 4
> hours.  So I was wondering
>      a) Without creating indexes, is there a way I can do a plaintext
> file dump of a set of data from a table more efficiently?  If not,
> other than changing the user profile to allow unlimited query cpu/
> session resource, are there any other means?
>      b) I'm anticipating that the data the analyst needs to
> investigate will have to be stored in a table of some sort for
> querying.  Aside from creating indexes on that table, what other
> efficiency improvements would you advise to make sure this new table
> won't take forever to query?  I'm thinking about denormalizing it.
> Thanks,
> Victor

I agree with Joel Garry here. I obviously don't know the business case, but I am always suspicious of such requests. Surely the best place for the data is the RDBMS? Surely the best tool for manipulating the data is the RDBMS?
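For what it's worth, if an extract really is unavoidable, a parallel direct-path copy into a working table inside Oracle will usually beat a plain four-hour SELECT, and it answers the OP's point (b) at the same time. A minimal sketch, with the table name, column, and degree of parallelism all hypothetical:

```
-- (b) Keep the working set inside Oracle: build a smaller
-- investigation table with a parallel, direct-path (NOLOGGING) CTAS.
-- BIG_TABLE, EVENT_DATE and the predicate are made-up examples.
CREATE TABLE investigation_data
  PARALLEL 4 NOLOGGING AS
  SELECT /*+ FULL(t) PARALLEL(t 4) */ *
  FROM   big_table t
  WHERE  event_date >= DATE '2008-01-01';

-- (a) If a flat file is still required, spool from the (much smaller)
-- working table in SQL*Plus rather than scanning the full table again:
-- SET TERMOUT OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 4000 TRIMSPOOL ON
-- SPOOL /tmp/extract.txt
-- SELECT ... FROM investigation_data;
-- SPOOL OFF
```

The CTAS still does one full scan, but direct-path parallel inserts avoid most redo/undo overhead, so it may well fit inside the four-hour profile limit where the original query did not.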

I was once asked by a manager if I could "dump" a database into MS Excel (*shudder*) for him. The database, while not huge, is 9 GB in size and consists of 400-odd tables. The politest way I could tell him that he obviously didn't know what he was talking about was to inform him that Excel is limited to 65,536 rows per sheet :)

-g
