Re: Very Large Database, don't know what my options are
Date: Mon, 20 Oct 2008 18:03:10 -0700 (PDT)
Message-ID: <5a14a225-973c-4f05-9680-bb92c315a4ca@k13g2000hse.googlegroups.com>
On Oct 20, 8:57 pm, "victor.s.em..._at_gmail.com"
<victor.s.em..._at_gmail.com> wrote:
> Hi,
>
> I'm a newbie DBA and a little bit stuck with a problem with an
> Oracle Database and I would appreciate your help. I have a database
> of billions of records. I need to dump a set of data for
> investigation. I was using SELECT * OUTFILE... Because of the size of
> the DB and lack of indexes (data model was poorly designed from the
> get-go), the query ran for 4 hours, and I was cut off, as Oracle has
> the profile of every user set up to only allow a query to run for 4
> hours. So I was wondering
>
> a) Without creating indexes, is there a way I can do a plaintext
> file dump of a set of data from a table more efficiently? If not,
> other than changing the user profile to allow unlimited query cpu/
> session resource, are there any other means?
>
> b) I'm anticipating that the data the analyst needs to investigate
> will have to be stored in a table of some sort for querying.
> Aside from creating indexes on that table, what other efficiency
> improvements would you advise to make sure that this new table won't
> take forever to query? I'm thinking about denormalizing the table.
>
> Thanks,
> Victor
Hi Victor,
It sounds like you don't have DBA access to the database if you are getting knocked off after a period of time. I would suggest asking the DBA for some options based on exactly what data you need.
I will say that if you are extracting most of the data anyway, an index may not help; a bulk extract like that is usually served best by a full table scan.
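For (a), if the DBA will give you a one-off window (or a profile without the 4-hour limit), a plain SQL*Plus spool is usually the simplest way to get a flat file out without touching indexes. This is only a rough sketch; BIG_TABLE, the CREATED_DATE filter, and the parallel degree of 4 are placeholders you would swap for your own names and whatever your DBA approves:

    SET PAGESIZE 0
    SET FEEDBACK OFF
    SET TERMOUT OFF
    SET TRIMSPOOL ON
    SET LINESIZE 32767
    SET COLSEP ','
    SPOOL /tmp/extract.csv

    -- full scan in parallel; there are no indexes to use anyway
    SELECT /*+ FULL(t) PARALLEL(t, 4) */ *
      FROM big_table t
     WHERE created_date >= DATE '2008-01-01';

    SPOOL OFF

The WHERE clause is what really saves you time here, so restrict it to only the rows the analyst actually needs; the hint just tells Oracle to do the unavoidable full scan in parallel.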
Also, always provide version information (OS and database) and as much detail as possible, as the advice you get will be better.
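For (b), rather than spooling to a file and loading it back, you could also build the analyst's working table directly in the database with CREATE TABLE ... AS SELECT and then index only the columns they will actually filter on. Again, just a sketch with made-up names (ANALYSIS_EXTRACT, CUSTOMER_ID); check with your DBA before running anything NOLOGGING or parallel on a production box:

    -- copy only the rows of interest, with minimal redo
    CREATE TABLE analysis_extract
      NOLOGGING PARALLEL 4
    AS
    SELECT /*+ FULL(t) PARALLEL(t, 4) */ t.*
      FROM big_table t
     WHERE created_date >= DATE '2008-01-01';

    -- index the column(s) the analyst will filter on
    CREATE INDEX analysis_extract_ix1
      ON analysis_extract (customer_id)
      NOLOGGING PARALLEL 4;

    -- gather statistics so the optimizer has something to work with
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'ANALYSIS_EXTRACT')

Denormalizing can help, but fresh statistics and a couple of well-chosen indexes usually get you most of the way.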
HTH,
Steve

Received on Mon Oct 20 2008 - 20:03:10 CDT