Very Large Database, don't know what my options are

From: <victor.s.email_at_gmail.com>
Date: Mon, 20 Oct 2008 17:57:54 -0700 (PDT)
Message-ID: <0399f6c8-ed0c-4e5d-b1fc-34278dad80e6@k37g2000hsf.googlegroups.com>


Hi,

I'm a newbie DBA and a little bit stuck with a problem on an Oracle database, and I would appreciate your help. I have a database of billions of records, and I need to dump a set of data for investigation. I was using SELECT * OUTFILE... Because of the size of the DB and the lack of indexes (the data model was poorly designed from the get-go), the query ran for 4 hours and was then cut off, as every user's profile has been set up to allow a query to run for at most 4 hours. So I was wondering:

  1. Without creating indexes, is there a way I can dump a set of data from a table to a plaintext file more efficiently? If not, other than changing the user profile to allow unlimited query CPU/session resources, are there any other options?
  2. I'm anticipating that the data the analyst needs to investigate will have to be stored in a table of some sort for querying. Aside from creating indexes on that table, what other efficiency improvements would you advise to make sure this new table won't take forever to query? I'm thinking about denormalizing the table.
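For (1), one common low-tech approach is to spool the result set straight to a flat file from SQL*Plus, so rows stream out as the full scan produces them. This is only a sketch: the table name, column, date filter, file path, and parallel degree below are all placeholders, and the PARALLEL hint only helps if the edition and configuration allow parallel query.

```sql
-- Hypothetical example: table, column, and file names are placeholders.
-- Run in SQL*Plus; SPOOL writes the query output to a flat file.
SET TERMOUT OFF      -- don't echo every row to the screen
SET FEEDBACK OFF     -- suppress "N rows selected"
SET HEADING OFF      -- no column headers in the dump
SET PAGESIZE 0       -- no page breaks
SET LINESIZE 32767   -- avoid wrapping long rows
SET TRIMSPOOL ON     -- strip trailing blanks from each line

SPOOL /tmp/dump.txt

-- A full-table scan is unavoidable without indexes; the PARALLEL hint
-- (where permitted) lets Oracle divide the scan among slave processes.
SELECT /*+ FULL(t) PARALLEL(t, 4) */ *
  FROM big_table t
 WHERE t.sample_date >= DATE '2008-01-01';

SPOOL OFF
```

If the 4-hour cut-off still bites, a DBA can relax the limit for just the one account's profile (e.g. `ALTER PROFILE some_profile LIMIT CPU_PER_SESSION UNLIMITED;` for a CPU-based limit, or the corresponding `CONNECT_TIME` parameter if it is an elapsed-time limit) rather than loosening it for every user.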
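For (2), the usual pattern is to materialize the investigation subset as its own denormalized table once, then let the analyst query that small, pre-joined copy instead of the billion-row originals. Again a sketch with placeholder table and column names: CREATE TABLE ... AS SELECT (CTAS) with NOLOGGING and PARALLEL keeps the build fast, and afterwards you index only what the analyst filters on and gather optimizer statistics.

```sql
-- Hypothetical example: table and column names are placeholders.
-- CTAS with NOLOGGING/PARALLEL builds the copy with minimal redo.
CREATE TABLE investigation_set
  NOLOGGING
  PARALLEL 4
AS
SELECT /*+ FULL(t) PARALLEL(t, 4) */
       t.id, t.sample_date, t.amount,
       c.customer_name            -- pre-join (denormalize) the lookup data
  FROM big_table t
  JOIN customers c ON c.id = t.customer_id
 WHERE t.sample_date >= DATE '2008-01-01';

-- Index only the columns the analyst actually filters on,
-- and gather statistics so the optimizer has accurate numbers:
CREATE INDEX investigation_set_dt_ix ON investigation_set (sample_date);
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'INVESTIGATION_SET');
```

Because the copy is written once and then only read, denormalizing it costs nothing in update anomalies, which is what makes it a reasonable trade-off here.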

Thanks,
Victor