Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Re: Export / import of a db

From: Mark Bole <makbo_at_pacbell.net>
Date: Tue, 31 May 2005 22:53:23 GMT
Message-ID: <DV5ne.790$IE7.613@newssvr21.news.prodigy.com>


DA Morgan wrote:

> DMina_at_laiki.com wrote:
>
>> The database size is approximately 58GB (used space) and the exported
>> dump file is 28GB.
>
> I don't like any of the answers that quickly come to mind ... so here's
> what I'd do.
>
> 1. Create the exact same tablespaces that previously existed
> 2. Import into a schema created temporarily just for the import
> 3. Move the data to the desired storage locations in the final schema
> 4. Dump the temporary schema
> 5. Dump the extra tablespaces
>
> A pain but it solves the problem and protects you against many possible
> accidents.
>
> But if someone has a better idea, and I hope they do, I'd like to hear it.

Here's a modification of the above which minimizes overhead inside the new database: the data gets written there only once, instead of twice.

Create a second export with ROWS=n, import it and move everything around as above (fast because all the tables have zero rows), and then import the first, data-laden export with IGNORE=y.
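For illustration, a sketch of those two passes with the classic exp/imp utilities (the file names, credentials, and FULL=y scope are illustrative assumptions, not details from the original post):

```shell
# Pass 1: metadata-only export (ROWS=n skips the table data)
exp system/***** FILE=meta.dmp FULL=y ROWS=n

# Import the empty objects, then move/relocate them as described above
imp system/***** FILE=meta.dmp FULL=y

# Pass 2: load the data from the original, data-laden export;
# IGNORE=y suppresses the "object already exists" errors
imp system/***** FILE=full.dmp FULL=y IGNORE=y
```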

Alternatively, using the same ROWS=n export, run IMP with the INDEXFILE option. This writes out a text file containing all your index creation statements and (commented-out) table creation statements. You then edit this file to use the new storage parameters (probably no harder than writing all the "ALTER TABLE ... MOVE" and "ALTER INDEX ... REBUILD" statements you would need otherwise) and run it to pre-create all your tables (empty) and indexes in the correct locations. Then import the data with IGNORE=y.
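A sketch of that INDEXFILE pass (again, file names and credentials here are illustrative):

```shell
# Write the DDL to a text file; no rows and no objects are actually imported
imp system/***** FILE=meta.dmp FULL=y INDEXFILE=ddl.sql

# Edit ddl.sql: uncomment the CREATE TABLE statements and adjust the
# TABLESPACE / STORAGE clauses, then run it from SQL*Plus:
#   SQL> @ddl.sql

# Finally, load the data into the pre-created objects
imp system/***** FILE=full.dmp FULL=y IGNORE=y
```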

Since you asked what else to keep in mind:

  1. If you break your export up into several smaller pieces using the TABLES=(...) parameter, you can run the imports in parallel, which ought to give at least some speed improvement if you have more than one CPU.
  2. You might as well turn off ARCHIVELOG mode until the import is over, then turn it back on and take a backup (if appropriate).
  3. You should use non-default values for the BUFFER and COMMIT parameters when importing the data.
  4. Using some variations on the scenario above (which I won't go into here), you can defer index creation, statistics gathering, and constraint validation (all time-consuming tasks) until a time of your choosing, after the table data is imported.
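To make points 1-3 concrete, here is a hedged sketch; the table names, file names, credentials, and buffer size are all illustrative assumptions:

```shell
# (2) Disable archiving before the import -- requires a bounce, from SQL*Plus:
#       SHUTDOWN IMMEDIATE
#       STARTUP MOUNT
#       ALTER DATABASE NOARCHIVELOG;
#       ALTER DATABASE OPEN;

# (1) Split the export by table groups...
exp system/***** FILE=part1.dmp TABLES=(scott.emp,scott.dept)
exp system/***** FILE=part2.dmp TABLES=(scott.bonus,scott.salgrade)

# ...then run the imports in parallel, (3) with a non-default BUFFER
# (bytes) and periodic commits instead of one commit per table
imp system/***** FILE=part1.dmp BUFFER=10485760 COMMIT=y &
imp system/***** FILE=part2.dmp BUFFER=10485760 COMMIT=y &
wait

# Afterwards: re-enable ARCHIVELOG mode the same way and take a backup.
```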

Breaking the task into smaller steps makes it easier to test, easier to reproduce, and easier to restart if interrupted.

-Mark Bole

Received on Tue May 31 2005 - 17:53:23 CDT

