Re: Re-organising a database?

From: Juergen Lueters <jlueters_at_intranet-engineering.de>
Date: Mon, 14 Apr 2003 14:53:21 +0200
Message-ID: <pan.2003.04.14.12.53.15.854234@intranet-engineering.de>


On Mon, 14 Apr 2003 02:19:22 -0700, Tim Kearsley wrote:

> Hi all,
>
> Environment is:
>
> Oracle 8.1.7 on AIX 4.3.3 on RS6000 SP hardware
>
> I've got a legacy database, about 30 Gbytes in size, which I would
> very much like to re-organise. Primarily I would like to implement
> LMTs and to put the datafiles into a structure which is fairly close
> to OFA.
>
> The obvious solution I suppose is to create the new database and
> export/import from the existing one. My problem is the time this
> appears to be taking. A trial took some two days by the time I had
> imported, re-created indices and solved a few problems with invalid
> packages. The database is pretty heavily used and it is going to be
> hard for me to sell a weekend's downtime to the users.
>
> Are there any other techniques I could use to reduce this downtime? I
> admit that I haven't tried tuning the export/import process. At
> present the export is done to a Unix pipe which compresses the dump
> file on the fly. Likewise the trial import was done by uncompressing
> the dump file on the fly. Is it likely to be substantially faster to
> export to multiple files rather than compressing (I am facing an OS
> filesize limit)?
>
> Any thoughts or ideas very welcome.
>
> Thank you.
>
> Tim Kearsley
> Database Manager
> Milton Keynes Council

Hi Tim
I'd like to suggest a modified export/import scheme which is much faster. The idea is to split the import into two chunks: first import the data, the packages and the referential integrity constraints, then (being online again) create the indexes in bulk/batch fashion. The procedure goes as follows:

0. Read "How to stop defragmenting and start living: The definitive word on fragmentation". You can get it from otn.oracle.com.

  1. Do a full database export without generating any statistics. This is dump no. 1 (example exp calls follow after this list).
  2. Do a second full database export with ROWS=N. This generates a small dump, no. 2, which describes only the database structure.
  3. Create a new database
  4. Import dump no. 2 with the parameter INDEXFILE=<indexfile>. This gives you a file containing all index definitions and nothing else (see the example imp calls after the list).
  5. Plan the index creation in parallel: <indexfile> is plain ASCII, so split it into several detail scripts and write a master script that starts them all in parallel. The goal is to keep all available processors busy (a simple master script is sketched below).
  6. Import dump no. 1 with INDEXES=N, so that _no indexes are created_ now. This should complete in a relatively short time.
  7. You are back in business. You can now bring the database online and allow traffic. Be prepared for poor performance, as there are no indexes yet.
  8. Start your master script and generate the missing indexes in parallel.
  9. Generate the statistics to keep the optimizer happy.
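
As a rough sketch, the two exports (steps 1 and 2) might look like this on 8.1.7; the SYSTEM account and all file names are only placeholders:

  # dump no. 1: full export with data, no statistics
  exp system/<password> FULL=Y ROWS=Y STATISTICS=NONE \
      FILE=full_data.dmp LOG=full_data.log

  # dump no. 2: structure only, describes the database without any rows
  exp system/<password> FULL=Y ROWS=N STATISTICS=NONE \
      FILE=full_norows.dmp LOG=full_norows.log

  # if the OS file-size limit is a problem, exp in 8i can also split the
  # dump over several files instead of compressing through a pipe, e.g.
  #   FILE=full_1.dmp,full_2.dmp,full_3.dmp FILESIZE=1900M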

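For steps 4 and 6 the corresponding imp calls could look roughly like this (again only a sketch; the file names and parameter values are assumptions):

  # step 4: write all CREATE INDEX statements to a plain-ASCII file;
  # nothing is imported into the database by this call
  imp system/<password> FULL=Y FILE=full_norows.dmp \
      INDEXFILE=create_indexes.sql LOG=indexfile.log

  # step 6: import the data but skip index creation
  imp system/<password> FULL=Y FILE=full_data.dmp INDEXES=N \
      IGNORE=Y COMMIT=Y BUFFER=10485760 LOG=full_import.log
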
The whole process executes much faster, but needs more planning and preparation.
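
For steps 5, 8 and 9, one very simple master script is sketched below. The split into four scripts and the APP_OWNER schema are assumptions; in practice you would edit create_indexes.sql by hand, balance the large indexes across the detail scripts and perhaps add NOLOGGING and PARALLEL clauses to the bigger CREATE INDEX statements.

  # idx_1.sql ... idx_4.sql are hand-made slices of create_indexes.sql;
  # start one SQL*Plus session per slice and wait for all of them
  for f in idx_1.sql idx_2.sql idx_3.sql idx_4.sql
  do
      nohup sqlplus system/<password> @$f > ${f}.log 2>&1 &
  done
  wait

  # step 9: regenerate optimizer statistics, e.g. per application schema
  echo "exec dbms_stats.gather_schema_stats(ownname => 'APP_OWNER')" | \
      sqlplus -s system/<password>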

Juergen

Received on Mon Apr 14 2003 - 07:53:21 CDT
