
RE: A special way of migration

From: Andre van Winssen <>
Date: Wed, 28 Sep 2005 09:24:39 +0200
Message-ID: <008001c5c3fd$b30700d0$0b01a8c0@KAST01>


Here are some thoughts and experiences:

Exports can be made in a consistent way (consistent=y), meaning that all data is logically from the same point in time, which is usually what you want.

Imports can be sped up considerably by using a large buffer size (and commit=y!) and by making sure no character set translation has to take place between the export server and the import server. Import is not happy with the LONG and LONG RAW datatypes; it inserts those rows one by one instead of in arrays. SQL*Plus's COPY command is a workaround for the slow import of LONGs, because it can use arrays even with LONGs.
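As a rough sketch of the parameters mentioned above (connect strings, file names and the buffer size are placeholders you would adapt to your own environment):

```shell
# Consistent export from the old server: all table data as of one SCN
exp system/manager@olddb full=y consistent=y file=full.dmp log=exp_full.log

# Import on the new server: large buffer, commit per array insert
imp system/manager@newdb full=y commit=y buffer=10485760 file=full.dmp log=imp_full.log
```

To avoid character set conversion during import, the client's NLS_LANG should match the target database's character set.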

The issue with database links and SQL*Plus COPY is that you have to ensure yourself that the data copied across the tables of a schema is consistent, i.e. if you have to copy tables A..Z, then no one else should be updating these tables until you are finished. Otherwise the data on the import (target) server may lose its referential integrity.
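A minimal sketch of the SQL*Plus COPY workaround for LONG columns (the connect strings and table name are hypothetical; ARRAYSIZE and COPYCOMMIT control the array fetch size and commit frequency):

```sql
SET ARRAYSIZE 1000
SET COPYCOMMIT 10
SET LONG 2000000

COPY FROM scott/tiger@olddb -
TO scott/tiger@newdb -
CREATE emp_copy -
USING SELECT * FROM emp;
```

With these settings, COPY commits every 10 array batches (10 x 1000 rows), which keeps undo usage modest on large tables.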

Andre van Winssen

-----Original Message-----

From: [] On Behalf Of
Sent: Tuesday, September 27, 2005 4:10 PM
To:
Subject: A special way of migration

Hi list,

I have an unusual question, I think, but I am curious about your opinion on this.

I am often in the situation of having to move a production database from an old DB server to a new one. In some cases I am also instructed to upgrade the Oracle version. Naturally, I use export/import to get all the data of our application into the new database.

Sometimes the databases are quite big (about 20 to 200 GB), so the import and some other steps take a long time. The consequence, of course, is having to do it on weekends or after business hours.

So far so good. There is no problem so far, but I was thinking about another solution for this task. (Maybe I am just lazy? ;-)

Here is my idea.

Write a package to get the application data via a database link into the new database. The easiest way is "create table ... as select * from ...@db_link".

This is very fast, as my first tests have shown.
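A minimal sketch of this approach, run on the new database (the link name, credentials, and table are placeholders; the TNS alias 'OLDDB' is assumed to resolve to the old server):

```sql
-- One-time setup: a link from the new database back to the old one
CREATE DATABASE LINK old_db
  CONNECT TO scott IDENTIFIED BY tiger
  USING 'OLDDB';

-- Pull each application table across the link
CREATE TABLE emp AS SELECT * FROM emp@old_db;
```

Note that CREATE TABLE AS SELECT carries over column datatypes and NOT NULL constraints, but not indexes, other constraints, or column defaults, which is why those have to be recreated afterwards.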

After that, I create all indexes via some dynamic SQL, and the same for all sorts of constraints. Finally, I also restore all defaults for the table columns via dynamic SQL.
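One possible shortcut for the dynamic SQL step, assuming the source database is 9i or later: DBMS_METADATA can generate the DDL for indexes and constraints directly, so the package does not have to reassemble it from the dictionary views by hand. The schema name here is a placeholder:

```sql
-- Generate CREATE INDEX statements for one schema over the link
SET LONG 1000000
SELECT dbms_metadata.get_ddl('INDEX', index_name, owner)
FROM   dba_indexes@old_db
WHERE  owner = 'SCOTT';

-- Likewise for constraints
SELECT dbms_metadata.get_ddl('CONSTRAINT', constraint_name, owner)
FROM   dba_constraints@old_db
WHERE  owner = 'SCOTT'
AND    constraint_type IN ('P', 'U', 'C');
```

Each returned CLOB can then be executed on the new database with EXECUTE IMMEDIATE inside the package.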

The advantage for me is that I can start one procedure, enjoy my weekend, and just have to look for problems after everything is done. It seems to be safe for the old production database, so in the worst case we can simply cancel the installation of the new DB server while the production database remains available.

What did I miss? Where are the pitfalls?

I have started to write the package. It is working in a test scenario with a small database.

Some things are still missing. Views are not implemented yet, and triggers are not ready either. The packages of the application are no problem, because they are part of the application's installation, so I can install them after the process is done. If they compile without errors, that is also a good sign for the whole "migration" of the data, because of the dependencies. Sequences are also not completely handled yet.

So, what is your opinion?

Thanks in advance



-- Received on Wed Sep 28 2005 - 02:26:53 CDT
