RE: Data Migration options - oracle to oracle

From: Nabil Jamaleddin <>
Date: Mon, 9 Sep 2013 15:48:21 -0500
Message-ID: <006a01cead9d$ec4a5c10$c4df1430$>

Try a two-step approach.

First, use Data Pump for the tables that migrate over easily, i.e. the ones with no foreign keys and no transformation.
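If there is no air gap between the two databases, you can even skip the dump file for those simple tables and pull them straight over a database link. A rough sketch (the user, link and table names are just placeholders):

impdp karth/pass@target_db network_link=source_db tables=emp,dept content=data_only table_exists_action=append

That appends the data into existing target tables without touching your PL/SQL objects.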


I would use option #1, i.e. SQL scripts doing INSERT as SELECT over a DB link, with PL/SQL blocks for the data transformation, for the remaining tables.

Maybe you can write some self-generating SQL to do most of the work for you, then tweak each INSERT/SELECT statement for the transformations.

For example, write a query that generates the INSERT statements, as below:

select 'insert into '||table_name||' select * from '||table_name||'@source_db;' from user_tables;

Now the code above is very simple, but it shows the idea of generating code to help write your procedure. To make it work you will want to join against user_tab_columns and list the column names explicitly.
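For instance, a version that lists the columns explicitly might look like this (a sketch assuming 11.2's LISTAGG; the DB link name is a placeholder):

select 'insert into '||table_name||' ('
       ||listagg(column_name, ', ') within group (order by column_id)
       ||') select '
       ||listagg(column_name, ', ') within group (order by column_id)
       ||' from '||table_name||'@source_db;'
from user_tab_columns
group by table_name;

From there you hand-edit only the generated statements for tables that need transformation.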

There is also the COPY command. COPY is a nice way to move data from one database to another without a DB link. But be careful with NUMBER fields if you have COPY create the target table, as it will strip out decimal points. It does not work with LOBs, but it will move LONG data over. It can do simple transformations too.

For example, if run on the target database:

set copycommit 500 arraysize 500 long 9999999

copy from nabil/pass@source_db insert emp using select * from emp;
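Since COPY takes an arbitrary query, a simple transformation can go straight into the USING clause. A made-up example (the target table and columns are invented for illustration):

copy from nabil/pass@source_db insert emp_names (empno, ename) using select empno, upper(ename) from emp;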

help copy


 Copies data from a query to a table in the same or another database. COPY supports CHAR, DATE, LONG, NUMBER and VARCHAR2.

 COPY {FROM database | TO database | FROM database TO database}

            {APPEND|CREATE|INSERT|REPLACE} destination_table
            [(column, column, column, ...)] USING query

 where database has the following syntax:

-----Original Message-----
From: [] On Behalf Of Karth Panchan
Sent: Thursday, September 05, 2013 11:14 PM
To:
Subject: Re: Data Migration options - oracle to oracle

Thanks very much for the response and details.

Here is a little more background information.

The old application's data resides in around 600 tables in each schema, for our 1000+ customers (1000+ schemas).
The new application needs data migrated from only 450 of those tables, into a single schema on the target.
Table sizes range from 1 record to 1 million, not more than that. We plan to migrate 100 customers per month, but we get only 24 hours to migrate those 100 customers.

Both applications run their databases on Oracle 11g. The new (target) database runs on a 2-node RAC on 64-bit Linux.

50% of the table data needs transformation on its way to the target database, e.g. one source table inserted into multiple target tables, or multiple source tables combined into one target table.

Most of the target tables have triggers enabled that need to capture the migrated data for audit.

Given that we need to migrate every month after we go live, existing customers will already be using these application tables. So we will not be able to disable constraints/indexes/triggers on the target database.

If we had all tables partitioned by customer, this would have been easier: load a staging table and exchange the partition. Unfortunately, I was told that option is a maintenance nightmare.

Currently, we have two solutions, both using a DB link:
1) SQL scripts: INSERT as SELECT over a DB link, with PL/SQL blocks for the data transformation
2) A DB package: INSERT as SELECT over a DB link, driven by a metadata table where the SQL is defined; the package executes it dynamically.
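To sketch what I mean by option 2 (the driver table and column names here are just illustrative, not our actual design):

begin
  for r in (select sql_text from migration_sql order by run_order) loop
    execute immediate r.sql_text;  -- each row holds an INSERT ... SELECT over the DB link
  end loop;
  commit;
end;
/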

Neither option is scalable enough, based on the production team's expectations.

With Data Pump, I need to find out how much transformation we can achieve.

Bottom line: we need to migrate only the data from the source to the target database, without impacting the production system, and with enough performance to migrate 100 customers' data (450 tables per customer) in 24 hours.

Thanks very much for your points. I will look into that.

Appreciate any further pointers.


On Thu, Sep 5, 2013 at 7:12 AM, Norman Dunbar <>wrote:

> Hi Karth,
> On 05/09/13 11:53, Karth Panchan wrote:
> > Need your suggestion to migrate data from oracle to oracle db
> > instance
> for OLTP application.
> >
> > 1) This migration need 50% table data transformation from source db
> > to
> target db
> You can use datapump to export the data, indexes, constraints etc and
> exclude any already existing procedures, packages and functions in the
> target database - so you avoid replacing ones that might be locked for use.
> On importing into the target database, you can specify a
> transformation procedure that will do the table data transformation that
you require.
> If you have an "air gap" between source and target, you'll need to
> create a dump file and move it between the two servers. On the other
> hand, if you don't have an air gap, you can create a database link in
> the target and simply do the import from there, without needing a dump
> file.
> > 2) Need to do migration with index/constraints/triggers enabled on
> Target OLTP instance during other users using this instance at
> production mode.
> When you import the data with impdp, use the content and
> table_exists_action parameters to specify that you only want the data
> (content=data_only) and on finding an existing table, append the data
> to it (table_exists_action=append).
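> Putting those together, a sketch of the import might look like this
> (the user, link and remap package names are placeholders, not anything
> from your setup):
>
> impdp karth/pass@target_db parfile=migrate.par
>
> where migrate.par contains:
>
> network_link=source_db
> content=data_only
> table_exists_action=append
> remap_data=emp.sal:migrate_pkg.transform_sal
>
> remap_data is the impdp parameter that calls your transformation
> function for each value of the named column during the import.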
> > Recommended GoldenGate and Streams at my place. There is a lack of
> knowledge and no one wants to take the risk.
> Likewise, I know nothing about these, yet, but I've (coincidentally)
> put some stuff on Datapump on my blog at
> I haven't got the transformation
> stuff up yet, so drop me an email if you need further info on this.
> (Via the list I suppose, in case anyone else is interested!)
> Cheers,
> Norm.
> --
> Norman Dunbar
> Dunbar IT Consultants Ltd
> Registered address:
> 27a Lidget Hill
> Pudsey
> West Yorkshire
> United Kingdom
> LS28 7LG
> Company Number: 05132767
> --




