RE: Data Migration options - oracle to oracle

From: <rajendra.pande_at_ubs.com>
Date: Fri, 6 Sep 2013 07:49:51 -0400
Message-ID: <7E4D006EA3F0D445B62672082A16A565010C4FA5_at_NSTMC703PEX.ubsamericas.net>



If I understand correctly, the issue is the data transformation and the DB links.
Assuming that to be the case, I would look into using online redefinition to put the transformed data into temporary staging tables on the source database, and then use exp/imp to move the data on the day of the migration.
Additionally, I suspect that with a proper partitioning scheme you should be able to move the data in quickly.
With online redefinition you can do the transformation on the source without incurring the overhead of DB links.
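
For what it's worth, a minimal sketch of that approach with DBMS_REDEFINITION - all names hypothetical (APP_OWNER, CUST_ORDERS, and a pre-created interim table CUST_ORDERS_STG with the target layout); the col_mapping parameter carries the transformation:

    BEGIN
      -- Check the table qualifies for online redefinition (the default
      -- method requires a primary key).
      DBMS_REDEFINITION.CAN_REDEF_TABLE('APP_OWNER', 'CUST_ORDERS');

      -- Copy the rows into the interim table, applying the expressions
      -- in col_mapping as they go.
      DBMS_REDEFINITION.START_REDEF_TABLE(
        uname       => 'APP_OWNER',
        orig_table  => 'CUST_ORDERS',
        int_table   => 'CUST_ORDERS_STG',
        col_mapping => 'order_id order_id, UPPER(status) status');

      -- Export CUST_ORDERS_STG with exp/expdp at this point, then abort
      -- the redefinition so the source table is never actually swapped.
      DBMS_REDEFINITION.ABORT_REDEF_TABLE('APP_OWNER', 'CUST_ORDERS',
                                          'CUST_ORDERS_STG');
    END;
    /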

HTH Regards
-----Original Message-----
From: oracle-l-bounce_at_freelists.org [mailto:oracle-l-bounce_at_freelists.org] On Behalf Of Karth Panchan
Sent: Friday, September 06, 2013 12:14 AM
To: oracle_at_dunbar-it.co.uk
Cc: oracle-l_at_freelists.org
Subject: Re: Data Migration options - oracle to oracle

Norm/Raj
Thanks very much for the response and details.

Here is a little more background information.

The old application's data resides in around 600 tables per schema, with one schema for each of our 1000+ customers (1000+ schemas).
The new application needs data migrated from only 450 of those tables, into a single schema on the target.
Table volumes range from 1 record to 1 million, no more than that. We plan to migrate 100 customers per month, but we get only a 24-hour window to migrate those 100 customers.

Both applications run their databases on Oracle 11g. The new (target) database runs on a 2-node RAC on 64-bit Linux.

50% of the table data needs transformation on its way to the target database: either one source table inserted into multiple target tables, or multiple source tables combined into one target table (a sketch of the first pattern follows).
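
To make that first pattern concrete, a minimal sketch with made-up table and column names (SRC_LINK being the DB link), using a multi-table INSERT ALL so one source table fans out into several target tables in a single pass:

    INSERT ALL
      -- Each source row feeds both target tables.
      INTO tgt_customer  (cust_id, cust_name) VALUES (cust_id, cust_name)
      INTO tgt_cust_addr (cust_id, city, zip) VALUES (cust_id, city, zip)
    SELECT cust_id, cust_name, city, zip
      FROM src_customer@src_link;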

Most of the target tables have triggers enabled that need to capture the migrated data for audit.

Given that we need to migrate every month after go-live, existing customers will already be using the application tables. So we will not be able to disable constraints/indexes/triggers on the target database.

If we had all tables partitioned by customer, it would have been easy to load a staging table and exchange the partition. Unfortunately, I was told that option is a maintenance nightmare (a rough sketch of what it would have looked like is below).
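
For completeness, the rejected approach would have looked roughly like this, with made-up names - load a staging table such as STG_ORDERS_C0042 from the source, then swap it in:

    -- Swap the loaded staging table with the customer's partition.
    ALTER TABLE tgt_orders
      EXCHANGE PARTITION cust_0042
      WITH TABLE stg_orders_c0042
      INCLUDING INDEXES WITHOUT VALIDATION;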

Currently, we have 2 solutions, both using a DB link:

1) SQL scripts doing INSERT as SELECT over the DB link, with PL/SQL blocks
for the data transformation.
2) A DB package doing INSERT as SELECT over the DB link, driven by a
metadata table where the SQL is defined; the package executes it dynamically.
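
Stripped down, option 1 looks like this (all names hypothetical; SRC_LINK is the DB link); option 2 builds the same statements from the metadata table and runs them via EXECUTE IMMEDIATE:

    -- Pull and transform rows from two remote tables into one target table.
    INSERT INTO tgt_orders (cust_id, order_id, status)
    SELECT c.cust_id, o.order_id, UPPER(o.status)
      FROM orders@src_link    o
      JOIN customers@src_link c ON c.cust_id = o.cust_id;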

Neither option scales well enough to meet the production team's expectations.

Using Data Pump, I need to find out how much of the transformation we can achieve.
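
From what I have seen so far, impdp's REMAP_SCHEMA and REMAP_DATA parameters may cover part of it. A rough sketch with made-up names, where XFORM_PKG.MAP_STATUS would be a PL/SQL function applied to each STATUS value on the way in:

    impdp system directory=DP_DIR dumpfile=cust0042.dmp \
          remap_schema=CUST0042:APP_TARGET \
          remap_data=CUST0042.ORDERS.STATUS:APP_TARGET.XFORM_PKG.MAP_STATUS \
          content=data_only table_exists_action=append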

Bottom line: we need to migrate only the data from source to target DB, without impacting the production system, with enough performance to migrate 100 customers' data (450 tables per customer) in 24 hours.

Thanks very much for your points. I will look into that.

Appreciate any further pointers.

Karth

On Thu, Sep 5, 2013 at 7:12 AM, Norman Dunbar <oracle_at_dunbar-it.co.uk> wrote:

> Hi Karth,
>
> On 05/09/13 11:53, Karth Panchan wrote:
>
> > Need your suggestion to migrate data from oracle to oracle db
> > instance for an OLTP application.
> >
> > 1) This migration needs 50% of the table data transformed from
> > source db to target db.
>
> You can use Data Pump to export the data, indexes, constraints etc. and
> exclude any already existing procedures, packages and functions in the
> target database - so you avoid replacing ones that might be locked for
> use.
>
> On importing into the target database, you can specify a transformation
> procedure that will do the table data transformation that you require.
>
> If you have an "air gap" between source and target, you'll need to
> create a dump file and move it between the two servers. On the other
> hand, if you don't have an air gap, you can create a database link in
> the target and simply do the import from there, without needing a dump
> file.
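>
> As a rough, hypothetical illustration (SRC_LINK being a database link
> created in the target database, CUST0042 a source schema):
>
>     impdp app_target network_link=SRC_LINK schemas=CUST0042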
>
> > 2) Need to do the migration with indexes/constraints/triggers enabled
> > on the target OLTP instance while other users are using the instance
> > in production mode.
>
> When you import the data with impdp, use the content and
> table_exists_action parameters to specify that you only want the data
> (content=data_only) and, on finding an existing table, append the data
> to it (table_exists_action=append).
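>
> For example (the directory object and dump file names are made up):
>
>     impdp app_target directory=DP_DIR dumpfile=cust0042.dmp \
>           content=data_only table_exists_action=append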
>
> > GoldenGate and Streams were recommended at my place. There is a lack
> > of knowledge and no one wants to take the risk.
>
> Likewise, I know nothing about these yet, but I've (coincidentally) put
> some stuff on Data Pump on my blog at
> http://qdosmsq.dunbar-it.co.uk/blog. I haven't got the transformation
> stuff up yet, so drop me an email if you need further info on this. (Via
> the list, I suppose, in case anyone else is interested!)
>
> HTH
>
> Cheers,
> Norm.
>
> --
> Norman Dunbar
> Dunbar IT Consultants Ltd
>
> Registered address:
> 27a Lidget Hill
> Pudsey
> West Yorkshire
> United Kingdom
> LS28 7LG
>
> Company Number: 05132767
> --
> http://www.freelists.org/webpage/oracle-l
>
>
>

--
http://www.freelists.org/webpage/oracle-l

