RE: Database migration options

From: Bheemsen Aitha <>
Date: Thu, 22 Mar 2018 16:34:10 +0000
Message-ID: <>

Why GG? There are cheaper tools like HVR, Informatica and Attunity. If you already purchased GG and have a license, then go for it. Otherwise, I would not recommend it.


From: [] On Behalf Of Chris Taylor
Sent: Thursday, March 22, 2018 9:07 AM
To: Chen Zhou
Cc: Vishnu; ORACLE-L
Subject: Re: Database migration options

Exactly. That gives you some lead time to get everything built and ready for cutover weekend without taking a long down time.

The GGate setup could be simple or complex - it all depends on your environment. GGate is not a simple tool, so if you're already familiar with using it, then it shouldn't be too much of a problem. But if you're new to GGate, then expect a somewhat steep learning curve.


On Thu, Mar 22, 2018 at 10:08 AM, Chen Zhou <<>> wrote:
Chris,
I was thinking the same thing. To do that, I would set up GoldenGate beforehand and instantiate the tables using a Datapump export with a SELECT filter. Then import the data into the new database, and start GoldenGate to apply changes from that point on. Is that what you are suggesting too?
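For what it's worth, that sequence can be sketched roughly as below. The Data Pump parameters (QUERY, FLASHBACK_SCN) and the GGSCI commands are standard, but the group names, table, and predicate here are made-up placeholders:

```
-- 1. Start the GoldenGate Extract so changes are captured during the export.
GGSCI> START EXTRACT ext1                  -- 'ext1' is a hypothetical group name

-- 2. Note a consistent SCN, then export the subset as of that SCN.
SQL> SELECT current_scn FROM v$database;

$ expdp system TABLES=app.orders FLASHBACK_SCN=<scn_from_above> \
       QUERY=app.orders:\"WHERE order_date \> DATE \'2016-01-01\'\" \
       DUMPFILE=orders.dmp

-- 3. Import into the new database.
$ impdp system DUMPFILE=orders.dmp

-- 4. Start the Replicat so it applies only changes made after the export SCN.
GGSCI> START REPLICAT rep1, AFTERCSN <scn_from_above>
```

Quoting the QUERY clause on the command line is shell-dependent; a parameter file avoids most of the escaping headaches.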
Thank you,

On Thu, Mar 22, 2018 at 6:16 AM, Chris Taylor <<>> wrote:
Let me explain my point here a bit better. (I don't know the size of your data, of course.)

You can do your datapump using parallel, and if you have GoldenGate, you use GoldenGate to keep the new data in sync for the tables you've identified. My whole point was about using datapump + GoldenGate. Then when you're ready to cut over to the new database, the data will already be there and doesn't have to be moved on your go-live weekend or whatever. (Since you mentioned that GoldenGate might be an option.)


On Wed, Mar 21, 2018 at 10:47 PM, Chen Zhou <<>> wrote:
Chris and Vishnu,
Thank you for your suggestions.
I think Datapump will take at least hours in our case, but it's worth a try to get the timing. As for partitioning the source data to use TTS, I will look into that and may test it in our lab.
Thank you,

On Wed, Mar 21, 2018 at 7:49 PM, Vishnu <<>> wrote:
Hi Chen,
You can try transportable tablespaces (TTS) for this. Partition your source data, move the newer data to a new tablespace, and transfer it to the destination. This method would be much faster than a datapump export/import.
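A rough outline of the TTS flow Vishnu describes, as a sketch - the tablespace name and datafile path are made up, and the datafile copy step between the two sites is omitted:

```
-- On the source: after moving the newer data into its own tablespace,
-- make it read-only and export just the metadata.
SQL> ALTER TABLESPACE recent_data READ ONLY;

$ expdp system TRANSPORT_TABLESPACES=recent_data DUMPFILE=tts_meta.dmp

-- Copy the tablespace datafiles to the destination server, then plug them in.
$ impdp system DUMPFILE=tts_meta.dmp \
       TRANSPORT_DATAFILES='/u01/oradata/recent_data01.dbf'

-- Optionally return the source tablespace to read/write afterwards.
SQL> ALTER TABLESPACE recent_data READ WRITE;
```

The big win is that the datafiles are copied as-is rather than having every row unloaded and reloaded, which is why TTS tends to beat a conventional export/import for large volumes.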


On Wed, Mar 21, 2018 at 10:00 PM, Chris Taylor <<>> wrote:
Datapump can subset data - you provide a query to DataPump to pull out the data you want to export and import into the new database.

You could at least do a POC (Proof of Concept) of datapump to see what that can get you. You can use parallel on the datapump to speed things up a lot.
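A minimal Data Pump parameter file for such a POC might look like the following sketch - DIRECTORY, TABLES, and the WHERE predicate are placeholders to adapt; the %U wildcard lets the parallel workers each write their own dump file:

```
# Run as: expdp system parfile=subset.par
DIRECTORY=dpump_dir
DUMPFILE=subset_%U.dmp
LOGFILE=subset_poc.log
TABLES=app.orders
QUERY=app.orders:"WHERE order_date > DATE '2016-01-01'"
PARALLEL=4
```

Timing a run like this against a representative table gives a concrete number for whether the full subset export fits inside the available window.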

You could combine Datapump subsetting and GoldenGate, I think - subset your data and use GoldenGate to sync up from the export SCN (CSN in GoldenGate terms) to keep the new data up to date, if I'm not terribly mistaken.
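For an Oracle-to-Oracle setup, the GoldenGate CSN is simply the database SCN, so the handoff is just a matter of using the same number in both places (a sketch; the Replicat group name is made up):

```
SQL> SELECT current_scn FROM v$database;   -- note the SCN before exporting

$ expdp ... FLASHBACK_SCN=1234567 ...      -- export consistent as of that SCN

GGSCI> START REPLICAT rep1, AFTERCSN 1234567   -- apply only later changes
```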


On Wed, Mar 21, 2018 at 7:50 PM, Chen Zhou <<>> wrote:
Hi,
Our company needs to migrate some databases to new servers. However, we don't want to migrate the whole databases, which have years of old data. Rather, we want to select some newer data from some tables and carry only that over to the new databases. Export and import will take too long due to the large size of our databases. Physical standby won't work. Is logical standby an option here? I am also considering using GoldenGate. Are there other options?
Thank you,

Received on Thu Mar 22 2018 - 17:34:10 CET
