Re: Database migration options

From: Chen Zhou <>
Date: Thu, 22 Mar 2018 08:08:22 -0700
Message-ID: <>

I was thinking the same thing. To do that, I would set up GoldenGate beforehand, instantiate the tables using a Data Pump export with a SELECT (QUERY) filter, import that data into the new database, and then start GoldenGate to apply changes from that point on.
Is that what you are suggesting too?
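For reference, here is roughly what I have in mind for the instantiation step (schema, table, and directory names are placeholders, and the SCN is illustrative - it would be captured on the source right before the export):

```
# capture the instantiation SCN on the source first:
#   SQL> SELECT current_scn FROM v$database;

# subset.par - Data Pump parameter file (run as: expdp system PARFILE=subset.par)
DIRECTORY=dp_dir
DUMPFILE=subset_%U.dmp
PARALLEL=4
TABLES=app.orders,app.order_lines
QUERY=app.orders:"WHERE order_date >= DATE '2017-01-01'"
FLASHBACK_SCN=1234567
```

FLASHBACK_SCN makes the export consistent as of that SCN, so GoldenGate can pick up replication from the same point.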
Thank you,

On Thu, Mar 22, 2018 at 6:16 AM, Chris Taylor <> wrote:

> Let me explain my point here a bit better. (I don't know the size of your
> data of course)
> You can run your Data Pump job in parallel, and if you have GoldenGate you
> can use it to keep the new data in sync for the tables you've identified.
> My whole point was about combining Data Pump with GoldenGate: when you're
> ready to cut over to the new database, the data will already be there and
> won't have to be loaded on your go-live weekend or whatever. (since you
> mentioned that GoldenGate might be an option).
> Chris
> On Wed, Mar 21, 2018 at 10:47 PM, Chen Zhou <> wrote:
>> Chris and Vishnu,
>> Thank you for your suggestions.
>> I think Data Pump will take at least hours in our case, but it's worth a
>> try to get the timing.
>> As for partitioning source data to use TTS, I will look into that and may
>> test it in our lab.
>> Thank you,
>> Chen
>> On Wed, Mar 21, 2018 at 7:49 PM, Vishnu <> wrote:
>>> Hi Chen, you can try transportable tablespaces (TTS) for this: partition
>>> your source data, move the newer data into a new tablespace, and transfer
>>> that tablespace to the destination. This method would be much faster than a
>>> Data Pump export/import.
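>>> In rough outline (tablespace and file names are placeholders):
>>>
>>> ```
>>> -- source: make the tablespace holding the newer data read only
>>> ALTER TABLESPACE recent_data READ ONLY;
>>>
>>> -- source: export just the metadata for that tablespace
>>> --   expdp system DIRECTORY=dp_dir DUMPFILE=tts.dmp TRANSPORT_TABLESPACES=recent_data
>>>
>>> -- copy the datafiles to the destination server, then plug them in:
>>> --   impdp system DIRECTORY=dp_dir DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/u02/oradata/recent_data01.dbf'
>>>
>>> -- destination: open the tablespace for writes
>>> ALTER TABLESPACE recent_data READ WRITE;
>>> ```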
>>> Thanks,
>>> Vishnu
>>> On Wed, Mar 21, 2018 at 10:00 PM, Chris Taylor <> wrote:
>>>> Data Pump can subset data - you provide a query to Data Pump to pull out
>>>> the data you want to export and import into the new database.
>>>> You could at least do a POC (proof of concept) with Data Pump to see what
>>>> that can get you. You can run the Data Pump job in parallel to speed
>>>> things up a lot.
>>>> You could combine Data Pump subsetting and GoldenGate, I think - subset
>>>> your data and use GoldenGate to sync up from the SCN (CSN in GoldenGate)
>>>> to keep the new data up to date, if I'm not terribly mistaken.
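>>>> If I remember the syntax right, the sync-up step would look something
>>>> like this (the Replicat group name and CSN are placeholders - the CSN is
>>>> the FLASHBACK_SCN the export ran with):
>>>>
>>>> ```
>>>> -- in GGSCI on the target, after the import completes:
>>>> START REPLICAT rep1, AFTERCSN 1234567
>>>> ```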
>>>> Chris
>>>> On Wed, Mar 21, 2018 at 7:50 PM, Chen Zhou <> wrote:
>>>>> Hi,
>>>>> Our company needs to migrate some databases to new servers.
>>>>> However, we don't want to migrate the whole databases, which contain
>>>>> years of old data. Rather, we want to select the newer data from some
>>>>> tables and carry only that over to the new databases.
>>>>> A full export and import would take too long given the large sizes of
>>>>> our databases.
>>>>> Physical Standby won't work. Is logical standby an option here?
>>>>> I am also considering using GoldenGate.
>>>>> Are there other options?
>>>>> Thank you,
>>>>> Chen

Received on Thu Mar 22 2018 - 16:08:22 CET
