Re: Database migration options

From: Chen Zhou <>
Date: Wed, 21 Mar 2018 20:47:23 -0700
Message-ID: <>

Chris and Vishnu,
Thank you for your suggestions.
I think Data Pump will take at least several hours in our case, but it is worth a try to get the timing.
As for partitioning source data to use TTS, I will look into that and may test it in our lab.
Thank you,

On Wed, Mar 21, 2018 at 7:49 PM, Vishnu <> wrote:

> Hi Chen, You can try transportable tablespaces (TTS) for this. Partition
> your source data, move the newer data to a new tablespace, and transfer
> that tablespace to the destination. This method would be much faster than
> a Data Pump export/import.
> Thanks,
> Vishnu
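
The TTS flow Vishnu describes might look like the sketch below. Every name here (tablespace TS_NEW, directory DPDIR, file paths) is an illustrative assumption, not something from the thread:

```shell
# Sketch of a transportable-tablespace move; run against the source database.
# All object names and paths are assumptions.

# 1. Make the tablespace holding the newer partitions read only:
#      SQL> ALTER TABLESPACE ts_new READ ONLY;
# 2. Export only the tablespace metadata with Data Pump:
#      expdp system DIRECTORY=dpdir DUMPFILE=ts_new.dmp \
#            TRANSPORT_TABLESPACES=ts_new
# 3. Copy the ts_new datafiles plus the dump file to the new server.
# 4. Plug the tablespace into the destination database:
#      impdp system DIRECTORY=dpdir DUMPFILE=ts_new.dmp \
#            TRANSPORT_DATAFILES='/u02/oradata/ts_new01.dbf'
# 5. Make the tablespace read write again on both sides as needed:
#      SQL> ALTER TABLESPACE ts_new READ WRITE;
```

Because only metadata goes through Data Pump and the datafiles are copied as-is, the time is dominated by the file copy rather than row-by-row export/import.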
> On Wed, Mar 21, 2018 at 10:00 PM, Chris Taylor <> wrote:
>> Data Pump can subset data - you provide a query to Data Pump to pull out
>> the data you want to export and import into the new database.
>> You could at least do a POC (proof of concept) with Data Pump to see
>> what it can get you. You can use PARALLEL on the Data Pump job to speed
>> things up a lot.
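
A subsetted, parallel export along the lines Chris describes could be sketched like this. The schema, table, directory, and date predicate are all illustrative assumptions; putting the options in a parameter file avoids shell-quoting problems with QUERY:

```shell
# Write a Data Pump parameter file; every name (app.orders, dpdir) and the
# date predicate below is an assumption for illustration only.
cat > orders.par <<'EOF'
DIRECTORY=dpdir
DUMPFILE=orders_%U.dmp
LOGFILE=orders_exp.log
TABLES=app.orders
QUERY=app.orders:"WHERE order_date >= DATE '2017-01-01'"
PARALLEL=4
EOF

# On the source server (requires an Oracle instance and client):
#   expdp system PARFILE=orders.par
# Then copy the dump files to the new server and load them with impdp.
```

With PARALLEL=4 and the %U substitution in DUMPFILE, Data Pump writes multiple dump files concurrently, which is where most of the speedup comes from.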
>> You could combine Data Pump subsetting and GoldenGate, I think - subset
>> your data, then use GoldenGate to sync from the export SCN (CSN in
>> GoldenGate terms) to keep the new data up to date, if I'm not terribly
>> mistaken.
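
The Data Pump plus GoldenGate combination Chris suggests might be sketched as follows; the SCN value, table, and Replicat group name are illustrative assumptions:

```shell
# Sketch: take a consistent subset export as of a known SCN, then have
# GoldenGate replay only changes made after that SCN. Names and the SCN
# value are assumptions.

# 1. Note a consistent point on the source:
#      SQL> SELECT current_scn FROM v$database;
# 2. Export the subset consistent as of that SCN:
#      expdp system DIRECTORY=dpdir DUMPFILE=orders_%U.dmp \
#            TABLES=app.orders FLASHBACK_SCN=123456789 PARALLEL=4
# 3. Import on the new server, then start the Replicat from that SCN (CSN),
#    so no change is lost or applied twice:
#      GGSCI> START REPLICAT rorders, AFTERCSN 123456789
```

FLASHBACK_SCN makes the dump consistent as of the noted SCN, and AFTERCSN tells the Replicat to begin applying transactions committed after that same point.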
>> Chris
>> On Wed, Mar 21, 2018 at 7:50 PM, Chen Zhou <>
>> wrote:
>>> Hi,
>>> Our company needs to migrate some databases to new servers.
>>> However, we don't want to migrate the databases in full, since they
>>> contain years of old data. Rather, we want to select newer data from
>>> some tables and carry only that over to the new databases.
>>> Export and import will take too long due to the large sizes of our
>>> databases.
>>> Physical Standby won't work. Is logical standby an option here?
>>> I am also considering using GoldenGate.
>>> Are there other options?
>>> Thank you,
>>> Chen

Received on Thu Mar 22 2018 - 04:47:23 CET