RE: Database migration options
Date: Thu, 22 Mar 2018 16:34:10 +0000
Message-ID: <3E9FC3C66B6DD445A50671ECBA1F423E031FF6D18A_at_plt-exch-01.Itradenetwork.com>
Why GG? There are cheaper tools like HVR, Informatica and Attunity. If you already purchased GG and have a license, then go for it. Otherwise, I would not recommend it.
BA
From: oracle-l-bounce_at_freelists.org [mailto:oracle-l-bounce_at_freelists.org] On Behalf Of Chris Taylor
Sent: Thursday, March 22, 2018 9:07 AM
To: Chen Zhou
Cc: Vishnu; ORACLE-L
Subject: Re: Database migration options
Exactly. That gives you some lead time to get everything built and ready for the cutover weekend without taking a long downtime.
The GGate setup could be simple or complex - it all depends on your environment. GGate is not a simple tool, so if you're already familiar with using it, it shouldn't be too much of a problem. But if you're new to GGate, expect a somewhat steep learning curve.
Chris
On Thu, Mar 22, 2018 at 10:08 AM, Chen Zhou <oracle.unknowns_at_gmail.com> wrote:
Chris,
I was thinking the same thing. To do that, I would set up GoldenGate beforehand, instantiate the tables using a Data Pump export with a SELECT (QUERY) filter, then import that data into the new database and start GoldenGate to apply changes from that point on.
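Roughly the sequence I have in mind (the schema, table, and process names below are just placeholders, and I have not tested this yet):

    SQL> SELECT current_scn FROM v$database;    -- note the SCN, say 123456789

    $ expdp system@SRCDB parfile=subset.par flashback_scn=123456789
    $ impdp system@NEWDB directory=DP_DIR dumpfile=subset_%U.dmp parallel=4

    GGSCI> START REPLICAT rsub, AFTERCSN 123456789

where subset.par would hold the TABLES/QUERY/DUMPFILE settings for the rows we want to carry over.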
Is that what you are suggesting too?
Thank you,
Chen
On Thu, Mar 22, 2018 at 6:16 AM, Chris Taylor <christopherdtaylor1994_at_gmail.com> wrote:
Let me explain my point here a bit better. (I don't know the size of your data, of course.)
You can run your Data Pump export/import in parallel, and if you have GoldenGate you can use it to keep the data in sync for the tables you've identified. My whole point was about using Data Pump + GoldenGate: when you're ready to cut over to the new database, the data will already be there and doesn't have to be loaded on your go-live weekend or whatever (since you mentioned GoldenGate might be an option).
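For the tables you've identified, the GoldenGate piece would look something along these lines (all of the names below are invented for illustration, not a tested config):

    -- extract parameter file on the source (esub.prm)
    EXTRACT esub
    USERIDALIAS gg_src
    EXTTRAIL ./dirdat/es
    TABLE APP.ORDERS;
    TABLE APP.ORDER_LINES;

    -- replicat parameter file on the new database (rsub.prm)
    REPLICAT rsub
    USERIDALIAS gg_tgt
    MAP APP.ORDERS, TARGET APP.ORDERS;
    MAP APP.ORDER_LINES, TARGET APP.ORDER_LINES;

    GGSCI> ADD EXTRACT esub, INTEGRATED TRANLOG, BEGIN NOW
    GGSCI> ADD EXTTRAIL ./dirdat/es, EXTRACT esub
    GGSCI> START EXTRACT esub

Then at cutover you start the replicat from whatever SCN your export was consistent as of.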
Chris
On Wed, Mar 21, 2018 at 10:47 PM, Chen Zhou <oracle.unknowns_at_gmail.com> wrote:
Chris and Vishnu,
Thank you for your suggestions.
I think Data Pump will take hours at a minimum in our case, but it's worth a try to get the timing.
As for partitioning source data to use TTS, I will look into that and may test it in our lab.
Thank you,
Chen
On Wed, Mar 21, 2018 at 7:49 PM, Vishnu <vishnukumarmp_at_gmail.com> wrote:
Hi Chen,
You can try transportable tablespaces (TTS) for this. Partition your source data, move the newer data to a new tablespace, and transport that tablespace to the destination. This method would be much faster than a Data Pump export/import.
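A rough outline of the TTS steps, with placeholder tablespace and file names (and check self-containment with DBMS_TTS.TRANSPORT_SET_CHECK first):

    SQL> ALTER TABLESPACE recent_data READ ONLY;

    $ expdp system@SRCDB directory=DP_DIR dumpfile=tts_recent.dmp \
        transport_tablespaces=recent_data

    (copy the RECENT_DATA datafiles to the new server)

    $ impdp system@NEWDB directory=DP_DIR dumpfile=tts_recent.dmp \
        transport_datafiles='/u02/oradata/NEWDB/recent_data01.dbf'

    SQL> ALTER TABLESPACE recent_data READ WRITE;   -- on the target after the import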
Thanks,
Vishnu
On Wed, Mar 21, 2018 at 10:00 PM, Chris Taylor <christopherdtaylor1994_at_gmail.com> wrote:
Data Pump can subset data - you provide a query to Data Pump to pull out the data you want to export and import into the new database.
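Just to show the shape of it, a subsetting parfile might look like this (the table name and predicate are completely made up):

    TABLES=APP.ORDERS
    QUERY=APP.ORDERS:"WHERE order_date >= DATE '2017-01-01'"
    DIRECTORY=DP_DIR
    DUMPFILE=orders_%U.dmp
    PARALLEL=8

    $ expdp system@SRCDB parfile=orders.par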
You could at least do a POC (proof of concept) of Data Pump to see what that can get you. You can use parallel on the Data Pump job to speed things up a lot.
You could combine Data Pump subsetting and GoldenGate, I think - subset your data, then use GoldenGate from the export SCN (CSN in GoldenGate terms) onward to keep the new data up to date, if I'm not terribly mistaken.
Chris
On Wed, Mar 21, 2018 at 7:50 PM, Chen Zhou <oracle.unknowns_at_gmail.com> wrote:
Hi,
Our company needs to migrate some databases to new servers.
However, we don't want to migrate the databases in full, since they hold years of old data. Rather, we want to select some newer data from certain tables and carry only that over to the new databases.
Export and import will take too long due to the large sizes of our databases.
A physical standby won't work. Is a logical standby an option here?
I am also considering using GoldenGate.
Are there other options?
Thank you,
Chen
--
http://www.freelists.org/webpage/oracle-l
Received on Thu Mar 22 2018 - 17:34:10 CET