Re: Golden Gate replication for a huge table

From: Bobby Curtis <curtisbl_at_gmail.com>
Date: Tue, 18 Oct 2016 20:57:07 -0400
Message-Id: <162003C0-94D0-42C9-BF73-97E1266FB7FF_at_gmail.com>


That is great. Integrated is the way to go. Simplifies the process and places all the monitoring into the database.
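
A minimal sketch of what that in-database monitoring looks like, assuming an integrated extract and replicat on 12c (the views below are the standard integrated capture/apply ones; the process names will match whatever you register):

    -- Integrated extract (capture): state and messages captured so far
    SELECT capture_name, state, total_messages_captured
    FROM   v$goldengate_capture;

    -- Integrated replicat (apply): status on the target
    SELECT apply_name, status
    FROM   dba_apply;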

You shouldn't have any problem with the projected load you are looking at.
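
For reference, the integrated setup Kumar describes below comes down to a handful of GGSCI steps. A minimal sketch, assuming a credential-store alias ggadmin and placeholder names (EXT1, REP1, hr.big_table); the pump and the trail routing between hosts are omitted:

    -- Source: register the extract with the database, then add it
    GGSCI> DBLOGIN USERIDALIAS ggadmin
    GGSCI> REGISTER EXTRACT EXT1 DATABASE
    GGSCI> ADD EXTRACT EXT1, INTEGRATED TRANLOG, BEGIN NOW
    GGSCI> ADD EXTTRAIL ./dirdat/lt, EXTRACT EXT1

    -- dirprm/ext1.prm
    EXTRACT EXT1
    USERIDALIAS ggadmin
    EXTTRAIL ./dirdat/lt
    TABLE hr.big_table;

    -- Target: add an integrated replicat reading the delivered trail
    GGSCI> DBLOGIN USERIDALIAS ggadmin
    GGSCI> ADD REPLICAT REP1, INTEGRATED, EXTTRAIL ./dirdat/rt

    -- dirprm/rep1.prm
    REPLICAT REP1
    USERIDALIAS ggadmin
    MAP hr.big_table, TARGET hr.big_table;

Once it is running, INFO EXTRACT EXT1 and STATS REPLICAT REP1 from GGSCI give per-process lag and throughput.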

Bobby Curtis
http://about.me/dbasolved
Sent from my iPhone

> On Oct 18, 2016, at 8:49 PM, Kumar Madduri <ksmadduri_at_gmail.com> wrote:
> 
> It is Oracle-to-Oracle, and I am going to set up an integrated extract and integrated replicat.
> 
> Sent from my iPhone
> 
>> On Oct 18, 2016, at 5:35 PM, Bobby Curtis <curtisbl_at_gmail.com> wrote:
>> 
>> GG can handle the workload you are expecting. 
>> 
>> I recommend monitoring the performance and making adjustments as needed. Is this Oracle-to-Oracle or a heterogeneous setup? Classic config or integrated?
>> 
>> 
>> Bobby Curtis
>> http://about.me/dbasolved
>> Sent from my iPhone
>>
>>> On Oct 18, 2016, at 6:32 PM, Kumar Madduri <ksmadduri_at_gmail.com> wrote:
>>>
>>> Hi
>>> We are in the initial phases of evaluating GoldenGate. One of our developers has a concern.
>>> She has a table with 40 million rows that would be replicated to the target. After the initial load, the daily change volume would be in the thousands of records (not millions; perhaps tens of thousands during year-end processing, for example).
>>> Her concern is whether GoldenGate replication will be able to handle this.
>>> From my understanding, this should not be a problem.
>>> I wanted to get input from folks who have done this already (since my experience at this point is limited to working with small tables and reading the documentation).
>>>
>>> Thanks for your time.
>>>
>>> Kumar
--
http://www.freelists.org/webpage/oracle-l
Received on Wed Oct 19 2016 - 02:57:07 CEST
