Re: Golden Gate replication for a huge table

From: Kumar Madduri <ksmadduri_at_gmail.com>
Date: Tue, 18 Oct 2016 17:49:14 -0700
Message-Id: <93E841C8-BD15-4D3C-9C37-71AF3C74DFB1_at_gmail.com>


It is Oracle-to-Oracle, and I am going to set up an integrated extract and integrated replicat.
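
For anyone following along, a minimal integrated extract/replicat pair for an Oracle-to-Oracle setup might look like the sketch below. The process names, credential aliases, trail path, and table name (EXT1, REP1, gg_src, gg_tgt, ./dirdat/lt, APPS.BIG_TABLE) are illustrative placeholders, not details from this thread.

```
-- Source side: register and add the integrated extract (GGSCI)
-- GGSCI> REGISTER EXTRACT EXT1 DATABASE
-- GGSCI> ADD EXTRACT EXT1, INTEGRATED TRANLOG, BEGIN NOW
-- GGSCI> ADD EXTTRAIL ./dirdat/lt, EXTRACT EXT1

-- EXT1.prm (extract parameter file)
EXTRACT EXT1
USERIDALIAS gg_src
EXTTRAIL ./dirdat/lt
TABLE APPS.BIG_TABLE;

-- Target side: add the integrated replicat (GGSCI)
-- GGSCI> ADD REPLICAT REP1, INTEGRATED, EXTTRAIL ./dirdat/lt

-- REP1.prm (replicat parameter file)
REPLICAT REP1
USERIDALIAS gg_tgt
MAP APPS.BIG_TABLE, TARGET APPS.BIG_TABLE;
```

With change volumes in the thousands to tens of thousands of rows per day, a single extract/replicat pair like this is typically sufficient; monitor lag and apply rate before considering parallelism options.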

Sent from my iPhone

> On Oct 18, 2016, at 5:35 PM, Bobby Curtis <curtisbl_at_gmail.com> wrote:
> 
> GG can handle the workload you are expecting. 
> 
> Recommend monitoring the performance and making adjustments as needed. Is this Oracle-to-Oracle or a heterogeneous setup? Classic or integrated configuration?
> 
> 
> Bobby Curtis
> http://about.me/dbasolved
> Sent from my iPhone
> 

>> On Oct 18, 2016, at 6:32 PM, Kumar Madduri <ksmadduri_at_gmail.com> wrote:
>>
>> Hi
>> We are in the initial phases of evaluating Golden Gate. One of our developers has a concern.
>> She has a table with 40 million rows that would be replicated to the target. After the initial load, the daily data volume changes would be in the thousands (not millions of records, but thousands, or tens of thousands during year-end processing, for example).
>> Her concern is whether GoldenGate replication will be able to handle this.
>> From my understanding, this should not be a problem.
>> I wanted to get input from folks who have done this already, since my experience at this point is limited to working with small tables and reading documentation.
>>
>> Thanks for your time.
>>
>> Kumar
--
http://www.freelists.org/webpage/oracle-l
Received on Wed Oct 19 2016 - 02:49:14 CEST