Re: Golden Gate replication for a huge table
Date: Tue, 18 Oct 2016 20:35:16 -0400
Message-Id: <E6CF0187-A536-45A3-A276-9A1131635FD3_at_gmail.com>
GG can handle the workload you are expecting.
I recommend monitoring performance and making adjustments as needed. Is this Oracle-to-Oracle or a heterogeneous setup? Classic or integrated capture?
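For the monitoring piece, lag and throughput can be checked from GGSCI once the processes are running. A minimal sketch, assuming hypothetical group names EXT1 (source extract) and REP1 (target replicat):

```
GGSCI> INFO ALL                            -- status and lag for all manager/extract/replicat processes
GGSCI> LAG EXTRACT EXT1                    -- capture lag for the extract group
GGSCI> LAG REPLICAT REP1                   -- apply lag on the target
GGSCI> STATS EXTRACT EXT1, TOTALSONLY *.*  -- cumulative insert/update/delete counts per table
```

Watching LAG and STATS during a normal day and again during year-end peaks is usually enough to tell whether the default settings need tuning.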
Bobby Curtis
http://about.me/dbasolved
Sent from my iPhone
> On Oct 18, 2016, at 6:32 PM, Kumar Madduri <ksmadduri_at_gmail.com> wrote:
>
> Hi
> We are in the initial phases of evaluating Golden Gate. One of our developers has a concern.
> She has a table with 40 million rows that would be replicated to the target. After the initial load, the daily data-volume changes would be in the thousands (not millions of records, but thousands, or tens of thousands during year-end processing, for example).
> Her concern is whether Golden Gate replication will be able to handle this.
> From my understanding, this should not be a problem.
> I wanted to get input from folks who have done this already (since my experience at this point is limited to small tables and reading the documentation).
>
> Thanks for your time.
>
> Kumar
-- http://www.freelists.org/webpage/oracle-l
Received on Wed Oct 19 2016 - 02:35:16 CEST