
Re: Streams advice

From: Geoff Muldoon <>
Date: Wed, 22 Aug 2007 10:18:06 +1000
Message-ID: <>

DA Morgan says...
> Geoff Muldoon wrote:
> > Both source and destination databases are on the same RAC cluster,
> You need to explain this as it makes no sense. Replicating data within
> the same database? Why?

They're on the same cluster but in separate instances, and I cannot get any assurance that the OLTP and DW instances will long-term remain on the same RAC cluster. Some of the future OLTP systems to be added for DW mining purposes are almost certain to be on different hardware, and a common methodology is desired.

I am not allowed to place any of the DW ETL processes on the OLTP instances (stupid third-party vendor support restriction), and some of the ETL processes appear to degrade seriously when run over DBLinks as opposed to within the instance.

The OLTP system is active 24x7. If I use Streams, I presume I can suspend the Apply process at the destination database (so I get a consistent snapshot) while I run the ETL processes (about 2 hours), to minimise missing dimension data references in fact tables.
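The pause-and-resume cycle I have in mind would be something like the following (the apply process name 'dw_apply' is purely illustrative):

   begin
     dbms_apply_adm.stop_apply(apply_name => 'dw_apply');
   end;
   /
   -- run the ~2 hour ETL against the now-consistent destination tables
   begin
     dbms_apply_adm.start_apply(apply_name => 'dw_apply');
   end;
   /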

> > but we
> > will probably also want to additionally replicate to our dev copy of the
> > destination database on a separate (same version and O/S) RAC cluster.
> This makes sense but since it doesn't need to be real-time why not just
> clone with RMAN?

Simply to mirror the production capture process more exactly.

> > What appears to be our likely method is archive-log downstream capture (to
> > minimise business hours load on the OLTP instances) and using the
> > PREPARE_TABLE_INSTANTIATION options (as capture is required for only about
> > half the columns of about a fifth of the tables in a single source
> > schema).
> And here comes the fun.
> You are going from a RAC cluster which means each and every node has its
> own redo log thread which means you need to have streams running on each
> node.

I am aware that I will need to copy the archive logs (with supplemental logging enabled) from all source nodes to do downstream capture.
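Roughly the shape of the setup I'm reading about, if I have it right (all names illustrative, and I'd still need the log transport and queue/apply pieces): instantiation is prepared at the source, and the capture process is created on the downstream DW database rather than the OLTP one.

   -- on the source (OLTP) database, per replicated table:
   begin
     dbms_capture_adm.prepare_table_instantiation(
       table_name => 'app.orders');
   end;
   /
   -- on the downstream (DW) database:
   begin
     dbms_capture_adm.create_capture(
       queue_name        => 'strmadmin.dw_queue',
       capture_name      => 'dw_capture',
       source_database   => 'OLTP',
       use_database_link => false);
   end;
   /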

> Unless you have some need for the complexity of
> publish-transform-subscribe I'd take a serious look at using Change
> Data Capture. It is far easier to implement.

Asynchronous Change Data Capture is our only option, and I thought it was actually just built on top of Streams anyway? Is it just that the DBMS_CDC packages are easier to implement than the DBMS_STREAMS_ADM packages, or are there more significant differences? Looks like I have much more reading to do.
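From the reading so far, the publish side of asynchronous CDC seems to collapse the queue/capture plumbing into a couple of calls, something like this (names purely illustrative, so I may well be off on the details):

   begin
     dbms_cdc_publish.create_change_set(
       change_set_name    => 'dw_change_set',
       change_source_name => 'HOTLOG_SOURCE',
       stop_on_ddl        => 'y');
     dbms_cdc_publish.create_change_table(
       owner             => 'cdcpub',
       change_table_name => 'orders_ct',
       change_set_name   => 'dw_change_set',
       source_schema     => 'app',
       source_table      => 'orders',
       column_type_list  => 'order_id number, status varchar2(20)',
       capture_values    => 'both',
       rs_id => 'y', row_id => 'n', user_id => 'n',
       timestamp => 'n', object_id => 'n',
       source_colmap => 'n', target_colmap => 'y',
       options_string => null);
   end;
   /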

> Also make sure that you
> do the following:
> alter database force logging;
> alter database add supplemental log data;

Thanks, already onto that.
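For the subset of tables actually being captured I'm also adding key-level supplemental logging per table, e.g.:

   alter table app.orders
     add supplemental log data (primary key) columns;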

GM

Received on Tue Aug 21 2007 - 19:18:06 CDT
