Re: Streams in 9i: controlling the moment when changes hit the target database
On 1 Mar, 18:41, "Ed Prochak" <edproc..._at_gmail.com> wrote:
> My first impression is that you are using the wrong tool for those
> customers. It looks like they might be better off with a file sent weekly
> and applied through SQL*Loader. What are the business requirements?
Hi Ed,
My prototype is really just a proof of concept, as I've never used Streams before. The actual scenario is that there are two installations of my app on the same customer site, each running on its own 9i database. Among other things, the app allows the user to maintain a library of part numbers, and the data relating to part numbers is held in 50 of the 500 tables in the schema.

Changes to part numbers are only ever made in the first database, and so far they've been periodically copied across to the second database via a part number import/export routine in the app. However, we're reaching the stage where there are too many part numbers for that routine to handle (unless we rewrite it), and I've been told to look for alternative ways to keep the second DB in step with the first. I want to use Streams to replicate those changes to the second database, where the part numbers are read-only.
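For what it's worth, the capture and apply rules I have in mind look roughly like this. The schema, table, queue and process names below are just placeholders, the propagation setup between the two queues is left out, and in practice I'd loop over the 50 part-number tables:

-- On the source (first) database: capture DML on one part-number table.
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name   => 'partsapp.part_master',       -- placeholder table
    streams_type => 'capture',
    streams_name => 'capture_parts',              -- placeholder process
    queue_name   => 'strmadmin.streams_queue',    -- placeholder queue
    include_dml  => TRUE,
    include_ddl  => FALSE);
END;
/

-- On the destination (second) database: apply those changes.
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name      => 'partsapp.part_master',
    streams_type    => 'apply',
    streams_name    => 'apply_parts',             -- placeholder process
    queue_name      => 'strmadmin.streams_queue',
    include_dml     => TRUE,
    include_ddl     => FALSE,
    source_database => 'DB1');                    -- placeholder global name
END;
/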
Instantaneous replication will be fine for some end-user sites, but I also want to be able to buffer the changes: picture a once-a-week job that pushes all the accumulated part number changes from the first DB to the second. My question arose because simply stopping and restarting the Streams apply process looks like one way to do that, along the lines of the sketch below. What do you reckon?
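Something like this is what I mean, assuming an apply process called APPLY_PARTS (again a placeholder name) already exists on the second database:

-- Hold back replication: while the apply process is stopped, the
-- captured changes just sit in the destination queue.
BEGIN
  DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'apply_parts');
END;
/

-- ...then, in the weekly job on the second database, drain everything
-- that has accumulated since the last run.
BEGIN
  DBMS_APPLY_ADM.START_APPLY(apply_name => 'apply_parts');
END;
/

As far as I understand it, the changes would simply accumulate in the queue between runs, which is the buffering behaviour I'm after, but that's exactly the bit I'd like a second opinion on.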
TIA
Tarby
Received on Fri Mar 02 2007 - 03:36:43 CST