Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 

Home -> Community -> Usenet -> c.d.o.server -> Re: Best Approach - remote system

Re: Best Approach - remote system

From: MPW <azr8rsfan_at_coxNOT.net>
Date: Tue, 07 Jan 2003 13:47:28 GMT
Message-ID: <QTAS9.5$hG5.10088@news1.west.cox.net>


January 7, 2003

John,

I've had success with remote processing from the home office by creating outbound data files/inbound data loads and letting the archive logging process cover any issues that occur between receipt of one scheduled data file and the next. This process also lends itself to constant data integrity checking, and if you choose to flag the (home) core database with a remote identifier, you can track 1:M remote sites easily.
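To make the remote-identifier idea concrete, here's a minimal sketch (the table/column names like SITE_ID and the site code are my own illustrative examples, not from anyone's actual schema) of tagging each outbound row with its site of origin so the home-office core database can distinguish 1:M remote sites:

```python
# Tag each outbound record with the remote site that produced it, so the
# home-office core database can tell 1:M remote sites apart on load.
# Column name SITE_ID and site code "DENVER01" are hypothetical examples.

def tag_rows(rows, site_id):
    """Return a copy of each row dict with a SITE_ID column added."""
    return [dict(row, SITE_ID=site_id) for row in rows]

outbound = [{"ORDER_NO": 1001, "AMOUNT": 250.0},
            {"ORDER_NO": 1002, "AMOUNT": 75.5}]
tagged = tag_rows(outbound, "DENVER01")
print(tagged[0]["SITE_ID"])  # every loaded row now carries its origin site
```

In practice the same effect is usually achieved with an extra column populated at load time, but the principle is the same: stamp the data with its source before it lands in the core database.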

Generally speaking here is the scenario:

  1. Analyze the data/data architecture needs at the remote location (hardware needed, Oracle tables/indexes/environment, throughput needed, applications used, reporting needs)
  2. Create the outbound data file based on business frequency needs (home office makes it available via a secure FTP connection)
  3. Remote site codes for FTP polling of file availability based on the SLA (service level agreement)
  4. Remote site loads the inbound file for processing before the business period begins
  5. Any processing issues at the remote site require re-importing the data file plus the archive logs from transaction processing (a fairly easy, non-time-consuming activity, and it occurs rarely)
  6. Based on timing and handshake arrangements with the home office (frequency, volume, batch scheduling....), the remote site returns a data file for processing into the core database (should be some % of core data)
  7. Home office batch processes the remote data - return to step 2 for the next period's processing
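The poll-then-load loop in steps 3-4 can be sketched like this (a minimal illustration; the file-naming pattern, timings, and SLA window are my own assumptions, not from the process above):

```python
import time

# Sketch of the remote site's SLA-driven polling (steps 3-4 above).
# The file name pattern and the timeout/interval values are hypothetical.

def poll_for_file(list_files, expected_name, timeout_s=3600, interval_s=60,
                  sleep=time.sleep):
    """Poll the home office's FTP drop directory until expected_name
    appears, or give up when the SLA window (timeout_s) expires."""
    waited = 0
    while waited <= timeout_s:
        if expected_name in list_files():
            return True          # file is ready: download and load it
        sleep(interval_s)
        waited += interval_s
    return False                 # SLA miss: escalate to the home office

# Example with a fake directory listing (no real FTP connection needed):
listings = iter([[], [], ["orders_20030107.dat"]])
found = poll_for_file(lambda: next(listings), "orders_20030107.dat",
                      timeout_s=180, interval_s=60, sleep=lambda s: None)
print(found)  # True
```

A real implementation would pass in something like `ftplib.FTP.nlst` as the `list_files` callable; keeping the listing function as a parameter makes the SLA logic testable without a live connection.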

I know it sounds simple....and it does put the onus on the DBA/developers. But I've found that by the time you construct the overhead of real-time networks, plus their coordination/maintenance and all of the other nonsense....it was easier to produce a solution I could control, at a MAJOR cost savings. To date I've not found any downside to producing quality distributed data this way, except of course for the perceived lag between data file exchanges (my process was daily) relative to what the SLA promises.

I've used a similar process with version 8.1.7 on NT/W2K with a 20 gig daily file, with good results (data load time in minutes). Once the process is set up and humming along, it becomes just another scheduled activity in the mix.
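For scale, here's a rough back-of-the-envelope on what that 20 gig daily file implies for sustained load throughput (the 15-minute window below is my own assumed example; the figure above only says "minutes"):

```python
# Throughput needed to load a 20 GB file in a given window.
# The 15-minute window is an assumed example for illustration.

size_gb = 20
window_min = 15
mb_per_s = size_gb * 1024 / (window_min * 60)
print(round(mb_per_s, 1))  # ~22.8 MB/s sustained load rate
```

Numbers like this feed back into step 1 of the scenario: the hardware, network, and load-tool choices at the remote site have to support whatever rate the business window demands.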

Good luck! --- comments welcome
Michael

P.S. I'm available for Oracle-based consulting work at your site for a reasonable cost. I keep things simple.....I communicate easily understood processes and produce quality project plans/follow-through with downstream referenceable documentation specific to your installation. Please contact me at mailto:mwarren8_at_yahoo.com


> Hey Gang,
>
> Wondering if anyone has comments on the following approach:
>
> Scenario:
> We have an office 2000 miles away.
> We would like to share data (real time) with them.
>
> We will install a high speed internet line at the remote location.
> We will use VPN and use border manager software at the remote location to
> connect back here.
> We will create a client server application that resides at the remote
> location.
> We will setup COM+ here on a Win2000 server for our middle tier and
> database connection.
> We will have the database on a server here.
>
> Anyone care to comment?
> Thanks,
> -John
>
>
Received on Tue Jan 07 2003 - 07:47:28 CST

