Re: Which Database?

From: Christopher Browne <cbbrowne_at_news.hex.net>
Date: 1998/08/27
Message-ID: <6s2ga9$qqr$20_at_blue.hex.net>


On 26 Aug 98 18:59:01 GMT, Tim Albion <tima_at_glentel.com> wrote:
>Jerry Gitomer <jgitomer_at_ictgroup.com> wrote in article
><6s0vbq$qul$1_at_client3.news.psi.net>...
>> Let me confuse the issue by suggesting that you have your stores access
>> one central database using a web server application.
>
>I must admit I have considered the idea, and then chucked it into the
>"Too much work" basket. Now that you've brought it up again I guess it
>does have a number of very nice features (how to stop staff browsing
>the Net all day might be a problem!).

Actually, that's less of a problem than you'd think.

Nothing says there actually has to be any connection to the "public Internet" anywhere in the process.

You're liable to have the following servers:

  a) Store-based Presentation Servers - e.g. the POS boxes that the staff use in the store. These run the web browser. They are connected, via Ethernet, to the "Store Firewall," and have been directed to use that server as a proxy server.
  b) Store-based Firewall Server. This has an Ethernet connection to the other machines in the store, and some form of connection to get to the central office.

The preferable communications method would involve some form of fixed link. If not much data needs to be transferred, this might only need to be a telephone link to the central office, or perhaps something faster such as ISDN or Frame Relay.

These servers would each have a proxy server running that might optionally filter out unwanted traffic; they would then forward all HTTP requests to the Central Office.
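For example, the Squid config on each store "firewall" might look something like this (the hostname, addresses, and cache sizes are invented for illustration, and the exact directive names vary between Squid releases, so check the documentation for the version you actually run):

    # Listen for the store's browsers on the usual proxy port
    http_port 3128
    # Keep a modest local cache on the store box
    cache_dir ufs /var/spool/squid 100 16 256
    # Forward everything to the central office proxy
    # (3128 = HTTP port, 3130 = ICP port, "default" = use it for everything)
    cache_peer proxy.central.example.com parent 3128 3130 default
    # Never fetch directly from the Internet; always go through the parent
    never_direct allow all
    # Only accept requests from the store's own LAN
    acl store_lan src 192.168.1.0/255.255.255.0
    http_access allow store_lan
    http_access deny all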

c) Central Office "Firewall" Server.

This server would handle requests from the store-based "firewalls," log the requests, and perhaps pass permissible ones on to the Internet at large. Note that this means that any Internet use would be tracked in *one place.*

Note also that by using a caching proxy server such as Squid at both levels, repeated requests for the same data need not be downloaded again and again.
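At the central office, the interesting bits are the single access log and a larger cache, since every store's cache misses end up there. Roughly (again, the paths and sizes are made up, and the log directive has been renamed across Squid versions):

    # One access log for the whole company's web traffic
    cache_access_log /var/log/squid/access.log
    # A bigger disk cache: 1000 MB, 16 first-level / 256 second-level directories
    cache_dir ufs /var/spool/squid 1000 16 256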

One might set it up to:

  - Always deny requests to known-to-be-undesirable URLs; www.playboy.com, www.hustler.com, and the like would all make decent candidates for this.

  - Deny requests to URLs outside the corporate network during regular working hours, and permit them after hours (see the Squid sketch after this list).
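In Squid's ACL language those two rules might come out something like this (the domain names and hours are obviously just placeholders; http_access rules are checked in order and the first match wins):

    # Flatly refuse the known-undesirable sites
    acl smut dstdomain .playboy.com .hustler.com
    http_access deny smut
    # Define "the corporate web servers" and "working hours"
    acl corpnet dstdomain .yourcompany.example.com
    acl workhours time MTWHF 09:00-17:00
    # During working hours, only corporate URLs get through;
    # outside those hours, anything not on the list above is allowed
    http_access deny !corpnet workhours
    http_access allow all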

Requests for the "web application" would be passed on to:

d) Central Office Web Application Server.

This might merely be a front end, with further servers (TP monitors, RDBMS servers, application servers, ...) behind it to assist with the work.

-- 
"Surely if the world can't get any other benefit from the existence of
Microsoft, at least people should stop arguing that popularity has any
connection with merit!" -- Brian Harvey <bh_at_anarres.CS.Berkeley.EDU>
cbbrowne_at_hex.net- <http://www.hex.net/~cbbrowne/lsf.html>
