Re: Decline of Science: Computer Science and Databases

From: Eric Junkermann <eric_at_deptj.demon.co.uk>
Date: 29 Nov 2002 02:11:20 -0800
Message-ID: <70568f73.0211290211.4e5ac1ab_at_posting.google.com>


"Carl Rosenberger" <carl_at_db4o.com> wrote in message news:<as5gf4$u0n$07$1_at_news.t-online.com>...  

> Only unthinking theoreticians produce generalisations like this one.
>
> Your generalisation certainly is false for usecases where
> - a network model is used to hold data
> - compatibility to SQL is not an issue
> - only a hierarchical access path is necessary
> - time to market is essential
> - the class model may be constantly refactored and reducing
> maintenance work to do so is a must
> - navigation solves all issues perfectly and queries aren't needed
>

So you are constantly refactoring, and you want to reduce maintenance work. But presumably your network model and your hierarchical access paths are tailored to your class model, so they will have to be refactored too. How much work is that, and how much data conversion does it entail? How much data manipulation is needed to upgrade a live data store to the next version of the application, and how thoroughly can you test that upgrade?
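To make the point concrete, here is a minimal sketch (hypothetical, not tied to db4o or any particular product) of the kind of conversion step every class refactoring forces on objects already sitting in a live store. Assume version 1 stored a person with a single "name" field and version 2 splits it into "first" and "last":

```python
# Hypothetical migration sketch: upgrading stored objects after a class
# refactoring. The record shapes and version tags are invented for
# illustration; a real store would have many more of these steps.

def migrate_v1_to_v2(record):
    """Convert one stored v1 record to the v2 shape (idempotent)."""
    if record.get("_version", 1) >= 2:
        return record  # already migrated; re-running must be harmless
    first, _, last = record["name"].partition(" ")
    return {"_version": 2, "first": first, "last": last}

# A stand-in for the live data store.
store = [
    {"_version": 1, "name": "Ada Lovelace"},
    {"_version": 1, "name": "Grace Hopper"},
]
store = [migrate_v1_to_v2(r) for r in store]
```

Every such refactoring adds another migration function that must be written, tested against old production data, and kept around for stores that skip versions.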

And, most importantly, how do you deal with a new requirement that does not fit neatly into your current access paths? Or to put it another way, how do you know that your rigidly defined navigation will solve all _future_ issues?
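A toy example of the problem (the data and requirement are invented for illustration): suppose the store is organised hierarchically as departments containing employees, with navigation tuned to "department, then its staff". A new company-wide requirement has no ready-made access path and degenerates into a full traversal:

```python
# Hypothetical hierarchy: access paths lead from a department to its
# employees. A cross-cutting question ("everyone hired before 1995,
# regardless of department") fits no existing path.

departments = {
    "accounts": [{"name": "Jones", "hired": 1990},
                 {"name": "Smith", "hired": 1998}],
    "research": [{"name": "Turing", "hired": 1993}],
}

def hired_before(year):
    # No navigation path leads here directly: walk every department.
    return [emp["name"]
            for staff in departments.values()
            for emp in staff
            if emp["hired"] < year]
```

A declarative query facility answers this in one statement and can use an index; pure navigation either scans everything or requires yet another hand-maintained access path for each new question.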

--
Eric Junkermann 
-- beware of hammer syndrome!
Received on Fri Nov 29 2002 - 11:11:20 CET
