Re: Discuss Data Design
Date: 21 Jan 1995 01:57:46 GMT
Message-ID: <3fppmq$6at_at_news.csus.edu>
<<< FLAME BAIT >>>
This is not a technical response per se, just a request for responses from the "grunge"-oriented survival programmers and database designers out there. Is there any philosophical diversity?
The esteemed writer and technical expert, Steve Edelstein (74160.645_at_CompuServe.COM) wrote (the slightly anal retentive response):
: Don't de-normalize to cut down on join loads. It makes the
: database hard to maintain/enhance. Joins are optimized by using
           ^^^^^^^^^^^^^^^^^^^^^^^^
IMHO, they are always hard anyway! It is a question of the trade-offs
between the performance gains of a CORRECTLY denormalized design
and a few more maintenance hassles.
: indexes correctly. There should never be a problem with joins if
: they're done correctly! In fact, there's really NEVER a reason to
: de-normalize. It makes your life harder in the long term (altho
: it may seem easier short term). The idea of normalized data is
: central to relational databases. It separates them from the
: hierarchical ones, and means you can use the db and its tools as
: they were intended.
BARF! Let's try to live in the real world, where typically underpowered and/or overloaded systems almost always require a CORRECTLY denormalized design. Maybe I still have an Oracle v4/v5 mentality, but EVERYBODY used to say that Oracle was an incredible system-resource pig.
Please provide objective research data showing that joining more than three large tables can be done with decent performance. (Yeah, I know Oracle7 is a lot faster.)
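For reference, Steve's claim is that joins across several tables are fine so long as the join columns are indexed. A minimal sketch of that setup (hypothetical tables, SQLite standing in for Oracle) looks like this; the point of contention is how this behaves on large tables, which a toy example cannot settle:

```python
# Hypothetical three-way join with indexed join keys, so the optimizer
# can probe indexes instead of scanning whole tables.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE emp  (id INTEGER PRIMARY KEY, dept_id INTEGER, name TEXT);
    CREATE TABLE sale (id INTEGER PRIMARY KEY, emp_id INTEGER, amount REAL);
    CREATE INDEX emp_dept_idx ON emp(dept_id);   -- index the join keys
    CREATE INDEX sale_emp_idx ON sale(emp_id);
""")
con.execute("INSERT INTO dept VALUES (1, 'Research')")
con.execute("INSERT INTO emp  VALUES (10, 1, 'Codd')")
con.execute("INSERT INTO sale VALUES (100, 10, 42.0)")

total = con.execute("""
    SELECT SUM(s.amount)
    FROM dept d
    JOIN emp  e ON e.dept_id = d.id
    JOIN sale s ON s.emp_id  = e.id
    WHERE d.name = 'Research'
""").fetchone()[0]
print(total)  # 42.0
```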
: I think its better to have one large table (with views for each
: year, for instance). This is also a matter of maintenance ease as
: well as processing efficiency. This may be a personal preference,
: and there may be arguments about it (but NOT about
: normalization!).
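The "one large table with a view per year" layout Steve describes can be sketched like so (hypothetical table and view names, SQLite again): each view is just a filtered window onto the single table, so there is no per-year table to maintain.

```python
# Hypothetical sketch: one big table, plus a read-only view per year.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE measurements (taken_on TEXT, value REAL);
    CREATE VIEW measurements_1994 AS
        SELECT * FROM measurements WHERE taken_on LIKE '1994-%';
    CREATE VIEW measurements_1995 AS
        SELECT * FROM measurements WHERE taken_on LIKE '1995-%';
""")
con.executemany("INSERT INTO measurements VALUES (?, ?)",
                [("1994-06-01", 1.0), ("1995-01-21", 2.0)])
rows = con.execute("SELECT value FROM measurements_1995").fetchall()
print(rows)  # [(2.0,)]
```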
Maybe my subversive tendencies are getting the better of me, but I think it is a disservice to present normalization theory/methods as an orthodoxy that may never be violated, even in a controlled, structured manner by "aware" developers/designers.
On a more practical note, there was a good article two or three years ago, in Database Programming and Design or one of those magazines, by a woman who used to be an Oracle employee, about a structured denormalization methodology. She said that very few real-world applications will perform well without CORRECT denormalization.
The person who started this thread says very little about the complexity, structure, or volume of the data in question, but the fact that it is STATIC makes it very different from a lot of the transaction-oriented development that the experts do. There is also no mention of software life-cycle and maintenance considerations.
Researchers or database support people in a research environment should be wary of the "normalization" orthodoxy that the OLTP and business system experts proclaim.
A method is a method is a method... it is all work.
:
: --
: Steve Edelstein VOICE: 212-956-3670
: Relational Business Systems CompuServe: 74160,645
: 124 West 60th Street Suite 47C Author of
: New York, NY 10023 "Learning Oracle Forms"
I congratulate Steve on a very well composed statement that is probably correct for many business applications, especially where the hardware is powerful.
Hope you have/had a wonderful grungy denormalized weekend!
EP