Re: Normalization by Composing, not just Decomposing
Date: Mon, 12 Apr 2004 16:03:24 -0400
Message-ID: <c5esln$suso$1_at_ID-114862.news.uni-berlin.de>
Haven't you ever heard of a joke (voodoo)?
"Eric Kaun" <ekaun_at_yahoo.com> wrote in message
news:Zczec.53528$aU6.24525_at_newssvr16.news.prodigy.com...
> "Alan" <alan_at_erols.com> wrote in message
> news:c54a0e$2ohurg$1_at_ID-114862.news.uni-berlin.de...
> > You are assuming that (good) normalization is a science. It is not. It is
> > part science and part art - that's where experience (as well as ESP to
> > read the users' minds and clairvoyance to predict future needs) comes
> > into play. Oh, it is also part voodoo. Sometimes waving a dead chicken
> > in a paper bag over your head produces the results you need.
>
> It might not be science, but it's at least a discipline based on logic
> (specifically functional dependencies). It's always going to require
> interpretation with respect to the domain being modeled, because we're
> trying to model part of reality, which is messy, in such a way that we
> (and computers) can extract meaningful data, which requires clarity.
>
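Eric's point about functional dependencies can be made concrete with a small sketch. This checks whether a dependency X -> Y actually holds over a set of rows; the table and column names are hypothetical, invented for illustration:

```python
def fd_holds(rows, x, y):
    """Return True if the functional dependency x -> y holds in rows."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in x)  # determinant values
        val = tuple(row[a] for a in y)  # dependent values
        if key in seen and seen[key] != val:
            return False  # same determinant, different dependent: FD violated
        seen[key] = val
    return True

rows = [
    {"emp_id": 1, "dept": "sales", "dept_city": "Boston"},
    {"emp_id": 2, "dept": "sales", "dept_city": "Boston"},
    {"emp_id": 3, "dept": "hr",    "dept_city": "Denver"},
]
print(fd_holds(rows, ["dept"], ["dept_city"]))  # True: dept determines dept_city
```

A dependency like dept -> dept_city, holding in a table keyed by emp_id, is exactly what drives a decomposition into normal form.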
> That's all a far cry from voodoo, unless you're defining voodoo as
> everything which is not science. And you might be surprised what real
> science is like...
>
> > By the way, the process of
> > putting it back together is called denormalization,
>
> Putting it back together implies that information was lost during
> normalization, which isn't the case - in fact, the normalized schema
> doesn't risk data loss (e.g. inconsistency) the way a denormalized
> schema does.
>
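The inconsistency risk Eric alludes to is the classic update anomaly, which a small sketch makes visible; the data and names here are hypothetical:

```python
# Denormalized: the department's city is repeated in every employee row.
denormalized = [
    {"emp_id": 1, "dept": "sales", "dept_city": "Boston"},
    {"emp_id": 2, "dept": "sales", "dept_city": "Boston"},
]

# Sales moves to Chicago, but only one row gets updated:
denormalized[0]["dept_city"] = "Chicago"
cities = {r["dept_city"] for r in denormalized if r["dept"] == "sales"}
print(len(cities))  # 2 - the same fact now has two conflicting values

# Normalized: the fact is stored exactly once, so it cannot diverge.
employees = [{"emp_id": 1, "dept": "sales"}, {"emp_id": 2, "dept": "sales"}]
departments = {"sales": {"dept_city": "Boston"}}
departments["sales"]["dept_city"] = "Chicago"  # one update, consistent everywhere
```

This is the sense in which the normalized schema protects against data loss: there is no second copy of the fact to fall out of sync.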
> - erk
Received on Mon Apr 12 2004 - 22:03:24 CEST