Re: Normalization by Composing, not just Decomposing
Date: Thu, 08 Apr 2004 20:26:50 GMT
Dawn M. Wolthuis wrote:
> If I am understanding correctly, the process of normalization for any set of
> data attributes is a process of decomposing from one large set to several
> smaller ones. That makes sense when starting from scratch.
> But tests for determining whether data is normalized also seem to focus on
> whether it has been fragmented sufficiently and do not take into account
> whether the data has been TOO fragmented.
Of course they take that into account! Apart from the obvious requirement that a decomposition be information preserving, you might also have a requirement that it be dependency preserving. Beyond that, the theory doesn't tell you when you should and should not split. It only tells you what stage you are in, what the potential problems of that stage are, and how you might get rid of them. Whether you want to accept those potential problems is up to you as a database designer. But without a thorough understanding of normalization theory you cannot make a well-founded decision about this.
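As an aside: the "information preserving" requirement mentioned above can be tested mechanically. A minimal sketch in Python (my own illustration, not from the post): for a binary decomposition of R into R1 and R2, the standard test says the split is lossless exactly when the shared attributes functionally determine all of R1 or all of R2.

```python
def closure(attrs, fds):
    """Closure of an attribute set under functional dependencies.

    fds is a list of (lhs, rhs) pairs of attribute sets, read as X -> Y.
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def lossless_binary(r1, r2, fds):
    """A decomposition {r1, r2} is lossless iff the common attributes
    determine all of r1 or all of r2 (the standard binary test)."""
    common = closure(r1 & r2, fds)
    return r1 <= common or r2 <= common

# R(A, B, C) with the single FD A -> B:
fds = [({'A'}, {'B'})]
print(lossless_binary({'A', 'B'}, {'A', 'C'}, fds))  # True:  splitting on A is safe
print(lossless_binary({'A', 'B'}, {'B', 'C'}, fds))  # False: splitting on B loses information
```

The second split is exactly the kind of "too fragmented" design the original question worries about: the tables individually look harmless, but joining them back does not reproduce the original relation.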
By the way, if you think normalization for the 1NF relational model is tricky, then I seriously doubt you will master the art for the NFNF relational model, never mind a semistructured data model like XML or the Pick data model, where things get much, much more complicated.
> In other words, there ought to be some "rules" that govern when we ought not
> split out data attributes, in general, as well as when we should decompose
That's what the rules already "do" now. If you want to be in 3NF / BCNF / PJNF / DKNF or whatever, and you can join two tables without breaking the NF requirements, then there is nothing in normalization theory that tells you to keep them separate. In that respect they are much like physical laws: they don't tell you what to do, just what the consequences of your choices will be.
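To make that concrete, here is a hypothetical sketch (mine, not from the post) of the BCNF test: a schema is in BCNF when every nontrivial FD that applies to it has a superkey on the left. The FD "projection" below is deliberately simplified (it only keeps FDs whose left-hand side lies entirely inside the schema), which is enough for this example.

```python
def closure(attrs, fds):
    """Closure of an attribute set under FDs given as (lhs, rhs) set pairs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def is_bcnf(rel, fds):
    """rel is in BCNF if every nontrivial applicable FD has a superkey
    on the left.  (Simplified projection: only FDs whose left-hand side
    lies inside rel are considered.)"""
    local = [(lhs, rhs & rel) for lhs, rhs in fds if lhs <= rel and rhs & rel]
    for lhs, rhs in local:
        if rhs <= lhs:
            continue                        # trivial FD, ignore
        if not rel <= closure(lhs, local):
            return False                    # lhs is not a superkey of rel
    return True

# With eid -> name and eid -> dept, the two split tables AND the joined
# table all pass the BCNF test, so nothing in the theory forces the split.
fds = [({'eid'}, {'name'}), ({'eid'}, {'dept'})]
print(is_bcnf({'eid', 'name'}, fds))          # True
print(is_bcnf({'eid', 'dept'}, fds))          # True
print(is_bcnf({'eid', 'name', 'dept'}, fds))  # True
```

All three schemas pass, which is the point made above: once joining two tables breaks no NF requirement, keeping them separate or merged is a design choice the theory leaves to you.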
- Jan Hidders