Normalization Theory
Date: Mon, 2 Feb 2015 10:18:07 -0800 (PST)
Message-ID: <17212e15-f678-4fac-b862-b7caec276b75_at_googlegroups.com>
There are two sides to this issue. Is database normalization theory polished enough to present to an average undergraduate? My answer is no. But is normalization a gem of database theory? Yes. Has it progressed since the 1970s? Let's postpone that answer for a moment.
The fact that you are so upset by the normal form definitions indicates that you pay no attention to what happens underneath. I'm a little uncomfortable with those voluntaristic definitions as well, but I happily ignore them because something more fundamental is going on. In case you are wondering how normalization theory has progressed since its early days:
- The first foundation, explored quite exhaustively over the last couple of decades, is the information measure (entropy). Is entropy not "scientific" enough for you?
- The algebraic approach has also been studied in depth. Remarkably, the major tool there is the algebra of binary relations, and one of the main results is that a relation with a multivalued dependency is formally characterized by the commutativity of relational composition.
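To make the first bullet concrete: under the uniform distribution over a relation's tuples, a functional dependency X -> Y holds exactly when the conditional entropy H(Y | X) is zero, i.e., Y carries no further information once X is known. Here is a minimal sketch of that correspondence; the relation encoding and sample data are my own illustration, not from the post:

```python
import math
from collections import Counter

def conditional_entropy(r, x_idx, y_idx):
    """H(Y | X) in bits over a set of tuples r, uniform over tuples.

    H(Y|X) = sum over (x, y) of p(x, y) * log2(p(x) / p(x, y)).
    """
    n = len(r)
    joint = Counter((t[x_idx], t[y_idx]) for t in r)   # counts of (x, y) pairs
    marg_x = Counter(t[x_idx] for t in r)              # counts of x alone
    return sum(c / n * math.log2(marg_x[x] / c) for (x, _), c in joint.items())

# X -> Y holds: each x value maps to exactly one y value.
fd_holds = {(1, "a"), (2, "b"), (3, "a")}
# X -> Y fails: x = 1 maps to both "a" and "b".
fd_broken = {(1, "a"), (1, "b"), (2, "a")}

assert conditional_entropy(fd_holds, 0, 1) == 0.0
assert conditional_entropy(fd_broken, 0, 1) > 0.0
```

The same entropy measure generalizes to degrees of dependency violation, which is part of what makes it an attractive foundation.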
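For the second bullet, the standard (equivalent) characterization is easiest to demonstrate: the MVD X ->> Y holds in R(X, Y, Z) exactly when R equals the natural join of its projections onto XY and XZ, so Y-values and Z-values vary independently within each X group. A sketch, with attribute names and sample data of my own choosing:

```python
def satisfies_mvd(r):
    """Check the MVD X ->> Y in a relation r of (x, y, z) tuples.

    Standard criterion: r equals the join of its projections on XY and XZ.
    """
    xy = {(x, y) for x, y, z in r}                       # projection on XY
    xz = {(x, z) for x, y, z in r}                       # projection on XZ
    join = {(x, y, z) for (x, y) in xy                   # natural join on X
                      for (x2, z) in xz if x == x2}
    return join == r

# Teachers (Y) and textbooks (Z) of a course (X) vary independently:
good = {("db", "smith", "ullman"), ("db", "smith", "date"),
        ("db", "jones", "ullman"), ("db", "jones", "date")}
# Here teacher and textbook are correlated, so the MVD fails:
bad = {("db", "smith", "ullman"), ("db", "jones", "date")}

assert satisfies_mvd(good)
assert not satisfies_mvd(bad)
```

The algebraic result mentioned above recasts this same independence condition in terms of composition of binary relations; the join-of-projections test is the form most courses teach.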
To conclude, a typical introductory database course is quite odd: the first half studies relational algebra, yet almost none of that toolkit is used later when teaching normalization theory.