Re: Concurrency in an RDB

From: David <davidbl_at_iinet.net.au>
Date: 23 Dec 2006 17:08:09 -0800
Message-ID: <1166922489.186511.312200_at_79g2000cws.googlegroups.com>


Marshall wrote:
> On Dec 22, 4:52 pm, "David" <davi..._at_iinet.net.au> wrote:
> > I've said many times in this thread that complex integrity constraints
> > can and should be enforced indirectly through separate
> > validation/verification queries. Here are some examples where that
> > approach is already used in today's applications
> >
> > 1. Source code compilation
> > 2. Software unit testing
> > 3. VLSI chip layout constraints
> > 4. Digital circuit verification
> > 5. Finite element modelling verification
> > 6. Geometric model verification
> >
> > Surely you agree these sorts of constraints shouldn't be enforced on
> > every update to the DB?
>
> I'm not sure if this is a trivial rephrasing of what you just
> said or an actual disagreement, but no, I would not agree
> that they "shouldn't" be enforced on every update.
> However I would agree that there may be practical limitations
> on so doing, such as the amount of time necessary to check
> the constraint. Any constraint that can practically be checked
> on every update should be.

You suggest that performance is the only issue at stake. In examples like those above, a verification failure often points to an error in *previously* committed changes; if an error can only be detected after the fact, the DB must be allowed to hold states that fail verification. Software development is a good example of a "non-monotonic" process: sometimes you need to commit a change that temporarily breaks the integrity of the system.
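
To make this concrete, here is a minimal sketch of the kind of
verification query I mean (the tables symbol_def and symbol_ref are
hypothetical, standing in for a DB that stores source code). Nothing
prevents an update that introduces a dangling reference; the check is
run on demand instead:

  -- symbol_def(name)      : symbols defined somewhere in the code base
  -- symbol_ref(file, name): symbols referenced by each source file

  -- Verification query: list every reference with no matching
  -- definition. An empty result means the system "links" cleanly;
  -- a non-empty result pinpoints the offending files, which may
  -- have been committed long ago.
  SELECT r.file, r.name
  FROM   symbol_ref r
  WHERE  NOT EXISTS (SELECT 1
                     FROM   symbol_def d
                     WHERE  d.name = r.name);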

Now you could argue that the user should be forced to make all the changes necessary for the DB to move atomically from one valid state to the next. However, in some domains that would lead to long-running transactions that take hours, days or even months to complete.
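
For what it's worth, SQL already has a limited mechanism for exactly
that: constraints declared DEFERRABLE are checked at COMMIT rather
than per statement (PostgreSQL supports this, for example). A sketch
with hypothetical mutually-referencing tables:

  -- Each department has a head who must be an employee; each
  -- employee must belong to a department. Neither row could be
  -- inserted first if the FKs were checked per statement.
  CREATE TABLE dept (
    id   INTEGER PRIMARY KEY,
    head INTEGER NOT NULL
  );
  CREATE TABLE emp (
    id   INTEGER PRIMARY KEY,
    dept INTEGER NOT NULL REFERENCES dept(id)
           DEFERRABLE INITIALLY DEFERRED
  );
  ALTER TABLE dept
    ADD CONSTRAINT dept_head_fk FOREIGN KEY (head)
    REFERENCES emp(id) DEFERRABLE INITIALLY DEFERRED;

  BEGIN;
  INSERT INTO dept VALUES (1, 100);  -- head doesn't exist yet
  INSERT INTO emp  VALUES (100, 1);  -- now both FKs can be satisfied
  COMMIT;                            -- constraints are checked here

But note that this only defers the check to the end of a single
transaction. When the invalid intermediate state needs to persist for
days or months, that one transaction becomes the whole problem.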

> > I have the impression (please correct me if I'm wrong) that your
> > assumption that a DB should always be in a valid state is coloured by
> > the (relational) problems that you have expertise in.
>
> Certainly this is always true for everyone. However I am having
> a hard time seeing the value of your approach given how much
> less it lets us count on the dbms.

Is it really a problem? A workflow can easily force a user to run the verification as part of the process of using the DB within a larger system.

As an example, good software companies have a release-engineering workflow that ensures a build has passed various unit tests, regression tests, etc. before it can be shipped. It goes without saying that it must compile successfully.
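
In DB terms the gate is just another verification query. A sketch,
where test_run is a hypothetical table recording results for each
candidate build, and :candidate_build is a placeholder for the build
under consideration:

  -- The workflow allows the release step for a build only if this
  -- query returns zero.
  SELECT count(*) AS failures
  FROM   test_run
  WHERE  build_id = :candidate_build
  AND    passed   = FALSE;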

> I'm also unclear on how much
> I have to give up in the way of integrity enforcement. I'm having
> a hard time building a mental model for that. Your intent only to
> speak at a high level somewhat exacerbates this difficulty.
>
> Hmm, I just had an interesting idea. Perhaps the issues your
> idea raises could be dealt with as a "quality of service" issue.
> Where one needs strict durability, one could so specify externally
> to the application.
>
> This is a bit tricky because of the question of guarantees of
> desirable properties. One area I'm interested in is
> static analysis, and that's entirely dependent on building
> theorems from known properties of a system. Weakening
> those properties might render some analysis techniques
> unsound.

Examples of that would be relevant to this discussion.

Cheers,
David