Re: deductive databases

From: mountain man <hobbit_at_southern_seaweed.com.op>
Date: Sun, 15 May 2005 06:54:36 GMT
Message-ID: <MmChe.3363$E7.2565_at_news-server.bigpond.net.au>


"Kenneth Downs" <knode.wants.this_at_see.sigblock> wrote in message news:qbiil2-164.ln1_at_pluto.downsfam.net...
> mountain man wrote:
>
>> "Kenneth Downs" <knode.wants.this_at_see.sigblock> wrote in message
>> news:tuuel2-uph.ln1_at_pluto.downsfam.net...
>>> mountain man wrote:
>>>
>>>> "Alfredo Novoa" <alfredo_novoa_at_hotmail.com> wrote in message
>>>> news:87t881l00onh3tibjbqvokll34kn2d67s9_at_4ax.com...
>>>>
>>>>> To say that recursion is not useful to solve part explosion problems
>>>>> shows profound ignorance.
>>>>
>>>>
>>>> 1) WTF is this critical inventory explosion problem?
>>>
>>> A widget is made of components. Each component is made of other
>>> components,
>>> and so on and so on. The nesting level is not defined ahead of time and
>>> can go to arbitrary depths.
>>
>> Thanks for the brief.
>>
>> Yes, arbitrary but not infinite depths, and once the maximum
>> depth is charted, the current entire instance of the problem
>> can be flattened - no big deal. Auto-regen the flat structure as
>> required.
>
> Two objections from experience.
>
> First, if you set the depth at N, somebody will need N+1. They will need
> it
> on Christmas Eve coinciding with your daughter's wedding.

OK. Set the depth at N + X, where N is the maximum depth and X is your comfort buffer.
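To illustrate the idea, here is a minimal sketch in Python (the `components` mapping is a hypothetical stand-in for a parts table; the real data would come from the database). It flattens a widget's explosion with an explicit stack - no recursion required - and trips an error only if the N + X cap is exceeded:

```python
# Hypothetical bill-of-materials: parent part -> list of (child part, qty).
# In a real system this mapping would be read from a components table.
components = {
    "widget": [("frame", 1), ("motor", 2)],
    "motor": [("rotor", 1), ("bearing", 2)],
    "rotor": [("shaft", 1)],
}

MAX_DEPTH = 10  # N + X: observed maximum depth plus a comfort buffer


def explode(root):
    """Return a flat list of (part, cumulative_qty, depth) rows."""
    flat = []
    stack = [(root, 1, 0)]  # explicit stack instead of recursion
    while stack:
        part, qty, depth = stack.pop()
        if depth > MAX_DEPTH:
            raise RuntimeError(f"{part} exceeded max depth {MAX_DEPTH}")
        flat.append((part, qty, depth))
        for child, n in components.get(part, []):
            stack.append((child, qty * n, depth + 1))
    return flat


flat = explode("widget")
# flat now holds every part with its cumulative quantity and nesting depth
```

The flat list can then be stored and regenerated whenever the structure changes, which is the "auto-regen" step described above.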

> Second, it is dangerous to expect to pass complete run-outs when
> information
> changes because you lose the ability to keep transactions small. Multiply
> the run-outs by a high user count and you've got a major bottleneck.

I don't understand what you mean by run-outs. I suspect it has some form of relationship to baseball.

>> Throw the max depth code into an automated exception alert
>> to present on a queue to some workgroup that an instance has
>> arisen where widget_id 123456789 has exceeded max depth.
>
> See first objection above.

Throw max depth + X into the alert queue. (Where X reflects your index of comfort.)

> Third objection is the dumb retail terminal. Counter help says, "I can't
> sell you that radio you are holding in your hand because the computer says
> we don't have any." The limit you set and the exceeding depth will be
> seen
> by the users as an arbitrary and onerous.

If the N+X limit was somehow exceeded, the item would be automatically sitting on a centralised data integrity exception queue.

It is probably important to mention here that, IMO, there are thousands of possible data integrity exceptions that will enter any production database system, no matter how well the data structures and constraints have been defined. Therefore an automated data integrity exception system is often the first thing constructed.
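The pattern can be sketched very simply (Python; all names here are illustrative, and a real implementation would write to a centralised exception table rather than an in-memory list). Instead of failing the transaction, the violation is recorded for a workgroup to review:

```python
# Stand-in for a centralised data-integrity exception table.
exception_queue = []


def report_integrity_exception(kind, item_id, detail):
    """Append a data-integrity exception for later review by a workgroup."""
    exception_queue.append({
        "kind": kind,        # e.g. "MAX_DEPTH_EXCEEDED"
        "item_id": item_id,
        "detail": detail,
    })


# Example: flag widget 123456789 for exceeding the depth limit
report_integrity_exception("MAX_DEPTH_EXCEEDED", 123456789,
                           "explosion depth exceeded N + X")
```

The point of the design is that the sale (or whatever transaction raised the condition) can proceed while the anomaly sits on the queue awaiting attention.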

>>> Building a flat list of a complete parts explosion for an item therefore
>>> is
>>> a hassle.
>>
>> If you don't know the depth it is, but you build
>> a tool to determine the maximum depth first.
>>
>>
>> In fact there are literally hundreds of alternative work-
>> arounds to this type of problem without involving
>> any form of esoteric generalised recursion theory.
>>
>
> Agreed, you don't need anything esoteric, the solutions are well-known,
> it's
> just they all contain the moral equivalent of a divide by zero.

That's what the theoreticians say,
but I don't believe them. ;-)

>>>> 2) How many organisations are experiencing this problem?
>>>>
>>>
>>> All manufacturers in the world?
>>
>>
>> Not the ones I've ever had anything to do with.
>>
>> My main clients have been patent and trade mark attorneys
>> and their IP management systems invariably generate these
>> types of problems where relationships involve trees and
>> their branches. EG:
>>
>> Tracking the relationships between patent applications,
>> especially divisional patents, that can have parents, which
>> themselves are a divisional patent of another divisional, etc.
>> sounds like the same form of problem.
>
> Is it possible they are not hitting the system that hard? Not a high
> volume
> of transactions? Sounds like for lawyers it would be no problem to
> explode
> a hierarchy on each change, since they just don't change that often, no?

Correct, the inventory example would have a higher volume of transactions than the attorney example, but the principle should be applicable to either.

With a very high volume of transactions, the incremental "flattening" of new stock (since the last "flattening") may need to be scheduled more frequently, or invoked per each new widget.

Pete Brown
Falls Creek
Oz
www.mountainman.com.au

Received on Sun May 15 2005 - 08:54:36 CEST
