Re: Programming is the Engineering Discipline of the Science that is Mathematics

From: vc <boston103_at_hotmail.com>
Date: 9 Jun 2006 18:04:47 -0700
Message-ID: <1149901487.382879.226140_at_m38g2000cwc.googlegroups.com>


Keith H Duggar wrote:
> vc wrote:
> > Erwin wrote:
> > > > This is why PT is a /generalization/ of logic. It
> > > > reduces to logic when applied to truth-valued
> > > > statements. Just as gamma reduces to factorial for
> > > > natural arguments. (Again no quibbles about offset by
> > > > 1 etc).
> > > >
> > >
> > > You mean like :
> > >
> > > AND (p1, p2) === p1*p2
> >
> > P(p1 and p2) is not equal to P(p1)*P(p2) in general, so no
> > such 'generalization' is possible.
>
> Wow! vc is going off the VI deep end at the moment. "no such
> 'generalization' is possible"? Saying that PT is not a
> generalization is one thing; but, none possible??
>

Recall that the OP claimed that PT is a 'generalization' of logic in the sense that 'probability depends only on the constituent probabilities'.  He failed to prove the bizarre assertion and refused, or was unable, to solve the trivial puzzle that disproves his statement.
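
For the record, here is a throwaway numerical sketch of that puzzle
(Python, with a made-up two-coin sample space): two pairs of events with
identical P(A) and P(B) but different P(A and B), so no formula in P(A)
and P(B) alone can produce the probability of the conjunction.

from fractions import Fraction

# Sample space: two fair coin flips; each outcome has probability 1/4.
outcomes = [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]
weight = Fraction(1, 4)

def prob(event):
    # Probability of an event given as a set of outcomes.
    return sum(weight for o in outcomes if o in event)

A  = {o for o in outcomes if o[0] == 'H'}   # first flip heads,    P = 1/2
B1 = {o for o in outcomes if o[1] == 'H'}   # second flip heads,   P = 1/2
B2 = set(A)                                 # the very same event, P = 1/2

print(prob(A), prob(B1), prob(A & B1))      # 1/2 1/2 1/4
print(prob(A), prob(B2), prob(A & B2))      # 1/2 1/2 1/2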

However, below, instead of talking about such a 'generalization', he attempts to demonstrate 'reducing' probabilistic statements to their truth-valued logical counterparts by conjuring up, in vain, the spirit of conditional probability:

> Erwin, what vc was referring to is that
>
> P(AB) = P(A|B)P(B) -or-
> P(AB) = P(B|A)P(A)
>

Note that I said nothing about conditional probabilities. I merely asked for P(A and B) to be computed in terms of P(A) and P(B), which is what the OP promised (see above).

> where | means given and AB is short for "A and B". This is
> called the product rule. Something that vc seems not to know
> (given his questions in the other post) is that in the limit
> of true (1) and false (0) the conditional probability
> product rule reduces to the logical conjunction truth
> table. Here is the proof
>
> g : P(A) = 0
> p : P(AB) = P(B|A)P(A)
> u : P(AB) = 0

Unfortunately, it's no proof but just mindless playing with formulas. The conditional probability is *defined* as

P(B|A) =def P(A and B)/P(A)

the requirement for such a definition being, naturally, that P(A) <> 0. The definition can be found in any introductory PT textbook.

Even if P(A) were <> 0, step 'p' would be invalid since P(B|A) is unknown: only P(A) and P(B) are given, and P(A and B) has to be computed. It's, like, secondary school algebra.
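
A minimal sketch of that definition (Python again; the helper name is
mine), with the P(A) <> 0 requirement made explicit instead of being
quietly ignored:

from fractions import Fraction

def conditional(p_a_and_b, p_a):
    # P(B|A) per the textbook definition: P(B|A) = P(A and B)/P(A).
    # Undefined when P(A) = 0, hence the explicit check.
    if p_a == 0:
        raise ValueError("P(B|A) is undefined when P(A) = 0")
    return Fraction(p_a_and_b) / Fraction(p_a)

print(conditional(Fraction(1, 4), Fraction(1, 2)))   # 1/2
# conditional(0, 0) raises rather than returning some convenient number.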

>
> g : P(B) = 0
> p : P(AB) = P(A|B)P(B)
> u : P(AB) = 0
>

See above.

> g : P(A) = 1
> g : P(B) = 1
> s : P(~B) = 0
> m : P(A) = P(AB) + P(A~B)
> p : P(A) = P(AB) + P(A|~B)P(~B)
> u : P(A) = P(AB)
> c : P(AB) = P(A)
> u : P(AB) = 1

This is even funnier. First, we do not know what P(A|~B) is (see above), and second, the question is: what kind of events might A and B be if the probability of either is one? What about P(A or B), given that the respective probabilities are one? Is it two by any chance? (I asked the same question in another message.)
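
For what it's worth, the sum rule P(A or B) = P(A) + P(B) - P(A and B)
caps the answer at one, not two; a toy check (Python, helper name made
up):

def prob_or(p_a, p_b, p_a_and_b):
    # Sum rule (inclusion-exclusion): P(A or B) = P(A) + P(B) - P(A and B).
    return p_a + p_b - p_a_and_b

# With P(A) = P(B) = 1, P(A and B) is forced to 1 (since no probability
# can exceed 1), and so P(A or B) = 1, not two.
print(prob_or(1, 1, 1))   # 1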

>
> thus
>
> P(A) : P(B) : P(AB)
> 0 : 0 : 0
> 0 : 1 : 0
> 1 : 0 : 0
> 1 : 1 : 1
>

Unfortunately, there can be no 'thus'.

> descriptions
> g : given
> p : product rule
> s : sum rule
> m : marginalization (derived from sum rule)
> u : substitution
>
> -- Keith --
Received on Sat Jun 10 2006 - 03:04:47 CEST
