Re: First Impressions on Using Alphora's Dataphor

From: Laconic2 <laconic2_at_comcast.net>
Date: Wed, 1 Sep 2004 10:11:50 -0400
Message-ID: <1s-dnfLB3sWARajcRVn-tA_at_comcast.com>


"pstnotpd" <schoenmakers_at_tpd.tno.nl> wrote in message news:ch4j9i$1rp$1_at_voyager.news.surf.net...

> My guess is that the use of pointers and casting basically defeats
> typing anyway. (I've had my share of debugging sessions where people got
> these wrong)

As far as casting goes, I disagree. At least in the dialects of Pascal that I've used, if you cast, you leave fingerprints in the source code. It's possible for someone who looks at your source to see what you've done, without benefit of a debugger.
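Something like this shows what I mean (just a rough sketch in a Turbo/Free Pascal-ish dialect; the program and type names are made up for illustration):

    program CastDemo;
    type
      TColor = (Red, Green, Blue);
    var
      i: Integer;
      c: TColor;
    begin
      i := 2;
      c := TColor(i);  { the cast is spelled out right in the source; no debugger needed to spot it }
      WriteLn(Ord(c));
    end.

The TColor(i) is the fingerprint: anyone reading the file can see that an integer was forced into an enumerated type at that spot.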

I learned my first debugger in about 1962. Twenty years later, debuggers were becoming wildly popular on machines like the VAX, but I was interested in going in the other direction. Niklaus Wirth might have put it thus:

Instead of learning the debugger, why not invest the same effort in really learning how to program?

I'm not sure I want to carry that big a chip on my shoulder, but there is a certain appeal to getting it right the first time. I know that when I tightened up my coding style, it was deliberate. I was sick of debugging my own code.

Finding bugs created by other people is a whole other story.

The issue of pointers is more complex, and I don't have time to go into it right now. I will say that I'm not entirely happy with the way Pascal addressed the issue, but an undisciplined approach to pointers leads to worse things: programs with memory leaks, or the CODASYL database model. Basically, pointers gone wild will do for data what GOTO does for process: they make spaghetti out of it.
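One rough sketch of the leak case (again a Turbo/Free Pascal-ish dialect, names invented): allocate with New, then lose the only pointer to the block, and Dispose can never be called on it.

    program LeakDemo;
    type
      PNode = ^TNode;
      TNode = record
        Value: Integer;
        Next: PNode;
      end;
    var
      p: PNode;
      i: Integer;
    begin
      for i := 1 to 1000 do
      begin
        New(p);        { heap allocation }
        p^.Value := i;
        p := nil;      { the only pointer to the block is gone; Dispose is now impossible, so it leaks }
      end;
    end.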
