Re: terminology

From: Tony D <tonyisyourpal_at_netscape.net>
Date: 19 Jun 2006 14:52:19 -0700
Message-ID: <1150753939.823764.84820_at_r2g2000cwb.googlegroups.com>


Marshall wrote:
> Yes, and seen in this light, doesn't the C "short-circuiting" &&
> and || look like weird choices? "Let's make everything strict
> except for these two operators." I suppose it was a terse
> way to hack in explicit conditional evaluation, but it's still
> weird.
>

Any form of short-circuiting in a language that is effectively an abstraction over side effects is bad news. For example, if you have a piece of code like

   if (some_boolean_expression && some_boolean_function()) {...}

then, if you're not expecting it, weird things can happen when some_boolean_function has side effects that other bits of the program rely on: sometimes they'll happen, sometimes they won't, depending on whether some_boolean_expression happened to be true. I/O is the obvious side effect, but the horror of global variables or aliased memory comes up too. It's just a plain bad idea. It's probably down to some clever hack with the PDP-11 instruction set someone worked out - after all, C is just a syntactically awful macro assembler pretending to have a type system. (If you get the idea I loathe C with a vengeance that would make you nervous - good !)
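
For contrast, this is roughly how the Haskell Prelude defines (&&) - a quick sketch of mine, not anything from the thread. The right-hand operand is still only evaluated when it's needed, but since expressions are pure there are no side effects for the short-circuit to silently skip, so the surprise factor disappears:

   import Prelude hiding ((&&))

   -- Roughly the Prelude definition: the right operand is only
   -- demanded when the left one is True.
   (&&) :: Bool -> Bool -> Bool
   True  && x = x
   False && _ = False

   -- The right-hand side is never evaluated here, and nothing else in
   -- the program could have been relying on it to "do" anything.
   main :: IO ()
   main = print (False && error "never demanded")   -- prints False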

Although it could be worse; you could be programming in Ada, where the compiler is treated as a chunk of AI (it decides for you stuff like which parameter passing mechanism to use). A friend of mine had a function call completely disregarded by an optimising Ada compiler - she didn't believe me when I told her to turn down optimisation and it would work again. I won a nice bet out of that !

> Thing is, though, I prefer strictness. If I want laziness, I can
> get it explicitly; it's not the sort of thing I want by default.
> This is a design choice, though; YMMV.
>

Hmmmm ... look up the arguments on lazy evaluation before deciding; they used to go under titles like "Should CONS evaluate its arguments ?". The gist is this: if a function wants to handle a potentially infinite data set (the Fibonacci sequence, for example), should your program go into an infinite loop waiting for the whole data set to be constructed, or should you start calculating the function and work out only as much of the potentially infinite data set as you actually need ? Or, if you have a potentially infinite data set and you want to cons it onto a list, should cons go into an infinite loop ?
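
A minimal Haskell sketch of the idea (the names are mine):

   -- The entire (infinite) Fibonacci sequence, defined in one go.
   -- Laziness means nothing is computed until something demands it.
   fibs :: [Integer]
   fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

   -- Only the first ten elements ever get calculated here; a strict
   -- cons would loop forever trying to build all of fibs first.
   main :: IO ()
   main = print (take 10 fibs)   -- [0,1,1,2,3,5,8,13,21,34]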

> Since Peyton-Jones is in the air, I'll reference "Wearing the
> Hair Shirt" a retrospective on Haskell.
>
> http://research.microsoft.com/~simonpj/papers/haskell-retrospective/
>
> P-J himself says
>
> "Laziness makes it much, much harder to reason about performance,
> especially space." He mentions other disadvantages too, along
> with the advantages.
>

Absolutely. In some cases you don't really know exactly what will be calculated before you start. Abstract interpretation (especially strictness analysis) can help. Also, ISTR you can nominate functions as strict in their arguments in Haskell, so you get lazy by default and strict where you ask for it. (I may be wrong on that; once I got into the lazy programming mindset I just stopped worrying about strictness much !)
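
Something along those lines does exist - seq and strict constructor fields. A quick sketch of my own (so take the details with a pinch of salt, it's not from SPJ's paper):

   -- seq forces its first argument to weak head normal form before
   -- returning the second, so the accumulator here doesn't pile up
   -- as a huge chain of unevaluated thunks.
   sumStrict :: [Integer] -> Integer
   sumStrict = go 0
     where
       go acc []     = acc
       go acc (x:xs) = let acc' = acc + x
                       in acc' `seq` go acc' xs

   -- Strictness annotations on constructor fields: both coordinates
   -- are evaluated when a Point is built, rather than stored lazily.
   data Point = Point !Double !Double

   main :: IO ()
   main = print (sumStrict [1 .. 1000000])   -- 500000500000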

> I agree that Haskell is gaining momentum, but do you have
> any actual evidence that impure FPs are losing "market
> share" to pure ones? ML and OCaml seem to be doing
> quite well. Although, let's be honest, the strong majority
> of commercial software is written in C++ or Java, with
> VB a distant third.
>

Maybe not market share, but mind share I would think; I liked SPJ's comment about Haskell being "doomed to success" :) And then there's Erlang - if it's good enough for Ericsson's telephone exchanges ...

> Okay, I admit I haven't gotten around to reading "Tackling the Awkward
> Squad" even though people have been pushing me at it for years.
> But I dispute your use of the word "better" above. Monads have
> an advantage, and they have a cost; which way is the "right" way
> to go is a design choice.
>
> I'm a bit turned off by the fact that monads have to be threaded
> thoughout your program. It seems to me to be too high a price
> to pay. You lose modularity.
>
> Also: do you really need purity *everywhere* all the time or do
> you only need it where you need it, so to speak?
>

You really should read the "Awkward Squad" presentation; it would answer these questions for you straight off the bat (I must be honest and say I didn't really "get" monads myself until I started reading that, and some of the other papers listed with it on the Haskell site).

[ snippage ]

> But again: I'm still of the multi-paradigm mindset.
>

I'm not; but I suppose I'm just running out of patience and (shh!) getting crankier as I get older (ask Roy; he refers to me as "Mad Tony" on other NGs, especially when OO or XML rear their ugly, hydra-like heads).

> I believe the precise quote was:
> "Here is a language so far ahead of its time, that it was not only an
> improvement on its predecessors, but also on nearly all its
> successors."
> by Tony Hoare, speaking of Algol 60.
>

Thank you; I wasn't sure about that. (Although I think the sentiment could be applied to Pascal too... :)

[ snippage ]

> ... and for the courage of their conviction in completely
> eliminating loops.
>

I go further - no loops, and no variables, dammit ! (When I say variables here, I mean C/Pascal/Fortrash/etc. variables, obviously.) I honestly think that the two roots of programming evil are rubbish type systems and variables-in-the-3GL-sense. Think of the amount of effort that's gone into synchronising access to shared bits of memory, aka variables. Shocking. And it's irrelevant to anyone who isn't an OS (or DBMS) builder !
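
By way of illustration (my sketch, nothing more): the kind of thing a C programmer writes with a loop and a mutable accumulator comes out as a fold, with no variable being updated anywhere.

   import Data.List (foldl')

   -- Sum of squares: what would be "for (i = 0; ...) total += x[i]*x[i];"
   -- in C is a single fold here; no updatable variable in sight.
   sumOfSquares :: [Integer] -> Integer
   sumOfSquares = foldl' (\acc x -> acc + x * x) 0

   main :: IO ()
   main = print (sumOfSquares [1 .. 10])   -- 385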

> Still, any nontrivial K program looks like line noise.
>

Worse than Forth ??? (Truly a write-once-understand-never-again language !)

[ snippage ]