Concurrent Software Development (was: Re: RAC or Large SMP...?)
Date: Mon, 13 Oct 2008 20:12:51 +0200
Sorry for wandering off topic here. The thread evolved into this direction and I found it interesting to follow up.
On 10.10.2008 14:31, Tim X wrote:
> Robert Klemme <shortcutter_at_googlemail.com> writes:
>> For a very long time the people implementing parallelized applications
>> will have to do the analysis and design work that determines which parts
>> can be done concurrently and how synchronization must take place. And
>> from my experience this is the crucial part in getting scalability
>> through concurrency into an application.
> Yes, I agree. I probably wasn't clear enough. [...] My main
> point was that even those programmers that are able to think that way
> are somewhat constrained by current language and compiler technology
> that make it quite complex and difficult to express parallelism in a way
> that is concise and clear. Furthermore, due to the lack of adequately
> expressive constructs, being able to reach a satisfactory level of
> confidence re: correctness etc is very difficult.
Ah, I see. Thanks for clarifying.
> I listened to a talk by 'Matz' (Ruby) who said that what he wanted to
> achieve with Ruby was a language that had a close fit with how he
> thought about problems. His argument was that this allowed him to
> express his problems in a way that was clearer and faster because he
> didn't have to translate his mental model into a program model.
And this has worked out pretty well! I am using Ruby on a daily basis and it is always amazing how productive you can be compared with other programming languages. Even thread handling is pretty straightforward. But discussions in the community show that many people have problems grasping "concurrency" - even with such an elegant language that lets you focus on the important things.
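To illustrate how lightweight this is, a minimal sketch using only the standard Thread class (the squared numbers are just placeholder work):

```ruby
# Spawn a few threads that each compute a value independently.
# Thread.new(n) passes n into the block, so each thread captures
# its own copy rather than sharing a loop variable.
threads = (1..4).map do |n|
  Thread.new(n) do |i|
    i * i  # the block's result becomes Thread#value
  end
end

# Thread#value joins the thread and returns its block's result.
results = threads.map(&:value)
# results == [1, 4, 9, 16]
```

The point is that spawning, joining, and collecting results takes a handful of lines and no library beyond the core language.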
> to some
> extent, I think this applies to parallel programming. Currently, those
> programmers that have the knowledge/skills to think about problems in a
> parallel fashion are constrained by the fact they have to then translate
> their mental model into a language that was designed primarily for
> sequential problem solutions and which has cumbersome 'enhancements'
> designed to provide a safety net when operating in a parallel
> environment, but which often don't fit 'naturally' with the language.
> I'm talking about mainstream languages rather than specialised languages
> like Occam or other languages that have been designed specifically for
> developing applications that can exploit parallelism.
I am not sure I agree here. You have to translate mental models into programming constructs all the time. When _reasoning_ about a problem solution you do not have to deal with the nitty-gritty details of synchronization; rather, you consider which parts of your calculation can be done in parallel etc.
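That reasoning step - "these parts are independent, this part is not" - maps quite directly onto code. A hedged Ruby sketch with made-up data (and note that CRuby's interpreter lock means plain threads overlap I/O rather than CPU work, so this shows the structure, not a real speedup):

```ruby
# Design decision: summing disjoint slices is independent work;
# combining the partial sums is the sequential part.
data   = (1..100).to_a
chunks = data.each_slice(25).to_a            # independent work units

# Parallel phase: one thread per chunk, each producing a partial sum.
partial_sums = chunks.map { |c| Thread.new { c.inject(0, :+) } }
                     .map(&:value)

# Sequential phase: combine the partials.
total = partial_sums.inject(0, :+)
# total == 5050
```

The synchronization here is reduced to the implicit join in `Thread#value`; all the analysis happened before a single line was written.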
> I've actually been surprised in the past by the degree to which most
> programmers have a 'sequential' mindset.
Yes, this mindset seems prevalent.
> I think I must have been very very lucky. The second course I did in
> computing science was on parallel programming. It was back in the late
> 80s and I was lucky enough to have access to a simulated hypercube.
I did some parallel theory at university, because my university was a center for parallel computing at the time. After seeing all the constraints - for some algorithms you needed many thousands of processors or even more to be efficient, because of the constant factors hidden in the O calculus - I decided not to pursue parallel computing further, because the field felt too immature to me; in particular, you needed special network topologies (i.e. hardware) depending on the algorithm. I was pretty sure that more "traditional" approaches (multiple processes, threads, IPC etc.) would dominate parallel computing for a long time. Nowadays nobody outside of universities seems to talk about hypercubes any more. Nevertheless, synchronization and similar topics were covered as well, and that knowledge has served me well.
> There is some interesting work I've seen passing references to that
> looks quite promising, especially in the area of providing safe access
> to resources in a predictable way. For example, I think Microsoft has
> been doing some work on a sophisticated optimistic memory model in which
> access is controlled by the underlying framework and which aims to
> free the programmer from having to worry about such issues. The
> model is very similar to how Oracle handles such issues and enables
> consistent views of the data. In fact, I think from memory that they
> said the model they were developing was based on how databases have
> addressed the need to provide consistent views of data in an environment
> with many clients. It is technologies like this that will enable
> programmers to develop parallel programs that don't become so complex
> that either they never get out of the bug squashing stage or when they
> do, they die due to maintenance problems.
This sounds interesting. Thanks for sharing!
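The core optimistic idea can be sketched in a few lines. This is not Microsoft's actual framework - just a hypothetical cell that, like database MVCC, lets writers compute on a snapshot and validate a version number at commit time, retrying on conflict:

```ruby
# Hedged sketch of optimistic concurrency control, database-style:
# read a snapshot plus version, compute outside any lock, then
# commit only if the version is unchanged - otherwise retry.
class OptimisticCell
  def initialize(value)
    @value   = value
    @version = 0
    @lock    = Mutex.new   # held only briefly, at read and commit points
  end

  def read
    @lock.synchronize { [@value, @version] }
  end

  # Apply the block optimistically; loop until a commit succeeds.
  def update
    loop do
      old_value, seen_version = read
      new_value = yield(old_value)                  # computed lock-free
      committed = @lock.synchronize do
        next false unless @version == seen_version  # conflict: retry
        @value    = new_value
        @version += 1
        true
      end
      return new_value if committed
    end
  end
end

cell = OptimisticCell.new(0)
10.times.map { Thread.new { 100.times { cell.update { |v| v + 1 } } } }
        .each(&:join)
# cell.read.first == 1000
```

The programmer writes only the pure computation (`v + 1`); conflict detection and retry live in the framework, which is exactly the division of labor described above.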
> Of course, all of this may become completely moot when quantum computing
> comes into play and all our software is in every possible state at every
> possible moment. Now that will be a paradigm that is likely to melt a
> few programmer minds that have grown up in the von Neumann world!
> Luckily, I expect to be retired by then.
:-) Reminds me somehow of the infinite improbability drive...
robert