I have always admired Dijkstra and feel sorry for myself for arriving in Austin too late. And this quote is something I wish I could have discussed with the man:
"I feel that the effort to use machines to try to mimic human reasoning is both foolish and dangerous. It is foolish because if you look at human reasoning as is, it is pretty lousy; even the most trained mathematicians are amateur thinkers. Instead of trying to imitate what we are good at, I think it is much more fascinating to investigate what we are poor at. It is foolish to use machines to imitate human beings, while machines are very good at being machines, and that is precisely something that human beings are very poor at. Any successful AI project by its very nature would castrate the machine."
Lisp was great at the time of its inception if only for a radically new way of using machines. It has suffered the fate of having been elevated to the status of de facto standard with all its defects. Despite its conceptual novelty, a lot of the design of Lisp is terribly amateurish even by the standards of the time. When I read the first manual, the Lisp 1.5 manual, published in 1961, I could not believe my eyes. It was an extremely poor language. Now it has become the de facto standard of the AI community, which now suffers from Lisp the way the rest of the world suffered from Fortran.
Yes, very much so. I just thought it was funny (that's how I read the headline at first). There was an amusing anecdote about a company not being allowed to start coding until 70% of the project time had elapsed - taken out of context (and there is a LOT of context), that is somewhat contrary to current practices.
It is not. It depends on what you mean by "current practices". Dumb CRUD projects can certainly be coded on the fly using the so-called "agile" approach without much thinking. "Steer left or right as required" works very well for them.
The thing is, most online discussions, especially on YC, revolve around web-based CRUD to-do lists or something similar, whose complexity (something Dijkstra was very much concerned about) is close to zero.
Therefore "current pratices" cannot be used as a good example of software engineering in general, since most "engineers" are too busy dreaming of getting rich by implementing a high school programming project over a weekend and selling it to google.
I am fully aware that I do not represent the common "spirit" of this board by stating these views, but there is no way in hell the software for an F-22 pilot's helmet can be built using the "current practices" you're referring to. In fact, the best "current practice" (as some MBAs believe) is to outsource your typical CRUD job to India. Hard to argue with that.
Yeah, that's why the "real" government projects are such outstanding successes. Because pour millions for over a decade only to find out at the end that your components don't integrate is a GREAT idea.
True, a lot of the applications developed today are crap. But a lot of large-scale applications can benefit in the same way from test-driven development, user stories, and iterative development. The problem is that a lot of folks overreact to agile and think it means doing no design or thinking at all. That's simply not the case.
This one is timeless: "I have said it before in public and I am perfectly willing to repeat it that someone introduced to computers via Basic is in all probability mentally mutilated beyond redemption. That is no joke. A major branch of the Siberian Academy of Arts and Sciences is aimed at keeping Basic out of Siberian high schools."
I'm sorry, but that's bullshit. I taught myself programming with BASIC when I was a kid; my first programs were horrible, unstructured things with GOTOs everywhere and redundant code, and yet I was able to teach myself Scheme just fine a few years later.
In fact, I find that I only really understand the point behind useful programming language constructs (and, yes, their beauty) after spending some time suffering from the lack of them.
You have a lot to prove. A giant as big as Dijkstra publicly said that "corentin is mentally mutilated beyond hope of regeneration" and you're out to prove him wrong.
Good luck. :-) Makes me feel luckier for starting with x86 assembly, although I am not sure it's any better than BASIC. God bless us!
I don't think Dijkstra's absolutisms have aged particularly well. On some things he turned out completely right, on other things staggeringly wrong. In this article, look at his endorsement of the idea that coding should be left to the final 25% of a project's schedule. It's all very hit and miss, yet he delivers every claim in the same godlike tone of infallibility. The idea that maybe everything hasn't been figured out about software development - that in fact, maybe not very much has been - seems alien to him.
Dijkstra could not have predicted that in 2007 to-do lists and message boards like Facebook would become multi-billion-dollar businesses. Sorry to poop on your beliefs, but "agile" BS can be applied only to CRUD projects that have nothing in common with Computer Science.
Giants routinely get things wrong. It's called an argument from authority. The consequences of accepting arguments simply because of their authority can be damaging and wasteful--read about Linus Pauling and Vitamin C, for example. So I would say he has no more to prove than anyone else.
It was just a conjecture of his that history didn't bear out. Thinking in Basic is bad. Great leaps in programming and the availability of knowledge about programming (other people's good code) made it easy to stop thinking in Basic if one was motivated.
Great men once thought that the earth was flat and there were only 4 elements. It is trivial today for even a nitwit to prove otherwise. This does not mean that these great men are now less great or that nitwits are now smarter.
>> someone introduced to computers via Basic is in all probability mentally mutilated beyond redemption.
Elitism and snobbery at its finest--like saying someone whose first bike had training wheels can never learn to ride a ten-speed.
BASIC is a gateway programming drug because, in many incarnations, it can DO things with a short learning curve, which is rewarding. Many intro comp sci classes start with Waxing Off as opposed to creating something and building interest, which then drives the lessons. There's no deeper understanding of WHY an approach is wrong than when you've been bitten by its ugly consequences.
"Q You are speaking of the leading edge of research of computer languages.....
A Yes, What it is again in danger of being supported to death because one of the hopes of functional programming is that in the execution of programs written in a functional style it will be easier to exploit a lot of concurrency."
Interesting - well, I guess that statement is over 20 years old, but it probably still holds true. It also explains his other opinions.
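For the curious, here's a minimal sketch of what he's getting at (my own example, not from the interview): because a pure function has no side effects, its applications over a list can be evaluated in any order, so the runtime is free to run them in parallel without changing the result. In GHC Haskell, using parMap from the parallel package:

    -- Minimal sketch (my example, not Dijkstra's): square is pure, so each
    -- list element can be computed independently; parMap evaluates them in
    -- parallel, and the answer is the same as with an ordinary map.
    import Control.Parallel.Strategies (parMap, rdeepseq)

    square :: Int -> Int
    square x = x * x

    main :: IO ()
    main = print (sum (parMap rdeepseq square [1 .. 1000000]))

Compile with ghc -threaded and run with +RTS -N to use multiple cores; only the schedule changes, not the meaning of the program.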
"You should realize that in any field the time lag, the delay between a significant scientific progress and its acceptance by the scientific community at large not to mention the moment that it finds its way into a product should be measured in generations."
"I feel that the effort to use machines to try to mimic human reasoning is both foolish and dangerous. It is foolish because if you look at human reasoning as is, it is pretty lousy; even the most trained mathematicians are amateur thinkers. Instead of trying to imitate what we are good at, I think it is much more fascinating to investigate what we are poor at. It is foolish to use machines to imitate human beings, while machines are very good at being machines, and that is precisely something that human beings are very poor at. Any successful AI project by its very nature would castrate the machine."