It does stick out... but apparently too far as it blocked out the following qualification "(obviously different sectors age at different rates – web programming has a very fast cycle, embedded systems a very slow one)". C is a systems language, it's been around forever and is not about to be replaced anytime soon, possibly anytime in our lifetimes. It's too good at what it does. And that's why it's taught in a lot of universities.
As is Lisp, for the reasons you mentioned. As is SQL. As are several others (though I'd disagree with Java myself because it's really more set up to do things than to teach things and there are too many shortcuts in there - but many others disagree with me on that point :) ).
I'm not for getting rid of established languages from courses. C is 37 years old; C++, 30 years; Lisp, 51 years; Objective-C, 23 years - these are well-established languages that won't vanish in the four-year span of an undergrad course. Java at only 14 years (and so many releases that stability is a valid question) tends (to my mind at least) to mark the start of the gray area there. Some though, like Perl and C++, are really hard to teach compared to others, and that can outweigh their stability as a factor. Still though, I look at languages like Python and would love to work them into a course. But it is still a risk for the students. Maybe for the final year work though.
But teaching a four-year course using Ruby as a primary language, intellectually interesting as that sounds, is an unethical act at the moment. It's a beautiful language, a real joy even to read - but what the students start learning on day one may not be around, or be a useful thing to know for the jobs they'll pay their mortgages with four years down the line. That's the responsibility the university is taking on when it creates a course. The student invests four years of their life - the university must produce a return on that, and conservative thinking is needed for that.
On one hand, we have talk of learning fundamentals, but on the other, if we teach with more exotic languages, "they'll never be able to find a job because they didn't learn anything useful."
So, which is it? Do schools teach the fundamentals, transferable to any language, or do they teach what'll be useful in industry?
The two are orthogonal, at least in some aspects. The fundamentals can - technically - be taught in any language (practically, no one's going to try teaching a course in Brainfuck). But we try to choose a language that gives them, if not industry skills, a platform from which to reach those skills rapidly. For example, teach them C and an OO language of your choice properly - and they can learn C++, Objective-C, Java, Python or any of a dozen others very rapidly indeed compared to someone who dove in, learnt Inferno and then graduated into an industry that had never heard of it.
The thing to remember is this - we teach them the fundamentals for a reason, namely to get jobs in industry. At the back of all this academic teaching is a commercial reality that can't be forgotten. But you have to balance that with a long-term view of the student's entire career. We're trying to give them a degree course, not a Sam's book!
That's not bitter, that's justifiable anger, Steve. There's a duty of care involved on the university's part - if they screwed the pooch, that's a pretty major thing.
This actually lets me respond in a threaded way, unlike your blog, apparently? So my bad, I'll just respond here.
Do you feel that your experience or mine is more mainstream? When I read Joel's "The Perils of Java Schools," I felt that it described my school to a tee. I had just assumed that that's how most programs are, given my school's size. Obviously, Ivy-league schools should be better, but I just kind of assumed that my own experience was average.
Wordpress.com's threading leaves much to be desired I'm afraid.
I can't judge what would be mainstream in the US accurately enough to rate your school, I'm afraid. I'd see some of the details of the main schools as they write up their courses in journal articles, but my experience is mostly with Irish universities. However, from what several people here and back on the blog and over on reddit are saying, it sounds like the average Irish CS degree is a step or two ahead of at least some US schools. Which I have to say is a major surprise to me, since US schools were (I thought, from speaking to grad students and academics from them) better funded than Irish ones.
I'm not sure if it's a funding issue. It's more of a philosophical one. We've been telling kids for the last few decades that college is the path to success, and we've lowered standards so much that everyone goes to college, so that it's actually true.
College is quickly becoming a big commodity business, rather than a place to learn.
Look up INTERCAL. Brainfuck isn't too hard, just tedious. INTERCAL, on the other hand, was designed to be as big of a pain in the ass as possible. People like to use the phrase "fighting with the compiler," but this is a language where the compiler is actually actively out to get you. : )
You're contrasting Ruby as an upstart and Java as an established language, but they're basically the same age. They both had their first public releases in 1995.
(Of course, this doesn't invalidate your point about relative popularity.)
And that's exactly the problem. In liberal arts educations (such as the one I'm receiving) or even at universities such as MIT, Python is the language of choice. It's great fun to teach, very easy to learn, but it's not an industry standard, and many would argue that it's too easy to use.
I don't think it's reasonable to claim Python isn't an industry standard. It's used heavily in industry by large and fairly conservative companies and is even more popular with startups.
I think it's reasonable because there are very rarely any such things as standards in software development. Municipal building codes are standards. Drug testing regimens are standards. Programming languages are not standards. At best they are conventions, and only insofar as certain niche industries are concerned.
As for de facto standards, I don't think we should be paying much attention to them. I think it's very important to draw the distinction that software development currently has no standards (you might be able to argue for TCP/IP, where alternatives are only ever used because they fulfill some unique use case not covered by TCP/IP), because for as much as we want to call this industry "software engineering", we sure as hell don't treat it like any other field of engineering, for many of the reasons already mentioned.
I think the first step is to sit down and agree on some terminology. You can't even get two programmers to agree on what "Object Oriented Programming" means. No wonder we aren't treated like professional engineers. We don't act like them.
Most "industry standards" are de facto standards. We're not talking about laws here, or even open standards like C or Common Lisp. An industry standard is simply a widely-accepted practice, which is perhaps defined by the fact that it would not usually be questioned by a casual observer from that industry. Painting interior walls some shade of off-white is an industry standard. Using Python for scripting and application logic is an industry standard in the same way.
I wouldn't say that using Python for scripting is even a de facto standard in that sense. Don't make the mistake of assuming your experiences are normative.
Really? Python seems to be showing up everywhere I look. I don't think I've seen any largish company in quite a while that wasn't using Python for something somewhere.
It's spectacularly useful, and I personally really love using it - but to say it's an industry standard the way C or C++ are is to stretch the point a wee bit too far unfortunately.
Still though, we use some languages for teaching (like Pascal or Modula-2) which don't have the kind of industrial usage levels of C or C++, so we might see Python being taken up sooner rather than later. I think there are one or two courses already using it over here on a trial basis.
Python is "the real stuff", and I'm always annoyed by the snobbishness that leads some people to dismiss it. Python is an excellent language, with a large user base, and some compilers in the works.
Just because C is hardcore and Lisp is amazing and Haskell is mind-exploding doesn't mean that Python isn't a great language.