It is still the dominant language for large-scale low-level systems. Critical software driving submarines, power plants, financial exchanges and medical devices is likely written in C++. The code on most of SpaceX's spacecraft is in C++. Most of Google's infrastructure is written in C++.
With the constant increase of low-powered computing devices, we are likely to see that "niche" grow.
I think it is rather embarrassing that most code written in modern languages performs worse on today's computers than native code running on 1980s machines.
Performance is what makes computing magical.
"In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal and I believe the same viewpoint should prevail in software engineering" - Donald Knuth
Sadly, people only cite his other quote about premature optimization being evil, leading to a generation of computer programs written with no consideration of speed and user experience.
Funny how "critical" software often uses wildly unsafe languages. They should take advantage of paranoid type systems instead, which would prevent the occasional crash upon integer overflow, or SI/imperial incompatibility…
In established engineering disciplines, specification is a relatively small part of the work. In software, specification is everything (compilers and interpreters do the heavy lifting for you). In established engineering disciplines, a 12% improvement in a meaningful dimension (such as construction costs, or interior volume, or…) is easily worth doubling the specification effort. In software, a 12% cut in specification effort is easily worth a 50% worsening of another dimension, such as runtime speed or compilation speed (though of course we must not overdo it http://xkcd.com/303/ ).
This very much depends on what you care about. If you are running a 100,000 machine datacenter, a 12% efficiency improvement may save you millions of dollars in computing and energy costs a year. That is certainly worth a doubling of specification effort.
I don't think so. In established fields of engineering, optimization takes up as much time as design, if not more.
Computing is a new enough field that so far that hasn't been the case, but I think with the increase of media processing and large-scale data processing, a 10,000-machine datacenter will be the new norm. And just like in most other fields of engineering, optimization will be a major concern, if not the primary concern.
You are glossing over a fundamental difference between computing and other fields of engineering: copy & paste.
You don't get to just copy a bridge. You have to build another. With software, it is harder to justify building something anew, even if it's a bit different: you could re-purpose the old thing at very little cost (provided the old thing was well written and close to your mark to begin with, which is often not the case…).
Then there's the systematic automation of everything we understand sufficiently deeply. Garbage collection. Compiler optimizations. Libraries, some of which are freakishly fast. Databases.
When faced with a new project, you will increasingly not care about performance, because you will just reuse the incredibly fast infrastructure the systems folks wrote for you. Don't get me wrong, performance is likely to become more and more important over time (as you said). It will just take less and less programming effort, as everything will increasingly be put in common (well, as long as we have the internet and Free Software).
More and more, the path to freakishly good performance will be simple, elegant, and obvious code. Algorithms will still matter, but you will hide most of them behind libraries. And you certainly won't do micro-optimizations.
If the compiler is not enough, you can have a team rewrite your hot spots using computer-assisted, semantics-preserving transformations, in a tiny fraction of the effort it took you to write the elegant code in the first place. Should you need to modify this code, no problem: just re-run the set of optimizations they devised. Worst case, the compiler tells you it can no longer apply the specialized optimization to your new code, leaving you with slow code until the optimization team amends its optimizations.
Sure, but none of this points to high-performance languages being a shrinking niche. They will continue to dominate low-level code and library code.
Less low-level code will be written each year than high-level code, but it'll still be tremendously important. And as machines change, the underlying libraries will keep changing.
On the one hand, we have required effort. On the other hand, we have impact. To take an extreme example, Web browsers are a tiny niche in terms of development effort. Their impact however is something else entirely.
This is what I predict with low-level stuff. It will grow in terms of impact, but shrink in terms of development effort (at least in relative terms).
> as machines change, the underlying libraries will keep changing.
Increasingly, no they won't. What will change are the optimization techniques we need to apply to otherwise clean, elegant, obvious, and slow code.
Take a C library for instance. If it is written in a portable fashion, you only need to write a new C back-end to port that library to another machine. The same goes for semi-manually optimized code. You don't need to change the specification (I mean, the source code) to port the thing to another platform. You only need to change the optimization strategy. It's still an effort, but that's much less effort than a complete rewrite.
> It is still the dominant language for large-scale low-level systems.
He didn't say it wasn't dominant. He asked why it should be considered the best. Dominance is a matter of legacy, and you have to take into account a bunch of reasons that have nothing to do with how good a language is to assess why that's so.
Being the best tool is something completely different. I think the commenter has a good point and I think it's clear that a lot of the areas C++ has been used for have been taken over by better languages that offer more without conceding much. In that regard, he is spot on in saying that C++'s niche is indeed shrinking.
C++, like most other languages, is the best tool for some problems and areas, but those are growing smaller and smaller as we identify exactly what we need and create new languages to escape the rest of C++.
> I think it is rather embarrassing that most code written in modern languages performs worse on today's computers than native code running on 1980s machines.
Ok, let's play ball.
> After the Cray-1, the Cray Corporation developed another giant named Cray-2. It remained the world’s fastest Supercomputer between the years 1985 to 1989, capable of performing 1.9 gigaflops.
( Source: http://infotology.blogspot.be/2012/04/super-computer-timelin... )
So that's the fastest of the fast in the '80s. According to Intel, an i7-3770K consumer CPU gets 112 GFLOPS, and a Q6600 quad core, which was released in Q1 2007, gets 38.40 GFLOPS. That's a factor of 58+ for the i7, or a factor of 20 for the Q6600.
I picked the x86 quad-core benchmark config because it had the highest peak. Also, take these languages, which are more commonly used (slowdown factors relative to the fastest native code):
* C#/Mono: Worst 9x, median 2x
* Go: Worst 7x, median 3x
* Haskell: Worst 3x, median 2x
* Java: Worst 3x, median 2x
* PHP: Worst 109x, median 36x
* Python 3: Worst 129x, median 37x
* Ruby: Worst 239x, median 53x
* JRuby: Worst 115x, median 34x
But keep in mind that:
* Current CPU integer performance, cache speeds, memory bandwidth and general I/O would completely destroy the Cray-2
* GFLOPS are a really bad indicator of how fast a computer is. Current computers are even faster. Current GPUs are floating-point optimized and run circles around any general-purpose CPU when it comes to GFLOPS.
* These are CPU-bound artificial language comparison benchmarks.
* Native code means nothing. Bad, slow code will always be slow, whatever language you use. Some of the languages included compile to 'native code' (Go and Haskell), and a lot of the interpreted/bytecode-compiled languages use a JIT to generate 'native code' too.
* We don't take into account the Cray-2's bottlenecks (mainly I/O)
* All GFLOPS numbers are according to the manufacturer.
* Most applications hit I/O limits before hitting CPU limits (yes, there are exceptions)
* This is comparing with the fastest of the fast you could get at the end of the '80s
So in worst theoretical cases, sure. In the real world? Not even close.
> I think it is rather embarrassing that most code written in modern languages performs worse on today's computers than native code running on 1980s machines.
Are there any sources to support this statement, or are you drawing on personal experience?
I don't consider it an easy thing to obtain an improvement by switching to C++ from (or avoiding) a high-level language that gives me useful abstractions and spares me bookkeeping.
Yes, if the thing is just -slow-, optimize it. But never start out with "We can't have this thing be slow; let's write it in C++!" unless you already have some metrics showing that it's -too- slow in a safer language.