The first link is to a paper published by Google with findings on the advantages of different languages from a technical standpoint. I found the paper interesting enough to read in its entirety, but if you're looking for a tl;dr, this link (http://www.computing.co.uk/ctg/news/2076322/-winner-google-language-tests) does a pretty good job.
Aaaaand if that's still too much for you, here's a tl;dr for that:
C++ is the best in performance but requires the greatest amount of tuning, most of which would be impossible for the average programmer to do.
Java was the simplest to implement but also the most difficult to analyze for performance.
Scala was the best for optimization of code complexity due to "powerful language features."
Go, a relative infant compared to the above, pretty much struck out all around. It still has a long way to go before it's well optimized for binary size or performance.
Actually, these tests are a little flawed, because they spent much more time and effort optimizing the C++ code than the other benchmarks. I'm almost sure the guys from Azul Systems (who wrote their own JVM, which outperforms Oracle's stock JVM) could speed up the Java and Scala benchmarks by a large margin. Nevertheless, they probably still wouldn't beat C++ until the JVM gets full support for stack allocation of small objects and for inlining megamorphic virtual calls. Currently the JVM supports only monomorphic and bimorphic inlining, which is quite nice, but not enough for lambda-heavy languages like Scala.
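For anyone wondering what mono-/bi-/megamorphic means in practice, here's a minimal Java sketch (the Shape types are mine, just for illustration):

```java
interface Shape {
    double area();
}

class Circle implements Shape {
    public double area() { return 3.14159; }
}

class Square implements Shape {
    public double area() { return 1.0; }
}

class Triangle implements Shape {
    public double area() { return 0.5; }
}

public class Inlining {
    // With only Circle ever reaching this call site, HotSpot inlines
    // area() directly (monomorphic). With Circle and Square it can still
    // inline both behind a type check (bimorphic). Once Triangle shows up
    // too, the site goes megamorphic: HotSpot stops inlining and falls
    // back to a true virtual dispatch. Lambda-heavy Scala code hits this
    // quickly, because every closure is its own class hiding behind the
    // same FunctionN interface.
    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) {
            sum += s.area(); // the polymorphic call site
        }
        return sum;
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Circle(), new Square(), new Triangle() };
        System.out.println(total(shapes));
    }
}
```

Running with -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining lets you watch HotSpot's inlining decisions at that call site change as more receiver types appear.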
Anyway, this is probably the first benchmark that got memory measurements right.
And Scala coming out faster than Java is pretty surprising...
Nothing on C#?
or D?
As far as I know, Google doesn't use them, and doesn't plan to use them in the future. So why bother benchmarking them?
The internal debate, it appeared from the paper, was heated.
So, the optimizations these guys ran are the optimizations available to motivated Google engineers.
In other words, the test tests real programmers/staff in a real company, with real tools available to them. Chances are, if you are in a small company, or an amateur programmer like me, you will have no better resources available than the ones described in the paper.
This paper makes me happy that, somewhat randomly, I picked C++ when I started the program I have been writing for the last 3 years.
"In other words, the test tests real programmers/staff in a real company, with real tools available to them."
True, but the problem they solved was artificial and quite small. There is a huge difference in how far you can optimise a 100-line code snippet compared to a 100,000-line program. For the latter, there is usually not enough time or budget to do more than fix the most obvious bottlenecks. There is also a risk of breaking things.
In a real-world project, you usually aren't allowed to perform deep C++ optimisations, e.g. custom memory allocators, hand-crafted SSE, cleverly rewriting dynamic dispatch into templates, or making things multithreaded. Without them, most C++ code performs very similarly to Java code, or even slower, once you take into account the virtual-dispatch- and parallelism-related optimisations at which the JVM beats almost every C++ compiler.
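To make the virtual-dispatch point concrete, here's a minimal Java sketch (the Codec/Fast/Slow names are mine, purely for illustration) of the kind of call site where a profiling JIT has an edge over ahead-of-time compilation:

```java
import java.util.ArrayList;
import java.util.List;

interface Codec {
    int encode(int x);
}

class Fast implements Codec {
    public int encode(int x) { return x << 1; }
}

class Slow implements Codec {
    public int encode(int x) { return x * 2; }
}

public class Devirt {
    // A C++ compiler looking at the equivalent loop would typically emit
    // an indirect (vtable) call, because a Slow *could* reach it. HotSpot
    // instead profiles the call site at runtime, observes that only Fast
    // ever arrives, inlines encode() directly, and leaves a cheap guard
    // so it can deoptimize if a Slow shows up later.
    static long run(List<Codec> codecs, int n) {
        long sum = 0;
        for (Codec c : codecs) {
            for (int i = 0; i < n; i++) {
                sum += c.encode(i);
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        List<Codec> codecs = new ArrayList<>();
        codecs.add(new Fast()); // only Fast ever reaches the hot loop
        System.out.println(run(codecs, 1_000_000));
    }
}
```

A static compiler would have to prove at compile or link time that no Slow can ever reach that loop; the JIT just watches what actually happens and bets on it.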