Energy Efficiency across Programming Languages (2017) [pdf]
I see similar postings come by on LinkedIn now and then. Although there is some value in the benchmark results, the way they are promoted is BS.
If your company's business really is running algorithms similar to binary search or the n-body problem, you should probably not be asking which language is most energy efficient, but which library or framework suits you best. E.g. you can perfectly well use Python + Pandas or NumPy even though everybody will agree that Python in itself is terribly inefficient. If you really want to go to the extreme in saving energy you should go all the way and code your algorithms in assembly. But it's obvious why this is probably not a very wise choice.
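A minimal sketch of that point, using NumPy (the array sizes and values are arbitrary, chosen only for illustration): the "terribly inefficient" interpreted loop and the library call compute the same thing, but the library call runs the hot loop in compiled C.

```python
# Sketch: the same dot product in pure Python vs. NumPy.
# The point is that "Python is inefficient" stops applying once the
# hot loop runs in optimized native code instead of the interpreter.
import numpy as np

def dot_pure(xs, ys):
    # Interpreted loop: one bytecode dispatch per element.
    total = 0.0
    for x, y in zip(xs, ys):
        total += x * y
    return total

def dot_numpy(xs, ys):
    # Same arithmetic, executed in compiled C inside NumPy.
    return float(np.dot(np.asarray(xs), np.asarray(ys)))

xs = [1.0, 2.0, 3.0, 4.0]
ys = [5.0, 6.0, 7.0, 8.0]
assert dot_pure(xs, ys) == dot_numpy(xs, ys) == 70.0
```

On large arrays the NumPy version is typically orders of magnitude faster (and, by the time-energy argument elsewhere in this thread, correspondingly cheaper in joules), which is exactly what the study's pure-Python benchmark code does not capture.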
Everybody who knows a bit about IT will understand this, but the danger is that some project managers will read one of these papers and draw the conclusion that the team should switch to C or Rust because it is more sustainable ..
Please, for the love of God, can we ignore this useless paper? It has no new insights. It simply takes the Benchmarks Game and concludes that the solutions that are fastest also consume the least energy.
Benchmarks Game is a fun exercise to kill some time on, but it doesn't indicate anything about anything. The solutions across different languages don't even have the same algorithmic complexity. How can these possibly be compared?
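To make the complexity objection concrete, here is a hypothetical sketch (the task and operation counts are illustrative, not taken from the Benchmarks Game): two correct solutions to the same problem with different algorithmic complexity. If one language's entry uses the quadratic approach and another's uses the linear one, the measured "language" difference is really an algorithm difference.

```python
# Two correct duplicate checks; ops counts the basic comparisons,
# so the complexity gap is visible without any timing noise.
def has_duplicate_quadratic(xs):
    ops = 0
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            ops += 1
            if xs[i] == xs[j]:
                return True, ops
    return False, ops

def has_duplicate_linear(xs):
    ops = 0
    seen = set()
    for x in xs:
        ops += 1
        if x in seen:
            return True, ops
        seen.add(x)
    return False, ops

data = list(range(1000))  # no duplicates: worst case for both
_, quad_ops = has_duplicate_quadratic(data)
_, lin_ops = has_duplicate_linear(data)
assert quad_ops == 1000 * 999 // 2  # 499500 comparisons
assert lin_ops == 1000              # one pass
```

A 500x difference in work per run dwarfs any plausible language-level constant factor, which is why comparing entries with different algorithms says little about the languages themselves.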
The code from this study is here: https://github.com/greensoftwarelab/Energy-Languages
From looking at the code snippets, a big issue with this study becomes clear - it doesn't reflect how languages like Python are used in practice.
In practice, the "hot loops" of Python are in c/c++/fortran/cython/numba/... i.e. Python code usually makes use of a vast ecosystem of optimized science/maths/data science libraries. Whereas the study code is mainly using pure Python.
It's an issue with the methodology that the programs are written specifically for this study; creating an artificial situation.
It’s crazy how good Java is at energy efficiency. It is criminally underhyped and is sometimes replaced by the newest best PL that will change everything, yet the replacement is very unlikely to get better results beyond the gains inherent in any rewrite.
Also, the often-claimed negative of Java, memory overhead, is relevant here: a GC'd language operates best with a deliberate overhead beyond the strictly necessary memory. Java’s GCs are quite “lazy” in that they will not collect unused objects until it is deemed necessary, which is in line with energy efficiency.
Bitcoin Script[1] is probably the least efficient, then languages like Solidity[2] targeting Ethereum's EVM.
[1] https://en.bitcoin.it/wiki/Script
A colleague looked into improving the energy efficiency of software as an academic project. They concluded that it's essentially the same problem as optimisation. Making the program/task finish sooner was by far the biggest contributor to energy saving, so the techniques for reducing energy consumption are those of program optimisation.
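The first-order model behind that conclusion can be sketched as follows. The 65 W figure is an assumed example, not a measurement: if the machine draws roughly constant power while busy, then E = P x t, and halving runtime halves energy.

```python
# Sketch of the "energy ~ power x time" model: under roughly constant
# power draw, making the task finish sooner saves energy in proportion.
import time

AVG_PACKAGE_WATTS = 65.0  # assumed constant draw while the task runs

def estimated_joules(fn, *args):
    start = time.perf_counter()
    fn(*args)
    elapsed = time.perf_counter() - start
    return AVG_PACKAGE_WATTS * elapsed  # E = P * t

def slow_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total

def fast_sum(n):
    return n * (n - 1) // 2  # closed form: no loop at all

# Both are correct; the optimized version finishes sooner, so under
# this model it also uses less energy.
assert slow_sum(10_000) == fast_sum(10_000)
```

This is also why "which language is fastest" and "which language uses least energy" produce nearly the same ranking in the paper: to first order they are the same question.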
Related:
Energy Efficiency across Programming Languages [pdf] - https://news.ycombinator.com/item?id=24642134 - Sept 2020 (158 comments)
Energy Efficiency Across Programming Languages - https://news.ycombinator.com/item?id=21950341 - Jan 2020 (1 comment)
Energy Efficiency Across Programming Languages (2017) [pdf] - https://news.ycombinator.com/item?id=19618699 - April 2019 (1 comment)
Energy Efficiency Across Programming Languages - https://news.ycombinator.com/item?id=15249289 - Sept 2017 (139 comments)
Interesting, but I'd be more interested in the holistic, total energy cost of a particular language choice. Only where the program is highly compute-bound and runs for many parallel instance-hours would an analysis like this help determine the best options — e.g. the language used for a widely-used spreadsheet program. In a more 'real life' case, the energy costs of developer-hours also need to be amortised (their dev machines, their coffees & pizzas, the light and heating of their home/office, maybe travel to the office etc.), which is much more conditioned by the productivity of the language for the problem domain. In many cases, the production energy use of a given language is minuscule in that totality, so only where it is a significant component should it be part of language selection criteria.
I think the answer for efficiency in programming is to reduce the total number of servers, data centers and bandwidth required to deliver the solution.
Picking a programming language has virtually no bearing on this. There are certainly some languages that can run faster in some scenarios, but these microbenchmarks have little relevance to any practical reality at scale.
For me, I like to go through hell instead of around it. Don't try to make one server sip the power obsessively. Make that one server do as much as possible, since you are already paying a fixed, idle power cost.
I think picking things like SQLite vs SQL Server have a much bigger impact on power consumption. These are less disruptive choices than swapping programming languages as well.
It seems like a legitimate problem that JavaScript, probably one of the languages with the most code executed worldwide, is down toward the bottom.
I honestly wonder how much time and energy capacity could be freed up world wide if we just used tools optimized for hardware and network performance and efficiency.
For most businesses, a good chunk of their compute costs are energy costs, directly or indirectly (eg. via cooling, or via paying for cloud hosting from someone else who pays the energy bill).
Therefore, the real question becomes: Which language will reduce my compute budget most?
Profiling code optimisation in joules is a fun and somewhat unusual task. Surprisingly it has come up both for the high end in a supercomputing project and at the very low end embedded. Having an execution budget in handfuls of joules is just weird, and fun.
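On Linux, one real way to do that profiling is the RAPL powercap sysfs interface. The paths below exist on typical Intel/AMD Linux systems, but availability (and read permissions) varies, so treat this as a sketch; the delta helper handles the microjoule counter wrapping back to zero at `max_energy_range_uj`.

```python
# Sketch of joule-level profiling via Linux's RAPL powercap sysfs files.
RAPL = "/sys/class/powercap/intel-rapl:0"

def read_uj(path=RAPL + "/energy_uj"):
    # The file contains a wrapping cumulative microjoule counter.
    with open(path) as f:
        return int(f.read())

def joules(before_uj, after_uj, max_range_uj):
    # Compute the consumed energy, accounting for counter wraparound.
    delta = after_uj - before_uj
    if delta < 0:
        delta += max_range_uj
    return delta / 1e6

# Usage (requires the RAPL sysfs files, often root-readable only):
#   before = read_uj(); run_workload(); after = read_uj()
#   max_range = read_uj(RAPL + "/max_energy_range_uj")
#   print(joules(before, after, max_range), "J")
assert joules(1_000_000, 3_500_000, 2**32) == 2.5
assert joules(2**32 - 500_000, 500_000, 2**32) == 1.0  # wrapped
```

Sampling the counter before and after a run gives you the package-level energy budget in joules directly, which is exactly the kind of measurement the paper's tooling is built around.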
I've seen some comments on Twitter (like this one [0]) saying that the complexity of the programs in different languages is not the same and that this impacts the results. Is someone willing to expand on this for me (since I don't really know any language here other than Python)?
Bonus: here's "Ranking Programming Languages by Energy Efficiency", from 2021 [1]
[0]: https://twitter.com/Czaki_PL/status/1569636020475265025
PHP looks horrific for energy use although it would have been 5.x in 2017. Think about all those PHP websites out there spending all day serving requests to bots and brute forcers. That's a lot of energy.
But be careful to not conclude that we ought to use more efficient languages to save resources.
Remember Jevons' Paradox: efficiency leads to higher total consumption, because that which becomes more efficient gets used more, which in turn increases overall consumption of the underlying resource.
Surely we can just short-circuit this by running our apps on servers powered by sustainability-focused renewable energy sources.
JavaScript and TypeScript yield different results. There could be reasons like transpilation for backwards compatibility, but that's optional; otherwise the code should be identical if written for performance, since TypeScript's types are erased at compile time.
Note: this is from 2017.