
The problem with the concept of "the singularity" is that it has a hidden assumption that computation has no relationship to energy. Which, once unmasked, is a pretty outlandish claim.

There is a popular illusion that technological progress is a pure function of human ingenuity, and that the more efficient we make our technology, the faster we can make even better technology. But the history of technology has always been the history of energy usage.

Prior to the emergence of Homo sapiens, "humans" learned to cook food by releasing the energy stored in wood. Cooking is often considered a prerequisite for the development of the massive, energy-consuming brain of Homo sapiens.

After that it took hundreds of thousands of years for Earth's climate to become stable enough to make agriculture feasible. We see almost no technological progress until we start harvesting enormous amounts of solar energy through farming. Not long after this we see the development of mathematics and writing, since humans now had surplus energy to spend on other things.

You can follow this pattern through the development and extraction of coal, oil, etc. You can look at the advancement of technology in the last 100 years alongside our use of fossil fuels and the expansion of energy capabilities with renewables (which have historically only been used to supplement, not replace, non-renewables).

Technological progress has always been a function of energy, and, going all the way back to cooking food, computational/cognitive ability in particular demands increasingly high energy consumption.

All evidence seems to suggest that we increasingly need more energy for incrementally smaller returns on computation.
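A rough sanity check with round numbers: thermodynamics does set a hard floor on the energy per bit (the Landauer limit), and the ~1 pJ/op figure for current accelerators below is an assumed ballpark, not any specific chip's spec:

    import math

    # Back-of-envelope: thermodynamic floor vs. rough current hardware.
    k_B = 1.380649e-23                # Boltzmann constant, J/K
    T = 300.0                         # room temperature, K
    landauer = k_B * T * math.log(2)  # min energy to erase one bit, ~2.9e-21 J
    print(f"Landauer limit at 300 K: {landauer:.1e} J/bit")

    energy_per_op = 1e-12             # assumed ~1 pJ per op for a modern accelerator
    print(f"Headroom above the floor: ~{energy_per_op / landauer:.0e}x")

Eight-plus orders of magnitude of theoretical headroom, yet the practical efficiency curves have been flattening since Dennard scaling ended, which is the smaller-returns part.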

So for something like the singularity to happen, we would also need incredible changes in available energy (there's a more nuanced argument that you additionally need smooth energy gradients, but that's more discussion than necessary). Computation is not going to rapidly expand without requiring tremendously large increases in energy.

Further, it's entirely reasonable to think there is some practical limit to just how "smart" a thing can be, based on the energy requirements to get there. That is, you can't reasonably harvest enough energy to create intelligence on the level we imagine (the same way there is a limit to how tall a mountain can be on Earth due to gravity).

As with most mystical thinking, ignoring what we know about thermodynamics tends to be a fundamental axiom.



There are hard limits on how much energy we can provide to computation, but we are not even close to what we can do in a non-suicidal way. In addition to expanding renewables, we could also expand nuclear and start building thorium reactors; this alone ensures at least an extra order of magnitude in capacity compared to uranium.
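Very rough numbers behind that claim (crustal abundances and burn-up fractions are approximate, and note this compares a breeder cycle against once-through uranium):

    # Sanity check on the "extra order of magnitude" claim.
    th_ppm = 9.6            # thorium in continental crust, approx. ppm
    u_ppm = 2.7             # uranium in continental crust, approx. ppm
    u235_fraction = 0.0072  # fissile fraction of natural uranium

    once_through_u = u_ppm * u235_fraction  # only the U-235 gets fissioned
    th_breeder = th_ppm * 1.0               # a breeder converts nearly all Th-232
    print(f"Th breeder vs once-through U: ~{th_breeder / once_through_u:.0f}x")

(A uranium fast breeder would close most of that gap too; thorium's abundance edge alone is only ~3-4x.)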

As for the compute side, we are running inference on GPUs that were designed for training. There are enormous inefficiencies in data movement on these platforms.
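A roofline-style sketch of that inefficiency; the hardware figures are hypothetical round numbers, not any particular GPU's spec:

    # Why decode-time inference is memory-bound on training-class GPUs.
    peak_flops = 1e15   # FLOP/s, assumed dense fp16 throughput
    peak_bw = 3e12      # bytes/s, assumed HBM bandwidth
    machine_balance = peak_flops / peak_bw  # FLOPs per byte to stay compute-bound

    # A matrix-vector multiply does ~2 FLOPs (multiply + add) per 2-byte
    # fp16 weight it reads: arithmetic intensity of ~1 FLOP/byte.
    matvec_intensity = 2 / 2
    print(f"Machine balance: ~{machine_balance:.0f} FLOP/byte")
    print(f"Best-case matvec utilization: ~{matvec_intensity / machine_balance:.1%}")

With those assumed numbers the arithmetic units sit ~99.7% idle during single-stream decoding, which is why batching and inference-specific silicon matter so much.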

If we play our cards right we might have autonomous robots mining lunar resources and building more autonomous robots so they can mine even more. If we manage to bootstrap a space industry on the Moon with primarily autonomous operations and full ISRU (in-situ resource utilization), we are on our way to building space datacenters that might actually be economically viable.

There is a lot of stuff that needs to happen before we have a Dyson ring or a Matrioshka brain around the Sun, but we don't need to break any laws of physics for that.


> we are not even close to what we can do in a non-suicidal way.

I'm honestly not sure how anyone can be remotely aware of the other major consequence of our energy consumption, climate change, and make this statement.

We're far more likely to be extinct in 100 years than to see any of your sci-fi proposals come to fruition.

But I guess that gets to the point of all of this discussion: belief in a technological singularity is just a different flavor of the religious tools we have used to avert existential dread for thousands of years. You need to believe these things are true so there's nothing that can ever be said to convince you otherwise.


There are several books which explore this concept, viewing history through the lens of energy systems available to and utilised by humans.

Vaclav Smil has written two of these, Energy and Civilization (2017) and Energy in World History (1994). They cover much the same ground, though with different emphases.

<https://vaclavsmil.com/book/energy-and-civilization-a-histor...>

<https://vaclavsmil.com/book/energy-in-world-history/>

Manfred Weissenbacher's Sources of Power (2009) more specifically addresses political and military implications of different power systems.

<https://www.bloomsbury.com/us/sources-of-power-9780313356261...>

In the past year a new book on the topic has appeared: Energy's History: Toward a Global Canon, by Daniela Russ and Thomas Turnbull, though I've yet to read it.

<https://www.sup.org/books/politics/energys-history>

There's a review here: <https://networks.h-net.org/group/reviews/20131545/priest-rus...>.


Don't forget the practical ability to dissipate waste heat, on top of producing energy. That puts an upper limit on all energy use unless we decide boiling ourselves is fine, or find a way to successfully ignore thermodynamics, as you say.
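Rough planetary numbers on that limit (round figures throughout):

    import math

    # When does waste heat itself become a climate forcing, independent of CO2?
    solar_constant = 1361.0   # W/m^2 at top of atmosphere
    albedo = 0.3
    earth_area = 5.1e14       # m^2
    absorbed = solar_constant / 4 * (1 - albedo) * earth_area  # ~1.2e17 W

    energy_use = 2e13         # W, roughly today's primary energy use (assumed)
    print(f"Waste heat today: ~{energy_use / absorbed:.3%} of absorbed sunlight")

    # Doublings of energy use until waste heat reaches 1% of absorbed sunlight:
    print(f"Doublings to 1%: ~{math.log2(0.01 * absorbed / energy_use):.1f}")

About six doublings; at a steady ~2% annual growth that's only a couple of centuries away, even with perfectly clean sources.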


If we ever get that far, that would be the most compelling argument for datacenters in space.


Heat rejection is far more challenging in vacuum.
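The Stefan-Boltzmann numbers make the point; emissivity and temperature below are assumed typical values:

    # Radiator area needed to reject heat in vacuum, by radiation alone.
    sigma = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
    emissivity = 0.9   # assumed, a good radiator coating
    T_rad = 300.0      # K, an electronics-friendly radiator temperature

    flux = emissivity * sigma * T_rad**4  # W per m^2, one-sided
    print(f"Rejected flux at {T_rad:.0f} K: {flux:.0f} W/m^2")  # ~413 W/m^2
    print(f"Radiator area for 1 MW of waste heat: ~{1e6 / flux:.0f} m^2")

Running radiators hotter helps a lot (the T^4 term), but then the electronics have to tolerate the higher temperature.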


We can always build a sunshade at the Earth-Sun L1 point. Make it a Sun-facing PV panel with radiators pointing away from us, and we can power a lot of compute there (useful life might be limited, but compute modules can be recycled and replaced, and nothing needs to be launched from Earth in this case).
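Back-of-envelope on the power budget (PV efficiency and radiator figures are assumed round numbers):

    # Power and radiator budget per square meter of an L1 sunshade.
    sigma = 5.670e-8
    solar_flux = 1361.0   # W/m^2 near Earth-Sun L1
    pv_eff = 0.2          # assumed PV efficiency

    electric = solar_flux * pv_eff           # W of compute power per m^2 of shade
    radiator_flux = 0.9 * sigma * 300.0**4   # ~413 W/m^2 at 300 K, emissivity 0.9
    print(f"Electric power: ~{electric:.0f} W per m^2 of shade")
    print(f"Radiator area: ~{electric / radiator_flux:.2f} m^2 per m^2 of shade")

So the radiators stay smaller than the shade itself, and every watt spent on compute there is a watt of sunlight kept off Earth.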



