The pace of major breakthroughs has declined (ucsd.edu)
109 points by huac on Oct 11, 2015 | 127 comments


> "Should we succeed at controlled, sustained, net-positive fusion, we would qualify it as a new face at the table. I might characterize it as the most expensive way to create electricity ever devised (and electricity is not the hard nut to crack)."

It's not even been developed, yet the author is convinced it will be 'the most expensive way to create electricity ever devised'.

The article is riddled with statements like that, which twist history to serve the author's narrative. In the list of developments unfamiliar to a person from the 19th century we have the toaster and the blender, listed as if these were modern marvels. In comparison, the push into space over the last 65 years is downplayed, as if the fact that we aren't all taking space holidays is some indication that the field has been stagnant. Whilst it's a shame there haven't been any manned missions to moons and planets since the 70s, there's been a ton of development in the field... satellites, telescopes, space probes, space stations, etc...

I'd also suggest that a lot of the developments in technology over the past 65 years have been too small to see. Semiconductor manufacturing, bio sciences, etc... Does it really matter if developments aren't immediately obvious to the eye?


If he's only looking at ITER he might have a point. He's ignoring much cheaper possibilities, ranging from conservative but more compact approaches like MIT's ARC design and UW's dynomak, to aneutronic designs like Tri-Alpha and Helion.

UW claims the dynomak would cost less than coal: http://spectrum.ieee.org/energy/nuclear/inside-the-dynomak-a...

Aneutronic fusion has the potential to be far cheaper than any other energy source.


I made some calculations.

The total cost of ITER is $14 billion (and going up :) ). The total production of electricity in the world is 20,000 TWh/year.

If we magically switch to 100% tokamak fusion production, and we want to recoup the investment in one year, it would add 0.7 $/MWh to the production (and distribution) cost.

A more realistic scenario is to generate only the mythical 1% using tokamak fusion reactors, and recoup the money in 10 years. With this modification the R&D cost is 7 $/MWh.

Just for comparison, the current cost of electricity is 50-150 $/MWh. So 0.7-7 $/MWh is not too much.

Anyway, the ITER project is not even close to making a functional power generator, so the final cost may be much bigger than $14 billion.
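
For anyone who wants to check the arithmetic, here's the back-of-the-envelope as a minimal Python sketch, using only the figures quoted above:

  # Figures quoted above: $14B total ITER cost, 20,000 TWh/year world production.
  iter_cost_usd = 14e9
  world_mwh_per_year = 20_000 * 1e6  # 1 TWh = 1e6 MWh
  # Scenario 1: 100% fusion, recoup the investment in one year.
  print(iter_cost_usd / world_mwh_per_year)  # ~0.7 $/MWh
  # Scenario 2: 1% fusion share, recoup over 10 years.
  print(iter_cost_usd / (0.01 * world_mwh_per_year * 10))  # ~7 $/MWh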


Sure but it's not mainly the R&D, it's the capital cost of such large, complex reactors. MIT's ARC, to take the most conservative alternative, is also a tokamak, but with much stronger superconductors it can be built a lot smaller and cheaper.

MIT has a lot of experience in this area since they already have a tokamak with the strongest magnetic fields of any in the world. They've made at least one fundamental breakthrough with it, not too long ago. Of course Congress is about to cancel their funding.


Lockheed will probably lap ITER within the next few years: https://en.wikipedia.org/wiki/High_beta_fusion_reactor

Let's not forget that the end-game for fusion is "virtually limitless, safe, clean, carbon-neutral, very high-density power"--a set of characteristics literally none of our other sources of electricity have.

The author also neglects all of the scientific advancements that we've made but haven't yet transmuted into engineering advancements. Graphene, nanotubes, metamaterials, etc.


LENR, too, which the US Navy and NASA seem to be considering:

http://www.lenr-forum.com/forum/index.php/Attachment/386-IEE...


LENR has never recovered from the shellacking Koonin and Lewis gave it.


It just needs a change of the name or the initials and it's good to go..


> It's not even been developed, yet the author is convinced it will be 'the most expensive way to create electricity ever devised'.

Sounds to me like the author is right in a sense. A lot of money has been going into fusion for many, many years now. If you look at the bottom line you're going to need to be generating a LOT of electricity from fusion before you're ever going to break even.

I'm not saying that this is the right way of looking at it, financial gains aren't everything, after all. But fusion IS very expensive electricity.


> A lot of money has been going into fusion for many, many years now.

Maybe the ~$29 billion [1] the US has spent on fusion in total since 1953 seems like a lot from a personal perspective, but set it against the modern cost of $3.5 billion to build 1 gigawatt of generation capacity [2] at a coal power plant, and the claim that fusion is unreasonably expensive and impracticable doesn't hold water. Compare fusion research to other broad-spectrum research projects that benefit broad swaths of mankind and you see a remarkably different attitude and perception. One cancer research foundation in the UK raised 460 million pounds in 2012/13 [3], more than twice the total international spending on ITER in 2014 [4]. There is clearly a double standard when it comes to comparing the results of fusion R&D with other long-term efforts that still fail to offer satisfactory solutions after decades of concerted effort.

[1] http://focusfusion.org/index.php/site/reframe/wasteful/ [2] http://schlissel-technical.com/docs/reports_35.pdf [3] http://www.cancerresearchuk.org/prod_consump/groups/cr_commo... [4] (page 20) https://www.iter.org/doc/all/content/com/Lists/depts/Attachm...
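
As a rough sanity check on those numbers, a minimal sketch using only the figures cited above:

  # ~$29B total US fusion spending since 1953 [1] vs. ~$3.5B per GW of coal capacity [2].
  fusion_total_usd = 29e9
  coal_usd_per_gw = 3.5e9
  # Six decades of fusion research cost roughly what building ~8 GW of coal plants would.
  print(fusion_total_usd / coal_usd_per_gw)  # ~8.3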


That's completely irrelevant to efficiency. As it stands now, it's objectively a terribly inefficient technology in terms of watts generated per dollar spent.


Using watts generated per dollar as a metric of success for fusion *research* (emphasis mine) is a false analogy. No commercial fusion power plant exists, yet you seem to want to hold research reactors constructed for the purpose of discovery to the same financial standard as operating coal power plants. When the first commercial fusion reactor is turned on, then we can bemoan capital costs using real data. Until then, don't compare apples and oranges.


> "But fusion IS very expensive electricity."

You're falling into the same trap as the article's author: you have no way of knowing that.

For example, this article on the dense plasma focus approach to nuclear fusion shows the costs are far below that of other non-nuclear sources:

http://lawrencevilleplasmaphysics.com/cost-advantage-roi/

Even if you take into account the research costs (which I think is a mistake; the cut-off point is too hard to define. I mean, would you include R&D into petrol-based cars in the solar car R&D costs?), the level of funding that the dense plasma focus has had to date has been relatively small, so it could recoup those costs quickly. You can get a sense for the investment costs involved in one dense plasma focus project here (though admittedly this doesn't include all similar research projects, of which there are a few):

http://www.integrityresearchinstitute.org/FutureEnergy/Focus...

So you see, because you have no way of knowing which fusion approach would win out, you cannot say with any certainty that it is expensive electricity.


You also have no way to know whether these 0.5 ct/kWh are correct. And that article is actually sales material from the plasma fusion company. The Wikipedia article on this topic suggests that no progress has been made since 2012.


> "You have also now way to know if these 0.5ct/kWh are correct. And that article is actually sales material from the plasma fusion company."

The article I linked to contained an estimate. The point I'm making is that nobody knows what the final cost will actually be. If you want to play guessing games then you should at least look at what information is available to base that guess on.

> "The wikipedia article on this topic suggests that no progress has been made since 2012."

Progress has been made since 2012, though the progress has been slower than it could've been as there's been a lack of funding. This year the focus appears to have been on improving one of the key parts. The team update their website fairly frequently; the last news update about the project was made less than a month ago:

http://lawrencevilleplasmaphysics.com/news-and-archives/lpp-...


> A lot of money has been going into fusion for many, many years now

Sure, but if you're going to holistically include the cost of the science in the comparison price per kWh, you should also balance it against the cost of dealing with the consequences of sustained carbon emissions from other types of power generation.


That's really not an unusual or surprising thing. Most products have research and development costs factored into their prices. It's the primary reason why bleeding edge stuff costs so much.

As far as carbon emissions go, it's hard to compare without knowing what byproducts fusion will generate.


> Sure, but if you're going to holistically include the cost of the science in the comparison price per kWh, you should also balance it against the cost of dealing with the consequences of sustained carbon emissions from other types of power generation.

Guess what? They all still beat fusion hands down. Think of all the cows that farted and then died to feed the researchers. Think of all the carbon emissions generated by the actual research. Think about the real, actual return we have today.

You seem to be banking on the fact that we actually CAN produce an efficient fusion reactor in the near future. The total efficiency of our investment is very, very, very dependent on how fast we can crank one out.

Now compare it to military investments and you might get somewhere....

EDIT: if you disagree, I'd obviously love feedback. It just seems silly to think that fusion is at all something we can depend on being able to pull out of our ass. The Manhattan Project was never a sure thing.


> "if you disagree, I'd obviously love feedback."

You've clearly made your mind up that pursuing nuclear fusion is a waste of money, there's no point in debating if you're not prepared to consider that the research could pay off.


I haven't made up my mind; I follow the dialectic of the conversation. Who the hell are you used to talking to?

The problem is there are NO ARGUMENTS bounding the possibility of the research paying off, and until you can provide one you're just going to talk past the other person.


Nuclear fusion is relatively well understood from a physics perspective, what remains is an engineering challenge. For the longest time most of the money spent on fusion reactor research has been plowed into a single approach. The good news is that at this point in time a wider variety of approaches are getting decent levels of funding.

That's the reality of the situation, we've gone from a one horse race to something closer to an eight horse race. There's a decent chance that the horse we were backing at first will get overtaken by some of the other entrants. If that is the case, some would argue we've been backing the wrong horse for the last 50 years. What it doesn't mean is that the race is over.

Again, it's important to understand the level of confidence when it comes to nuclear fusion. The science is relatively well understood; we've built plenty of working fusion reactors, and in fact some fusion reactors are so simple that high school students have built them. What remains is mostly an engineering challenge: working on various ways of triggering and controlling the plasma, shielding the machines from the forces that would interfere with their function, and working out the best way to capture the energy as an electric current. The sun proves you can get more energy out than you put in, hydrogen bombs prove the same, so the potential exists; we're just looking at how best to harness it.

Is there a chance this research won't pay off in, let's say, 20 years? A slim chance yes, but considering the enormous benefits that's a chance many groups who invest in nuclear fusion are willing to take.


Thorium reactors [0], Lightsail tech [1], and vertical helix-shaped wind turbines, throw in some Stirling Engine action [2].

Why aren't we past all this yet? Oh, right, politics.

  [0] https://en.wikipedia.org/wiki/Liquid_fluoride_thorium_reactor
  [1] http://www.lightsail.com/
  [2] http://science.howstuffworks.com/environmental/green-tech/remediation/slingshot-water-purifier2.htm


> It's not even been developed, yet the author is convinced it will be 'the most expensive way to create electricity ever devised'.

Counting R&D costs, plus the fact that it still doesn't work, implies an ROI that would make anyone agree.

I think "breakthroughs" (e.g. discontinuous breaks in research where research is leapfrogged) are slowing down. This doesn't imply scientific or technological advancements are slowing down, which I think you misread. All the author is arguing is that maybe we should reconsider investing our money in hugely expensive projects whose success is completely unbounded EXPECTING breakthroughs. This just seems rational.


> "Counting R&D costs, plus the fact that it still doesn't work, implies an ROI that would make anyone agree."

The work on nuclear fusion is broader than the tokamak approach you're alluding to. All other forms of fusion have had far lower levels of investment.


I'm not aware of any fusion reactor that has "broken even" with just collecting the energy investment. Until that happens, total efficiency for "nuclear fusion" is going to continue to plummet.


If some of the predicted timelines about excess energy generation from nuclear fusion turn out to be true then it'll work out to be cheaper than coal, even in the short-to-medium term after that goal is reached. It's too early to write such predictions off, unless you know something I don't know about the future.


Ahh, this is what I was looking for. Do you have a source for these predicted timelines? I'd like to dig in further. I wasn't even aware they existed.


Sure, if you cherry-pick what you consider "progress," you can manipulate the numbers to show we've slowed down our rate of progress.

Also, don't belittle "refinements." Sometimes it's figuring out how to make a single airplane fly that's the easy part -- it's "refining" that idea into a global network of safe air travel where the real hard work and innovation happens.

Anyway. This article is silly.


Not to mention that his dismissing computers as "interactive televisions", for example, is absurdly reductionist in an attempt to force his point.

Agreed, terrible article.


I thought that was the stupidest part of the article until I saw this:

"But consider that the amount of funding poured into medical research has skyrocketed in my lifetime, so that the progress per dollar spent surely is going down."

Complaining that research is becoming more expensive in _dollar terms_ is stacking the deck against pretty much any society that has _economic growth_ (read: every society).


This argument is actually correct when it comes to pharmaceuticals. The number of drugs discovered per $1B in (inflation-adjusted) research spending has dropped by a factor of 80 since the 1950s. Do a search on 'Eroom's Law' for the relevant papers.
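
A factor-of-80 decline over the ~60 years since the 1950s works out to a halving roughly every 9 years, which is how Eroom's Law is usually stated. A minimal sketch of that arithmetic, assuming only the factor-80/60-year figures above:

  import math
  # Drugs discovered per inflation-adjusted $1B fell ~80x over ~60 years.
  decline_factor = 80
  years = 60
  print(years * math.log(2) / math.log(decline_factor))  # ~9.5 years per halving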


Is that even an argument though? A better way to say the same thing is "Easy problems get solved first". Or another way to say it: "You aren't going to spend a billion on a problem if you can solve it for 10 million".

What we have here is a measurement problem. For example, looking at curing polio and curing AIDS as one issue is disingenuous; it tells us nothing about the complexity of the actual problem we are solving. To cure AIDS we may have to solve many hundreds of different problems, many of them usable in other applications.


Read the paper (https://dl.dropboxusercontent.com/u/85192141/2012-scannell.p...); it addresses these objections. The "easy problems" objection is just a form of the "no true Scotsman" fallacy, where you redefine difficult problems to be the ones you haven't stumbled across a solution for.


Uh, isn't that what makes a difficult problem... difficult? I don't think there's any redefinition going on. I'm not sure I've ever heard somebody say, "this is really easy, I can't figure it out."


The thing is we don't figure out drugs, we find them by inspired accident. The attempt to move to a "let's figure it out" approach with computational methods seems to be what's making us worse at drug discovery.


> A better way to say the same thing is "Easy problems get solved first".

This is a fallacy, and undecidable. We can't evaluate the "hardness" of the problem or it would be solved already. Polio was WORTH CURING. It's not entirely clear fusion isn't just a pipe dream when we are already investing (and getting great returns) in renewable energy. Is it worth it? It's impossible to say (hello Hume), but we should at least be attempting to put timelines on solid goals, to evaluate how shitty we are at guessing how much we should invest given what we understand.


> We can't evaluate the "hardness" of the problem

Yes we can. Cure polio, attempt to follow the same steps to cure HIV; if it doesn't work, it is a harder problem. It is easy to determine the minimum hardness of a problem by attempting the solutions we already know. What you are confusing is the maximum potential hardness of a problem.


We can vaccinate people against polio, but not against HIV.

We can effectively treat AIDS, but not polio.

So which is the harder problem?


> Complaining that research is becoming more expensive in _dollar terms_ is stacking the deck against pretty much any society that has _economic growth_ (read: every society).

Sure, but try quantifying a return on medical research investment. It makes much more sense in energy production.


Next step: the Internet is a glorified telegraph.


As an example, today's cars are radically different beasts compared to what we had in 1950 (or even 1980). They're safer, faster, lighter, and full of subtle drive-by-wire algorithms and technology. And there's a huge amount of science and engineering advancement that made that possible.

While I agree with his broad point that in-your-face radical advances are on the decline, there's a lot going on under the hood that's easy to miss. Taking that to an absurd extreme, if we all suddenly became immortal, it would be a huge advance, but we wouldn't look any different.


I agree with your points. However, I also feel that lately, the term "high-tech" has been subject to inflation. For example, nowadays, it seems that anybody who does something a little out of the ordinary with CSS is considered a "high-tech" worker. Or a company that basically runs a website that brings together service and demand is considered "high-tech".


Have to agree. AirBnB and Uber are probably the biggest non-tech companies masquerading as tech companies. I can reserve a room on Hilton's website; that doesn't make them a tech company.


This article seems to ignore the speed at which things can be done nowadays, which makes all those 'familiar' things that bit more amazing to a 1950's traveller. It also seems to set the metric for progress as the visible physical changes in the world around us. This ignores the amazing things we can do today.

Need to get to the other side of the world? 24 hours max. Need to write to someone on the other side of the world? Instant email. Need to be face to face with someone on the other side of the world? Instant video.

Live TV from everywhere, instant access to the world's information right in your pocket, instant social networking when needed too.

Need something built for you? Commission anyone from anywhere in the world in moments. Maybe even 3D print your complex project overnight.

Got a problem? Ask for help and get responses from people around the world in seconds. Got a complaint? Publish your grievance and have it shared around millions of times, maybe even get together with your countrymen and bring down a tyrannical government using just the power of social communication.

What about simulation and the power computing gives us to 'get it right first time'? We can do incredible things with minimal physical testing thanks to computer simulation.

And medically - the author skips over almost the entire field of transplant surgery as just one single example. We've progressed amazingly far in medicine. Even in the last 15 years huge advances have been made in trauma care (thanks in part to Iraq/Afghan wars) that would be astounding to doctors of the 1990's let alone the 1950's.

We live in a wondrous world full of things that we take for granted. Maybe the author doesn't realise just how much more connected we are nowadays and just how much more empowered an individual is in terms of knowledge and creative potential. And that's not to speak of the incredible advances in energy, materials science, social progress, reduction of poverty, transport, communication, education and scores of other fields.

Just because there aren't such highly visible physical changes immediately apparent (though they're there), that doesn't mean 2015 would be any less of a stunning place.


You're right, but you're not rebutting his point, you're actually reinforcing it: all of those things were possible in 1950; we just made them faster. The whole point of the post is that very little is completely new, meaning (close to) not understandable to a 1950s man:

> They will no doubt be impressed with miniaturization as an evolutionary spectacle, but will tend to have a context for the functional capabilities of our gizmos

You say that we can now talk face to face with people on the other side of the world in real time -- he says if there had been as much innovation (and not just refinement) as there was between 1885 and 1950, we should be able to interact with a hologram of someone on the other side of the Solar System.

The real change a 1950s man would see is probably how much more linked humans are (mostly first-world humans, though) and how much more interaction there can be.


Welcome to science.

First rule: exponential growth cannot continue forever.

Second rule: duplicating what is already there is far easier than creating something new.

The second rule is what is important. All of our 'innovation' from 1750 to 1950 came about because we decided to interact with the world using the scientific method. Once we started correctly classifying observations and experiments, we quickly learned the limits of physics. Light speed is light speed and there is no getting around it. Instantaneous communication over long distances is not possible. The problem we have is that non-scientists attempted to make predictions of the future with no understanding of how reality works. This gave us the incorrect expectation that we could use unlimited energy forever. Just using more and more energy to get the job done is not innovative.

With all the easily observable gains taken early, the things we do now are highly innovative, but require massive amounts of data to do. The reduction in the amount of energy required to accomplish a task will be the legacy of our century.


The distinctions drawn by him (and you) seem entirely arbitrary to me. Why is an atomic bomb more than just a big bomb, and voice transmission more than just an extension of signal transmission (telegrams), but somehow real-time transmission of video is nothing new? How do the first two somehow transcend "refinement" but the latter doesn't?


Yeah, medicine is nuts. The whole field of medical imaging is magic. fMRI still blows my mind. But everything medicine is swept together as one advancement, just as everything computing related is.


To put this another way, most of medical advancement is because of computing advancement. Simply put, the amount of information processing that must occur before a medical advancement is discovered is beyond what humans with paper and pencils are capable of. The story of our future is the story of the computer.


Most of these have been possible for many decades if not longer, even if it wasn't done the same way. Before e-mail there was fax and telegraph before that. Video transmission is nothing new either, and it's honestly not really that useful when you just want to talk with somebody. I bet that by far the most common use is erotic chat...

Nothing revolutionized transport more than the railroad. Airplanes have changed very little since the 1960s. Cars have changed much more, but for all practical purposes they work exactly the same. In 1885, cars and airships were little more than a novelty. Electricity became available in the late 19th century, and many of the electric devices we use today were developed during the first half of the 20th century.

Medicine advanced very little. Maybe we are better at treating certain injuries, but for most things, medicine can only keep you dying longer than it used to. The decades before that saw major breakthroughs like antibiotics and vaccines that actually worked, and malaria was eliminated in much of the developed world.

We live in a world of things we don't even fully realize we have.


In 1885 people were flying in balloons. In 1950 radios were ordinarily furniture, portable radios were luggage, and radio operation was a military rating. A mere twenty years ago, I accessed the internet with bauds, phone numbers, parity and gopher. An SVGA-resolution picture [800x600 for anyone on my lawn] took about a cup of coffee to download, and the idea of an online encyclopedia was the stuff of science fiction.

And science fiction is largely a cultural invention of the article's second epoch. Around 1950 is when it went from a few authors to a literary genre, and ideas about the technological details of the future went from projecting steam into the future as rockets and submarines to projecting computers and radios. We don't have personal helicopters for two reasons: the laws of physics and the laws of information theory. I can order my potatoes and herring online. Strapping on a jetpack and flying to the Isle of Man for my fix isn't just impossible, it's absurd.

I've got Hari Seldon's prognostic pocket calculator prognosticating Wednesday's weather and wending through the ether to an encyclopedia galactic. I can text you a photograph of it from a chair in the sky.

The article does not list irony among the cultural inventions we got from ancient Egypt, and that is a civilization which is not attributed a great sense of humor in the popular canon. Yet certainly irony had been invented by Socrates' day. Fortunately it thrives today. https://youtu.be/ZFsOUbZ0Lr0


It seems to me that the progress we are making is in technologies of great generality: computers, AI, and genetics. Once these reach an inflection point, the world will get impossibly weirder than it ever has.

Genetically informed iterated embryo selection looks like it could lead to children with higher IQs than any human that has ever lived. Regardless of the ethics, someone somewhere is going to do this. A world where the average kindergartener is learning chromodynamics? Talk about a historical discontinuity.

If we get human level artificial intelligence, this again is a historical discontinuity.

If we get whole brain emulation, things will get very weird very quickly.

We're just now getting good VR. False realities that seem real and immersive. This seems like a much bigger jump than that from radio to television.

Maybe we are in a period of stagnation - I'm unsure if this time travel thought experiment is a good means of measuring progress. However, it seems to me the world is about to get much weirder this century.


Hitting a biological limitation wall is most likely a thing that will happen, and at that point AI and genetic engineering may well be required for progress to continue. But it's not clear whether we're there yet, and it's not clear whether it will be a "hard" stop or a sluggish ride while getting there.

We have a fairly limited understanding of how intelligence works at this point. For example, while the general consensus is that average human intelligence has evolved even in the past 1,000 years, it's still not clear how "smart" an AMH (anatomically modern human) from circa 100,000 BCE would be compared to us today, and many people believe that with the right nurturing they would be indistinguishable from a modern human born into a "high IQ" family.

It seems that, excluding extreme cases (e.g. mental deficiencies and unicorns like Newton), overall intelligence has more to do with your early development than your genetics. This has been getting more and more supporting evidence: adoption studies (including separated twins), and the use of surrogates and egg and sperm donors, show very little correlation between pure genetics and IQ or academic performance.

On the VR part, this isn't a really good example. We're getting good VR now because the components are dirt cheap and available, so it requires much less R&D effort to develop. All of the current VR efforts are software-based, and they can afford to be because cheap, readily available, high-performance, high-resolution screens, accelerators and the other components required for VR already exist. This is exactly the opposite of, say, the 90s, when the likes of SEGA had to develop side-projected micro CRTs and specialized LCDs to make even the basic premise of VR feasible. If there had been sufficient reason to pursue VR, we would have gotten it much, much sooner.

A lot of technical innovation, especially what comes to the consumer market, is more often than not a perfect combination of available technologies rather than a focused effort. I don't personally see VR as an example of innovation or any meaningful development effort; it's probably the clearest case of using current technology rather than innovating that I've seen in the past decade. Heck, the VR experience I had as a child at SegaWorld in London 12 years ago impressed me much more than the Oculus Rift DK2 I got a year ago.

And this might be the issue: there might be less pressure today to "innovate" than there was before. Conflict was always an integral part of our drive for innovation, and it seems that once the two leading nations of the time stopped trying to one-up each other, things slowed down.

We could've had a better particle accelerator than the LHC over 30 years ago if the Cold War had kept going. Heck, the only thing the defense department seems to care about these days in cutting-edge particle physics is neutrino detection, since it could give them a way to detect nuclear submarines.

I for one would not mind if everyone in the world got a lifetime dose of the old Soviet space religion that inspired everyone to strive for something better, but sadly I think we'll need a real impending catastrophe for the bigger nations and the entire planet to start caring about that again.

If you built a doomsday clock that said Earth will self-destruct in 50 years, we would have a Mars colony within 30, because this is how our species works. But as long as the only thing we care about is the size of the screen, it shouldn't surprise you that some innovation stops, because those resources are now being directed at quite pointless things.


>It seems that, excluding extreme cases (e.g. mental deficiencies and unicorns like Newton), overall intelligence has more to do with your early development than your genetics. This has been getting more and more supporting evidence: adoption studies (including separated twins), and the use of surrogates and egg and sperm donors, show very little correlation between pure genetics and IQ or academic performance.

This is, to put it charitably, not the consensus of the relevant science. Once you get adequate nutrition, which western countries have, IQ is influenced far more by genetics than environment. Adoption studies show consistently that adopted children's IQs resemble their genetic parents', not their adoptive parents'. Twins separated at birth tend to have very similar IQs and personalities. Programs like Head Start and brain-training regimens have proven to have little impact on IQ and future outcomes - which you wouldn't expect if environment were the prime factor. We are biological robots; we should expect our source code to have a significant influence on our cognition.

The "environment is all that matters" idea is basically Lysenkoism and its popularity is more do to ideology than science. Your statements are from a mirror world. I assure you, you're misinformed.


To put it charitably, that's not the case. The majority of the latest studies show that if you are born to a "low-IQ" parent (less than 90) then genetics plays a huge role (I personally have no quarrel with classifying an IQ of 90 or lower as a mental deficiency, but I could have been clearer on that). However, for children born to average and above-average parents, genetics seems to play a minor role compared to environmental factors, especially during the early-adulthood development phases (i.e. just before your "IQ" seems to stabilize).

Here is one of the largest (if not the largest) studies conducted in this field (over 11,000 individuals), which seems to point to that exact conclusion.

http://pss.sagepub.com/content/early/2013/07/01/095679761247...

Overall it seems that if you are born with a higher "genetic" IQ then the environment will have a much bigger influence on the final outcome (average to highly gifted), while if you are born with a (considerably) below-average genetic IQ potential then it won't.

edit:

Can't find a non-paywalled version of the research, but there is a high-level synopsis and analysis of it for journalists.

http://journalistsresource.org/studies/society/education/nat...


I'd be happy for a couple of references from you (Moshe_Silnorin) and dogma1138.


I don't want to play study tennis with dogma. However, here are some representative examples:

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3182557/

http://cdp.sagepub.com/content/13/4/148

IQ and its heritability are about as uncontroversial among psychiatrists as global warming is among climate scientists.

Here's a good blog post about the personal implications of this, and how this knowledge can be used productively and charitably: http://slatestarcodex.com/2015/01/31/the-parable-of-the-tale...


That's not completely honest, and it's not the main issue here: it's not about heritability and IQ potential but the actual fixed IQ of adults. People with higher genetic potential for intelligence are considerably more influenced by their environment, and for longer periods, than people with lower potential.

So unless we are talking about <100, and in many cases <90, IQ individuals, the environment plays a bigger role than pure genetics in the final outcome. I've linked a very recent and very extensive study on that same subject.


Dogma, the claim I disagree with is this: overall intelligence has more to do with your early development than your genetics.

You know what has a huge influence on your early development, your genetics. Do you really think I could be as good at math as Terry Tao if I had spent my childhood practicing? If we raise a chimp right, could it beat Terry Tao?

Yes, high-IQ people vary more between tests, but much of this is regression to the mean. And still, scores are relatively stable. The common superstition that environment and effort matter more than innate capacity is false; perseverance is hereditary to some extent, too. I have not read the paper, but I very much doubt this effect would mean environment matters more than genetics. If you cloned Terry Tao and let the clone be raised by two average-IQ parents, he would not grow up to be average.

The summary you provided doesn't seem to discount the value of biological interventions to any great extent, as the "extended period of synaptogenesis" would likely have a genetic cause, given our knowledge about the heritability of IQ.


I think it is hard to get progress when the number of Americans unsure about evolution increased from 7% to 21% between 1985 and 2005 [1].

I think the Idiocracy prediction [2] is what we are getting.

[1] https://www.ncbi.nlm.nih.gov/pubmed/16902112 [2] http://uproxx.com/movies/2014/10/the-many-signs-that-mike-ju...


As long as we don't start watering our fields with Gatorade and electing WrestleMania stars as presidents, we'll be alright...


Reality show star for president isn't looking all that impossible, so we're probably halfway there.


Not just any reality show: Trump has starred in WrestleMania specifically: https://www.youtube.com/watch?v=MMKFIHRpe7I



I don't think that the author understands just how computers have changed everything.

>The big differences are cell phones (which they will understand to be a sort of telephone, albeit with no cord and capable of sending telegram-like communications, but still figuring that it works via radio waves rather than magic), computers (which they will see as interactive televisions),

Sure, someone from the '50s would see games systems as interactive televisions and probably not be too shocked; games systems aren't a world-changing thing (vs. TV). The same is true if you use cellphones as, you know, mobile telephones.

Think about the information we all have in our pockets now.

Like how to fix things... if you don't repair your own stuff, you should try it one weekend, just for fun. I've used YouTube to figure out everything from how to assemble a crazy-complex Sony VAIO, to replacing a bad fan in a fridge, to replacing shocks on my motorcycle.

And the textual information that is available to you is also absolutely huge. We're all carting around a really bad interface to a library that is more vast than the Library of Congress... in our pockets.

What I think is interesting here is the huge availability of what would have been out-of-print books in the '50s... books whose copyright has expired; those are available at the snap of a finger. Even more modern stuff is also available, of course, it's just not as available as the out-of-copyright stuff.

Actually, I think this is the interesting thing going forward... who can access information, and at what cost. But I think there is sufficient focus on that.


The major changes to "consumer" behaviour were the Agricultural and Industrial revolutions. I'm unsure when the so-called "Information revolution" began - with the first writing, which was Sumerian accounting records (certainly data storage and processing)? Perhaps markets and the "consumer" were the most fundamental change.

We try to accommodate technological change - shape it to existing consumer behaviour - rather than change everything.

A great example is Moore's Law. The mind-boggling doubling of transistor density every two years is now routine. We're actually unsettled by the hiccups at 10nm.
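
To put numbers on how mind-boggling that compounding is (a minimal sketch, assuming the usual two-year doubling period):

  # Transistor density doubling every ~2 years compounds absurdly fast.
  for years in (10, 20, 50):
      print(years, 2 ** (years / 2))  # 10 yrs: ~32x, 20 yrs: ~1024x, 50 yrs: ~33.5 million x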

Thought experiment: can you imagine a new technology which changed costs so drastically that current societal structures (municipal; national; consumer/market; informational) could not accommodate them, and instead would require another revolution? [limited by imagination, of course]

At some point, we'll have succeeded at modelling human nature, and we're self-limited from further change. Aliens might do things differently (or might not - it could be that the greater part of humanity is just cooperative intelligence, common to all sapients).


The article reinforces a point I was making recently: biotechnology and medicine are essentially invisible to most people, and they ignore the massive gains made in past decades, and ignore the potential for the near future.

https://www.fightaging.org/archives/2015/10/the-arcane.php

We are in the midst of a transformative revolution in the capabilities of medical technology and other biotechnologies. But most people don't care, take their vague ideas of the present state of medicine as a stable situation, set in stone, and don't devote any attention to medicine until they have to.

Medical research and development is an arcane world: invisible, you don't know anything about it, and yet it is the greatest of influences upon your own personal future, determining whether you will live or die, suffer or be healthy.


I would think anybody from the fifties would be absolutely shocked if you gave them a tablet with an internet connection and showed them how to use it. They could access any newspapers, books, songs, or movies from any time, all the information they ever needed, all their friends' whereabouts and actions, and could contact them instantly with text, voice or video. Learn new things or just play fun games. Look at a map of the earth with street views from all around the globe. Even pictures from the surface of Mars. And on the more disturbing side they could look up any sick violence or sex stuff to their heart's desire...

Yes it's "kind of incremental" but my god, all this in a device the size of a clipboard pulling stuff out of thin air. Now that's magic, no other word for it.

And this is just a tablet. Plenty more amazing "incremental" improvements. Put this time traveller in a Tesla and push the pedal down. No sound and instant acceleration...


The author fails to note the difference in the ease of access to information. Some time ago, if I wanted to find out, e.g., the half-life of some medication in blood, I'd be lucky to find it in an encyclopaedia and would probably need to visit a library or consult a pharmacist. Now I can find that out in one minute, without leaving my home.


Specifically you'd probably want the current PDR (Physicians' Desk Reference). There were a number of these fat and expensive references for various fields that were still woefully incomplete and non-current by modern standards.


I don't really agree with the author's method of measuring progress. "Progress" is not measured by how surprising today's world is compared with the world of, say, 100 years ago.

Despite the frenzy around technology and startups today, I do sincerely believe that we are making a lot of "scientific progress". We have more people than ever before doing scientific research and publishing papers. We have more physicists, doctors, more engineers etc. and I would argue that having so many skilled people in itself is "progress". Also, we have come a long way with society: normalizing gay marriage, stripping down racist laws etc.

Anyways, point being: scientific and social progress does not always manifest itself as glitzy gizmos.


Technological progress mostly relies on discoveries made in physics. Things that would amaze an 1885 person in 1950 were mostly made possible by electricity and derivative technologies, plus nuclear physics, and a few other things. Even the engines for our flying machines wouldn't have been possible without electricity.

So the reason the leap between 1950 and today is not as noticeable in terms of the author's time machine experiment, may be because there have been no more major, truly groundbreaking discoveries made in physics, as in, e.g. discovering a new force in nature not known before etc.

But the author is of course right in saying that the past 65 years were marked with a lot of hard work rather than earth shattering discoveries.


> Technological progress mostly relies on discoveries made in physics.

I couldn't disagree more. Take the Haber process as a counterexample from chemistry, or vaccinations as a counterexample from medicine.


Medicine is not technology. Materials are, but in the end chemistry is just a subsidiary of physics.


By that logic, physics is merely an ersatz subsidiary of mathematics, therefore technological progress mostly relies on discoveries made in mathematics.

Obligatory xkcd: https://xkcd.com/435/


"Technology is the collection of techniques, skills, methods and processes used in the production of goods or services or in the accomplishment of objectives." (from Wikipedia)

More things are technology than you seem to think. For instance, technology isn't only things.


Really? Such as when Mary-Beth Ruskai had to talk to the quantum chemists to solve inequalities foundational to quantum information today, or when Longuet-Higgins discovered the importance of geometric phase in quantum mechanics? Or when Alden Mead argued the Planck length[0] was on the atomic scale?

[0] http://ctpweb.lns.mit.edu/physics_today/phystoday/Alden-Reps...


I think one of the biggest differences a visitor from 1950 would notice is food.

The Germans were called "Krauts" in the second world war because they ate Sauerkraut, a then prevalent way to preserve some food with vitamins for the winter.

And today you can go into the store in winter, and buy twenty kinds of fresh fruit from all over the world, and twenty more types of vegetables.

You don't have to stockpile for winter, and don't have to change your diet significantly because fresh stuff is unavailable.

We don't think of that as a huge technological advancement, but it has a huge impact on daily life. Likewise the prevalence of washing machines, and the availability of public and private transport in a way unimaginable to somebody from 1950.


Related: “Eroom’s Law”[1], coined in this Nature paper: http://www.nature.com/nrd/journal/v11/n3/full/nrd3681.html which states that the cost of discovering a new drug has been increasing exponentially for 60 years.

Obligatory log graph: http://blogs.sciencemag.org/pipeline/wp-content/uploads/site...

Some discussion at the “In The Pipeline” blog: http://blogs.sciencemag.org/pipeline/archives/2012/03/08/ero... http://blogs.sciencemag.org/pipeline/archives/2012/03/14/the... http://blogs.sciencemag.org/pipeline/archives/2012/03/12/the...

[1] You may now groan.


Medically, we now have drugs that break up blood clots and give heart attack patients a fighting chance if they get to the hospital fast enough. Same thing with some stroke victims. Today people who would have died from anaphylactic shock from peanut allergies or bee stings can carry around an EpiPen and a couple of Benadryl tablets and have a great chance of surviving an attack [1]. When I was a boy scout, I was taught the sequence of CPR was ABC: Airway, Breathing, Circulation. They very recently changed it to CAB, and there were quite a few other things that we've learned along the way that changed how it works. I understand that the CPR program is changing again at the end of this year.

Of course, these changes are small and not on the scale of nuclear fission. But there have been tons of significant medical changes in the last 40 years that increase quality of life and let people live who otherwise would die. And I'm not even getting into the changes I consider even more significant medically, or in AI. Things computers are doing today weren't even imagined a generation ago.

[1] EpiPens started going out to the public in the '80s. If you have a serious allergy that could lead to your airway closing up, and don't yet have an EpiPen, go see your doctor and get some. People still needlessly die from that stuff all the time, and it can be prevented.


This has been said about a billion times before, in many different periods, in many different ways. Breakthroughs often come only after long incubation periods, which to me should be counted as part of the breakthrough, just an unseen part. We are about to see incubation give way to a Cambrian explosion of innovations. I just hope we survive this evolutionary trial.


I agree. To give an example of what I think you're referring to, the field of synthetic biology is evolving rapidly (perhaps too rapidly in my opinion, the implications are fairly far reaching and I'm not sure we have the cultural maturity to handle them). Even if the most visible fruits of this field appear later they'd still be linked to the research happening now.


A big part of the perceived slowdown comes from the grant-oriented, near-sighted, and incredibly toxic environment that today's researchers have to deal with.

Making something faster, cheaper, or smaller is incremental progress with immediate payouts. It's easy to sell that as a research project. But some newfangled warp drive? Flying cars? Self sustaining space stations? None of those things have any sort of short term payout, and so they don't get dollars. Or at least, a lot fewer than they probably deserve.

And so we get incremental progress on existing technology instead. If we want more of the new, then we need to finance it. And that means making investments with zero return for 20 years. That means financing people who are coming at things we don't yet understand, with no certainty of financial return.


What's happening today is technological (and socio-technological) change at a great pace. However in order to call it progress most people would require it to be good for us, the people.

That's where the article's concluding question becomes relevant: "How reliant are you on the narrative of progress for your sanity and understanding of our world and its future?"

The answer is that to a thoughtful observer much of the aforementioned change is not of the good kind anymore. People are increasingly disenfranchised in the new techno-social environment; the recent upswell of talk about the robopocalypse etc. is just a projection of trends long underway.

So we end up with feel-good "progressivism" as a prosthetic replacement of actual progress.


Horace Dediu has looked at how quickly technologies are adopted. He doesn't see any increase in speed of technology adoption (he said this in a recent "Critical Path" podcast), which he was expecting and looking for. https://twitter.com/asymco/status/631108777333403648

Technology becomes more complex with time, so new technologies are necessarily more complex, and require an increasing amount of effort to develop. The Wright Bros could research and develop an airplane on their own. I see no chance that two brothers can invent and implement a fusion reactor or spaceflight to Mars on their own.


I think this is a symptom of all the 'technical debt' we have as a society/species in law, morals and other 'difficult' things.

That quote about the future (It's already here - just not evenly distributed) is a good description of the situation.

There's going to be a war (it's sort of already started); hopefully we can come out the other end ready to innovate again. Right now it is too dangerous to share some technologies (for example how to harness nuclear energy), so it puts a damper on innovation.

Basically we're starting to reach our moral limits in more fields, meaning we need better morals before we can continue.


https://en.wikipedia.org/wiki/Great_Filter

You're assuming morals can be overcome. Information complexity may simply be an obstacle that cannot be overcome. At higher energy levels technical and societal change may simply be too dangerous, and to survive long term societies must lock themselves into a steady state where innovation is actively quashed.


Well that sucks.

But when faced with problems the only option is to find a solution.


What war are you referring to? And where has it sort of started?


We have indeed had fewer innovations in consumer goods that take up visual space in a living room.

However we have had massive innovation in medicine--many diseases that were once a death sentence are now treatable.

Cars, though still totally recognizable as cars, are much safer than they were. Airplane flights are both cheaper and safer.

Infant mortality has dropped. Certain diseases have been eradicated.

Due in part to the ability to communicate and coordinate economic activity on a global scale, the number of people in extreme poverty has declined since 1950--Not the percentage of people, the absolute number of people.


You're missing the author's point; heck, you're even restating it yourself:

> Cars, though still totally recognizable as cars, are much safer than they were. Airplane flights are both cheaper and safer.

> Certain diseases have been eradicated.

That's what he's saying. We've mostly (massively) refined existing technology instead of revolutionized it: vaccines, airplanes and cars are all pre-1950 technology.

Not sure if I agree, but it feels like you haven't even read the article.


I think the reason he or others may perceive a slowdown in innovation is that innovation is more incremental, but is released far more quickly, making up for the small increments.

A good example is Windows or Mac OS, which used to take 2-7 years between releases; now it's happening on a yearly basis, and we're finding that too slow. So maybe in some unintuitive sense, we perceive a slowdown due to innovation speeding up.*

*this of course doesn't apply to everything, and some ideas do take years to develop into a working prototype.


You wouldn't have seen many planes in 1950, and TV was unheard of on most of the planet. Change comes at a different pace to various places of the world.

But yes, we're mostly catching up, not leaping forward.


> But yes, we're mostly catching up, not leaping forward.

What concerns me is that we (or at least some folks) used to leap forward. I'm struck by a comparison to the citizens of the Soviet sphere of influence: they were almost all materially much better off than they had been years before, but they weren't as much better off as citizens of market states. The Soviet system did, indeed, enable them to catch up and even surpass the Britain of 1914 or the America of 1917—but that didn't seem as good a deal in 1987.

I wonder how much potential innovation and improvement has died in the cradle due to regulation, financial disincentives, cultural aversion &c. Sure: we're better off now than we were a decade ago, but how much better off could we be?


We could be a lot better. Assuming "we" is "people from reasonably developed country".

We could work 6 hours per day, 4 days a week.

We could avoid having problems with drugs, ghettos, extreme poverty - that sort of self-breeding thing.

We could live in fairer world, with less government supervision. Right now there are whole classes of businesses for which ripping customers off is okay (cell companies for example). This did not have to be the norm.

The price? Some undeveloped countries will probably be left to their own devices. Which is an unstable situation: once you solve infant mortality for them, you get population boom and then political instability. Hunger, I'm afraid - maybe on grand scale. Some things will be more expensive. Right now services are expensive and goods are cheap. It could be reversed a bit.


I would argue that the advances in industrialization, miniaturization, energy, and computing are no less important than some of the more visibly valuable inventions of prior decades/generations, but they are largely supporting technology that provides a scaffolding for us to invent upon, rather than inventions that [externally] impress on their own merits. And it's not like our race to the moon was technologically more impressive, either; we just had a direct application of technology we decided to pursue. My $.02 is that for the past 20 years or so, since high-tech manufacturing exploded the domestic economies of the developing world, we've been in a dichotomous environment where the advanced nations continue to pursue original research [but in a largely aimless manner due to all sorts of political incompetence] while the rest of the world accelerates its internal development and jockeys for position to gain seats at the table as humanity collectively decides what's next.

"Next" may be manned flight to mars, or it may be elimination of fossil fuels, or eradication of disease, or something completely off the wall, but the point is that -- excepting poverty, war and education -- the world is in a pretty good place right now and the only thing separating most countries from another development-wise is money, not invention.


> They will no doubt be impressed with miniaturization as an evolutionary spectacle

They might also take note of how difficult these devices are to repair.


In terms of material progress, you need to understand fundamentals before you can apply them.

There are only so many fundamental things about the universe. We are still learning, it's true. But we've pushed the darkness back on a lot of fronts since the late 1800s. You kind of expect that once science takes hold there will be fewer mysteries.

As it happens, the late 1800s and early 1900s were boom times for fundamental science. Radioactivity, Relativity, Quantum, and so on.

Once you have the fundamentals, some economic organisation needs to occur before things like mobile phones appear. People need to figure out exactly how the theories will be useful, what kinds of work there will be in the area, and so forth. Much of what we've seen lately is this: finding niches for the things we already know about the natural world.

I can understand the author's disappointment: not many new things are undreamed of. But remember, your imagination depends on how you think the world works, and we now know more about that.


In a related vein, one transformative idea (http://www.theguardian.com/science/small-world/2013/oct/21/b...) has been mostly ignored over the last 15 years or so -- as if "Space travel is utter bilge" had won out instead of Sputnik. At some point the continued progress in fields like biomolecular engineering is going to make the potential too obvious to miss, and we'll be reminded that technology is not a synonym for computers.

(This is another 'jam tomorrow' response, like a lot of others here. I pretty much agree with the OP about the progress seen before and after 1950.)


Whether you view a newfangled technology as "magic" depends on whether you understand how it works. Scientists in the 1800s knew much about electricity, combustion, etc. well before the article's "19th century rube" did:

> Our 19th Century rube would fail to recognize cars/trucks, airplanes, helicopters, and rockets; radio, and television (the telephone was 1875, so just missed this one); toasters, blenders, and electric ranges. Also unknown to the world of 1885 are inventions like radar, nuclear fission, and atomic bombs. The list could go on. Daily life would have undergone so many changes that the old timer would be pretty bewildered, I imagine. It would appear as if the world had blossomed with magic: voices from afar; miniature people dancing in a little picture box; zooming along wide, hard, flat roads at unimaginable speeds—much faster than when uncle Billy’s horse got into the cayenne pepper. The list of “magic” devices would seem to be innumerable.

Similarly, the average 21st century "rube" (though likely better educated than his 19th century counterpart) has little understanding of computing, networking, genome sequencing, etc.

Growing up in the 90s reading Bruce Coville's "My Teacher Is an Alien" series, I remember being awed by the description of a "universal translator" device gifted by aliens to the main character. Yet within 20 years, everyone I know has a much more capable version of that device in their pocket.

The author obviously has a right to his opinions, but it's clear that this article is just that: an opinion piece. Since he's a physicist, his perspective may be skewed by where science sits in the pipeline of technological progress:

Science > Engineering > Market Adoption

Today's scientists are discovering things that tomorrow's engineers will turn into technologies that day after tomorrow's entrepreneurs will consumerize. It's worked this way throughout history, and yes, it's faster now than ever before.


This guy is a perpetual Chicken Little. He's been featured on /r/economics multiple times. His argument is two-pronged: he plays up the inventions of yesteryear, and plays down the inventions of today.

Even the invention of the television was gradual; it was developed via "refinements" of the facsimile. Ignorance of history makes for a bad argument. Ditto antibiotics: first arsenic, then atoxyl, then arsphenamine, then sulfanilamide, and finally penicillin. The car was nothing but a road-adapted train. The airplane was preceded by ground-towed heavier-than-air craft. And NASA's budget was massive relative to the Wrights'. The telephone came from the telegraph, and the steam engine may have been invented in Alexandria when Jesus was around (assuming He existed).


At least in one domain, life expectancy, progress hasn't stopped. It just may no longer have the extreme momentum of earlier times: https://thaddeusktsim.wordpress.com/2012/02/24/were-there-no...

Same for GDP. I'd be interested in median income over the last 100 years, though. http://ourworldindata.org/data/growth-and-distribution-of-pr...


This article is a hypothesis looking for evidence, and it blatantly ignores evidence that doesn't fit. I'm not even sure what the hypothesis is, apart from the statement that we're not inventing electricity a second time.


It's worth noting that Peter Thiel has also advanced a version of this argument (progress stagnant since 1970): http://www.technologyreview.com/qa/530901/technology-stalled...

along with Tyler Cowen (the low-hanging fruit has been picked, specifically in the US): https://en.wikipedia.org/wiki/The_Great_Stagnation


Yes, the pace of fundamental inventions has slowed down, but that has nothing to do with progress.

I think the author conflates two ideas: progress and technological invention. Progress is not defined by the appearance of new "fundamental" inventions; a lot of it comes from how we use technology. Sure, the Internet was just a bigger network, but by being a world-wide network instead of a LAN, it enabled many things previously not possible. The author's take is very narrow: he's only looking for new "fundamental" inventions.

And new inventions have nothing to do with progress. You know what I call progress?

"Economic growth over the last 200 years completely transformed our world, and poverty fell continuously over the last two centuries." -- http://ourworldindata.org/data/growth-and-distribution-of-pr...

That's the only real, measurable progress. Everything else he lists as a "fundamental" invention is simply a means to an end, not progress itself. Finding a cure for smallpox isn't progress; progress is curing smallpox. Nuclear fusion isn't progress; progress is developing it into a real energy source and getting people to use it. Even the author acknowledges this, but he's far too confused about inventions vs. progress.

Finally, he adds a section on social progress, and that's hard to argue at all. Yes, we're more open to differences ... in the West. But the author has apparently forgotten about nation-states like Russia, Saudi Arabia, Qatar, the UAE, Afghanistan, China ... I really don't need to keep going. Global social progress is indeed a myth, though I do think certain parts of the world are making some. He also ends up equating a decline in 'social progress' with a lack of progress, which is ridiculous: social progress is only local, and just because something reverts to the way it used to be doesn't mean progress wasn't made for a time (although that is a rather popular, yet extremely wrong, idea in the US).


Open question: do we think we basically understand all the laws of nature at this point?

If that were true, we should expect few new fundamental breakthroughs... ever. We will simply explore the vast combinatorial space of what is possible within these laws, essentially remixing the 20th century forever.

I am personally skeptical of this. I think it might be an illusion created by dogmatizing what we have now and being a bit too satisfied with it. The same thing happened with the wisdom of the Greeks; it took us over a thousand years to finally realize that heavier objects don't fall faster.


If one looks at GDP per capita or life expectancy, progress did not stop or slow in the '50s.

I think the main difference is that progress now comes in much smaller increments. E.g., smartphones have transformed our lives, and yet the smartphone era really began with the iPhone, which was itself not revolutionary from a purely technological standpoint. It was, however, sufficiently better in its hardware and software to appeal to a much larger market.


> Cancer, Multiple Sclerosis, and a raft of other pernicious diseases resist cures despite large continuing investments.

I'd much rather get cancer today than in 1970: https://c0nc0rdance.files.wordpress.com/2012/03/crukmig_1000...


I only agree about the 'energy predicament' not being taken seriously. I don't think progress can be measured the way he describes. The steam engine and frequency signaling were huge; they transformed our society greatly and still underlie many of the things we do now. But he sounds like, "there are no hoverboards, so progress must be declining".


Part of the issue here is classifying non-linear ideas in linear terms (i.e., bubble or no bubble). What we regard as the tech industry is an integration of countless individual markets, each moving at its own relative pace. A collapse in something like the app-game market wouldn't necessarily dictate a collapse in AI.


I've always wondered whether there is a limit to human knowledge.

Just as a chimpanzee can't even 'comprehend' how to use a cell phone...

If we are indeed animals and monkey descendants, would that mean there's a limit to understanding the universe that humans will never be able to surpass?

And are we getting closer to that limit?


The human body is chock-full of limits. So much so that right now you are using electrons modulated at different voltages to communicate with another person, instead of analog mouth vibrations. The story of humanity is one of using tools to surpass those limits. As our technology advances, we can understand the nature of human limitations and find novel ways of overcoming them, such as gene therapy or direct brain-to-computer interfaces. There is also the possibility that we will create our replacement, capable of surviving the harsh realities beyond our comfortable planet.


Of course there's a limit. We don't know whether we're close to a limit, but there is a limit.


Humans as they currently exist? Certainly. But never? Certainly not. We can evolve further.


This would definitely look like sci-fi to someone jumping from 1950: https://www.ted.com/talks/hugh_herr_the_new_bionics_that_let...


Since around the same time, income inequality has grown massively, and wage gains have become mostly confined to the top earners.

I wonder if these are related.

The era of unrelenting progress and ever-rising living standards was a real thing for a few decades, but I think it's coming to an end.


I'm not quite convinced by the article, but I was thinking about something economically related as I read it. The economic system is now capitalist, with the search for profits as its main objective, and it seems to have become really hard to get a research grant without a clear money-making application. As a result, you have a lot of projects that build on existing science or technology, but relatively few that can afford the risk of exploring the unknown territory of fundamental research.


Income inequality in the United States has grown massively.


I'm talking about the Western world, yes. But is this not also true in other parts of the world? Has anywhere seen income inequality decline?


Yes, inter-country inequality has been coming down even as in-country inequality has gone up. The rich have been getting richer and the poorest have been getting richer, while the middle class has stood still. (A toy numerical sketch below shows how both trends can hold at once.)
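
To make the arithmetic concrete, here's a minimal sketch in Python with entirely made-up incomes (not actual data), showing within-country inequality rising while the gap between a rich and a poor country narrows:

    # Hypothetical numbers only - the point is the direction, not the data.
    def gini(incomes):
        # Gini coefficient of an income list (0 = perfect equality).
        xs = sorted(incomes)
        n, total = len(xs), sum(xs)
        weighted = sum((i + 1) * x for i, x in enumerate(xs))
        return 2 * weighted / (n * total) - (n + 1) / n

    def mean(xs):
        return sum(xs) / len(xs)

    rich_then, rich_now = [10, 20, 30], [15, 20, 60]  # poor/middle/rich incomes
    poor_then, poor_now = [2, 5, 8], [6, 15, 24]
    print(gini(rich_then), gini(rich_now))    # within-country: 0.22 -> 0.32 (up)
    print(mean(rich_then) / mean(poor_then),  # between-country gap: 4.0x ...
          mean(rich_now) / mean(poor_now))    # ... -> ~2.1x (down)

In the rich country, the bottom and top rise while the middle stands still, so its Gini goes up; meanwhile the poor country's mean income catches up, so the gap between countries shrinks.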


Taken at face value, there is nothing surprising about the article's premise: as we continue to increase our knowledge, more problems move into the "solved" category, and what remains is the harder stuff.


Could it be the case that everything that has been 'discovered' in history has at some point been 'the harder stuff'?


Well yes, but I am thinking on a more meta scale... or do we believe there is an endless supply of hard problems? At some point, assuming we continue, won't we have figured it all out?


At the end of 2015, major breakthroughs are combined science and tech efforts involving so many players and so many small, incremental steps that we may not even be aware they are happening right under our eyes.


We live in a dream world where we believe technology can eventually save us from our sins. We dream that technology will allow us to live unsustainably, forever, in a world with limited resources.

Unfortunately, we are bounded by the limits of physics, energy, and nature. Technology is derived from the same universe and, like all other things in nature, it has a limit. We can't keep sucking oil out of a well forever.

Whether we have reached peak technology is debatable, and it would be wise to approach such an idea with caution. But to utterly deny that a technological peak even exists is a fool's errand.


I don't have much of a reason to believe that breakthroughs need to be regular, since by nature they are unpredictable.

In a monstrously reductive sense, I feel like this article is analyzing a series of coin flips and commenting on a streak of tails (the quick simulation below shows how easily such streaks arise by chance).
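
A minimal sketch of that intuition in Python (parameters made up purely for illustration): simulate fair coin flips and see how often a long streak of tails appears by chance alone.

    # Hypothetical illustration: long streaks are common in genuinely random sequences.
    import random

    def longest_tails_run(flips):
        # Length of the longest consecutive run of tails (0s).
        best = run = 0
        for flip in flips:
            run = run + 1 if flip == 0 else 0
            best = max(best, run)
        return best

    random.seed(2015)  # reproducible
    trials, n_flips = 10000, 100
    streaks = [longest_tails_run([random.randint(0, 1) for _ in range(n_flips)])
               for _ in range(trials)]
    print(sum(s >= 7 for s in streaks) / trials)  # roughly a third of sequences

Roughly a third of 100-flip sequences contain seven or more tails in a row, so a "tails-heavy" stretch isn't, by itself, evidence that the coin has changed.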

It doesn't help that the scope of the topic is far broader than the given text could cover, and entire fields were ignored out of existence.

In all, I was entertained, so I'm happy.


Looks like calculated trolling: either it's a social experiment, or the author is fishing for something in the responses.

Or he has no idea what he's talking about.



