The Singularity When We Merge With AI Won’t Happen – Walter Bradley Center for Natural and Artificial Intelligence

Posted: March 10, 2024 at 3:16 am

Erik J. Larson, who writes about AI here at Mind Matters News, spoke with EP podcast host Jesse Wright earlier this week about the famed/claimed Singularity, among other things. That's when human and machine supposedly merge into a Super Humachine (?).

Inventor and futurist Ray Kurzweil has been prophesying that for years. But philosopher and computer scientist Larson, author of The Myth of Artificial Intelligence (Harvard 2021), says not so fast.

The podcast below is nearly an hour long, but it is handily divided into segments, a virtual Table of Contents. We've set it at The Fallacy of the Singularity, with selections from the transcript below. But you can click and enjoy the other parts at your convenience.

00:00 Intro

01:10 Misconceptions about AI Progress

11:48 Bias and Misinformation in AI Models

21:52 The Plateau of Progress & End of Moore’s Law

31:30 The Fallacy of the Singularity

47:27 Preparing for the Future Job Market

Note: Larson blogs at Colligo, if you wish to follow his work.

And now…

Decades ago, Larson says, programmers were focused on getting computers to win at complex board games like chess. One outcome was that their model of the human mind was the computer. And that, he says, became a narrative in our culture.

Larson: [33:19] You know, people are kind of just bad versions of computers. If you look at all the literature coming out of psychology and cognitive science and these kinds of fields, they're always pointing out how we're full of bias, jumping to the wrong conclusions. We can't be trusted. Our brains are very, very Yesterday's Tech, so to speak.

Choking off innovation?

Larson sees this easy equation of the mind and the computer as choking off innovation, at which humans excel. It encourages people to believe that computers will solve our problems when there are major gaps in their ability to do so. One outcome is that, contrary to cliché, this is one of the least innovative periods in a while.

Larson: [34:25] The last decade is one of the least innovative times that we've had in a long time, and it's sort of dangerous that everybody thinks the opposite. If people said, wait a minute, we're just doing tweaks to neural networks; we're just doing extensions to existing technology… Yes, we're making progress, but we're doing it at the expense of massive amounts of funding, massive amounts of energy consumption, right?

Instead he sees conformity everywhere, accompanied by a tendency to assume that incremental improvements amount to progress in fundamental understanding.

So how does our self-contented mediocrity produce an imminent, unhinged Singularity?

Well, a pinch of magic helps!

Larson: [37:49] What's underlying that is this idea that once you get smart enough, you also become alive. And that's just not true. A calculator is extremely good at arithmetic. No one on the face of the planet can beat a calculator, but that doesn't mean that your calculator has feelings about how it's treated. In a sense, there's just a huge glaring error, a philosophical error, that's being made by the Superintelligence folks, the existential risk folks. That's wasted energy in my view. That's not what's going to happen.

If a more powerful computer is not like a human mind, what's really going to happen?

Larson: [38:40] Very bad actors are going to use very powerful machines to screw everything up… Somebody gets control of these systems and directs them towards ruining Wall Street, ruining the markets, bringing down the power grid. That's a big threat. The machines themselves… I would bet the farm that they're not going to make the leap from being faster and calculating more complicated problems to being alive in any sort of sense, or having any kind of motivations or something that could misalign like that. That's the Sci-Fi Vibe that's getting pushed into a scientific discussion.

The Singularity depends on a machine model of the mind

Larson: [46:17] If we're just a complicated machine, then it stands to reason that at some point we'll have a more complicated machine. It's just a continuum, and we're on that. But if you actually remove that premise and say, look, we're not machines, we're not computers, then you have an ability to talk about human culture in a way that can actually be healthy. We think differently, we reason differently, we have superior aspects to our behavior and performance, and we actually do care and have motivations about how things turn out, unlike the tools we use.

So it looks as though the transhuman could go extinct without ever existing.

You may also wish to read: Tech pioneer Ray Kurzweil: We will merge with computers by 2045. For computers, “Even the very best human is just another notch to pass,” he told the COSM Technology Summit. Kurzweil explained, “To do that, we need to go inside your brain. When we get to the 2030s, we will be able to do that. So a lot of our thinking will be inside the cloud. In another ten years, our non-biological thinking will be much better than our biological thinking.” In 2017, he predicted 2045 for a total merger between man and machine.
