Friday, September 24, 2010

Singularity Quote from Eliezer Yudkowsky

"


So let’s say you have an Artificial Intelligence that thinks enormously faster than a human. How does that affect our world? Well, hypothetically, the AI solves the protein folding problem. And then emails a DNA string to an online service that sequences the DNA, synthesizes the protein, and FedExes the protein back. The proteins self-assemble into a biological machine that builds a machine that builds a machine, and then a few days later the AI has full-blown molecular nanotechnology.


So what might an Artificial Intelligence do with nanotechnology? Feed the hungry? Heal the sick? Help us become smarter? Instantly wipe out the human species? Probably it depends on the specific makeup of the AI. See, human beings all have the same cognitive architecture. We all have a prefrontal cortex and limbic system and so on. If you imagine a space of all possible minds, then all human beings are packed into one small dot in mind design space. And then Artificial Intelligence is literally everything else. “AI” just means “a mind that does not work like we do”. So you can’t ask “What will an AI do?” as if all AIs formed a natural kind. There is more than one possible AI.


The impact of the intelligence explosion on our world depends on exactly what kind of minds go through the tipping point.


I would seriously argue that we are heading for the critical point of all human history. Modifying or improving the human brain, or building strong AI, is huge enough on its own. When you consider the intelligence explosion effect, the next few decades could determine the future of intelligent life.


So this is probably the single most important issue in the world. Right now, almost no one is paying serious attention. And the marginal impact of additional efforts could be huge. My nonprofit, the Singularity Institute, is trying to get things started in this area. My own work deals with the stability of goals in self-modifying AI, so we can build an AI and have some idea of what will happen as a result. There’s more to this issue, but I’m out of time. If you’re interested in any of this, please talk to me; this problem needs your attention. Thank you.


– Eliezer Yudkowsky, 5 Minute Singularity Intro

"
