Wednesday, February 4, 2009

Singularity U.

[Movie poster: Colossus: The Forbin Project]

From yesterday's Financial Times, "Google and Nasa back new school for futurists". Excerpt:

Google and Nasa are throwing their weight behind a new school for futurists in Silicon Valley to prepare scientists for an era when machines become cleverer than people.

The new institution, known as “Singularity University”, is to be headed by Ray Kurzweil, whose predictions about the exponential pace of technological change have made him a controversial figure in technology circles.

Google and Nasa’s backing demonstrates the growing mainstream acceptance of Mr Kurzweil’s views, which include a claim that before the middle of this century artificial intelligence will outstrip human beings, ushering in a new era of civilisation.

To be housed at Nasa’s Ames Research Center, a stone’s-throw from the Googleplex, the Singularity University will offer courses on biotechnology, nano-technology and artificial intelligence.

The so-called “singularity” is a theorised period of rapid technological progress in the near future. Mr Kurzweil, an American inventor, popularised the term in his 2005 book “The Singularity is Near”.

Proponents say that during the singularity, machines will be able to improve themselves using artificial intelligence and that smarter-than-human computers will solve problems including energy scarcity, climate change and hunger.


So much for the good news. The article continues:

Yet many critics call the singularity dangerous. Some worry that a malicious artificial intelligence might annihilate the human race.


With numerous sci-fi movies and TV shows having dramatized this scenario, from the 1970 film Colossus: The Forbin Project (the subject of the movie poster above) to the current TV shows Terminator: The Sarah Connor Chronicles [1][2] and Battlestar Galactica [1], we won't be able to say that we weren't warned.

The image of the movie poster above is from Rodney-Martin.com


[1] Critically acclaimed.
[2] Consistently entertaining.

9 comments:

Anonymous said...

Paul and I went to see The Colossus on our first date!

Anonymous said...

A faaaaaar bigger technological threat than malicious seed AI is biological weaponry. It is potentially much more cataclysmic than anything AI could ever possibly do (didn't the Matrix show us that all we would need are EMP weapons?!), and not only that, it is feasible for a would-be terrorist NOW using equipment you can order online. PCR machines, DNA sequencers, etc. are all available for ordering at home, and gene sequences for all kinds of baddies can also be found published online. The main threat in strong AI is that it will make tons of jobs obsolete. Not to minimize that.

I like Kurzweil though, even if I think he's too optimistic. I don't think progress should be halted because of the potential for abuse. There are those on both the left and the right who would halt technological progress, in the name of environmentalism, religion, or what have you. In 50 years I can see that being the main political argument of the day...how far we should go in "playing God".

Hey, somebody has to.

DaveinHackensack said...

JK,

Can the two threats be combined, to make them even scarier? E.g., a cybernetic swarm of nano-organisms?

Anonymous said...

You're talking about "Grey Goo". Didn't Crichton write about that also?

So yes, but that's even further off than seed AI. And by that time, I doubt we'd even be recognizable as humans. And any threat based on electronics as we conceive of them now could be taken care of with a relatively primitive EMP weapon. Of course, if transhumanism goes the way Kurzweil and others want it to, we may not be able to use that weapon ourselves either (as we'd be cyborgs).

Manufactured pathogens as bio-weapons are in their own league, threat-wise, because it is possible to create a killer plague now, using relatively inexpensive equipment (or access to a college lab) and gene sequences that are posted online.

AI might not even be possible until quantum computing is realized (sorry, D. Hofstadter). See here, here, and here.
Laser-based quantum computing requires a lot of cooling, so carbon nanotubes are probably the way to go. Interestingly, that's analogous to the hypothetical role of the cytoskeleton in the Penrose-Hameroff Orch-OR consciousness model.

Anonymous said...

Just want to mention that the university seems purty nifty. I'd like to attend some day. Good find.

DaveinHackensack said...

"You're talking about "Grey Goo". Didn't Crichton write about that also?"

He wrote a novel about nano-bots that could sort of infect/infiltrate humans and run them as a combined cyber organism, but, if memory serves, the nano-bots weren't themselves biological at all.

I'm going to have to look up half of the other terms you mentioned in your comments here ;-).

DaveinHackensack said...

In an editorial today, the FT expresses skepticism about the imminence of the Singularity: "Singular fantasies".

Anonymous said...

That editorial was amusing. The FT gave no indication that they had any basis for their opinion; it was all unsupported assertion.

Some of the most egregious portions:

"But scientists have made little progress in understanding the fundamental nature of human intelligence and consciousness, let alone how to pass it on to computers."

Um...not true? Define "little progress"? On what basis does this opinion rest?

"And even if researchers do endow machines with real intelligence, whatever this may be, why should it suddenly grow exponentially in the way Mr Kurzweil imagines?"

How about they read Mr. Kurzweil to find out? God, this is like reading a high schooler critique a world-renowned particle physicist because his theorems are too "out there, dude", while ignoring any analysis of the published work.

OK. For one, Moore's law.
Secondly, once a computer is more intelligent than a human, it would be able to improve ITSELF better than a human could. This creates a feedback loop of self-improvement, which would be exponential.
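
That feedback-loop argument can be made concrete with a toy model. The sketch below is purely illustrative (the improvement rates and generation counts are arbitrary assumptions, not figures from Kurzweil or the FT): it compares a system that improves itself by a fixed fraction of its current capability each generation with one that outside engineers improve by a fixed increment.

```python
# Toy model (illustrative only): why a self-improvement feedback loop compounds.
# Assumption: a self-improving system gains a fixed *fraction* of its current
# capability each generation, while human-driven improvement adds a fixed
# *increment* per generation.

def recursive_self_improvement(initial=1.0, rate=0.1, generations=50):
    """Capability when the system improves itself: gain scales with current level."""
    capability = initial
    history = []
    for _ in range(generations):
        capability += rate * capability  # proportional gain -> exponential growth
        history.append(capability)
    return history

def human_driven_improvement(initial=1.0, step=0.1, generations=50):
    """Capability when humans add a fixed increment each generation: linear growth."""
    capability = initial
    history = []
    for _ in range(generations):
        capability += step  # fixed gain, independent of current level
        history.append(capability)
    return history

if __name__ == "__main__":
    self_improving = recursive_self_improvement()
    human_driven = human_driven_improvement()
    print(f"After 50 generations: self-improving ~{self_improving[-1]:.1f}x, "
          f"human-driven ~{human_driven[-1]:.1f}x the initial capability")
    # Prints roughly: self-improving ~117.4x vs. human-driven ~6.0x
```

With these made-up numbers the self-improving system ends up about twenty times further ahead; the point is only the shape of the curves (exponential vs. linear), not any particular timeline.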

Now, personally, I think Mr. Kurzweil is on the optimistic side. But sheesus christ, if I were going to criticize a scientist's projections, you'd think I'd put some meat on it. That editorial is so substanceless, it's laughable! I had to keep refreshing the screen to make sure I wasn't missing a second page or something. But no, that was it. Stick to finances, FT.

DaveinHackensack said...

J.K.

If they're that off-base in that editorial, I would think they'll get some letters to the editor ripping them on it. If they publish one, I'll post it here, to go along with your comment.