Singularity Institute

From Wikipedia, the free encyclopedia

The Singularity Institute for Artificial Intelligence (SIAI) is a non-profit organization founded in 2000 to develop safe artificial intelligence software and to raise awareness of both the dangers and the potential benefits it believes AI presents. The organization advocates ideas first put forward by I. J. Good and Vernor Vinge concerning an "intelligence explosion", or Singularity, which they predict will follow the creation of sufficiently advanced AI. In the Institute's view, the potential benefits and risks of such an event make it necessary to solve problems involving AI goal systems before powerful AIs are created, to ensure that they are not dangerous.[1][2] The Singularity Institute espouses the Friendly AI model developed by its co-founder Eliezer Yudkowsky as a potential solution to such problems.[3]

Michael Vassar serves as the organization's president, and inventor and futures studies author Ray Kurzweil serves as one of its directors.[4] SIAI maintains an advisory board whose members include Oxford philosopher Nick Bostrom, biomedical gerontologist Aubrey de Grey, PayPal co-founder Peter Thiel, and Foresight Nanotech Institute co-founder Christine Peterson. SIAI is tax-exempt under Section 501(c)(3) of the United States Internal Revenue Code, and has a Canadian branch, SIAI-CA, formed in 2004 and recognized as a charitable organization by the Canada Revenue Agency.

History

SIAI was founded in 2000[5] by AI researcher Eliezer Yudkowsky[6] and Internet entrepreneurs Brian and Sabine Atkins. At first, SIAI operated primarily over the Internet, receiving financial contributions from transhumanists and futurists. On July 23, 2001, SIAI launched the open-source Flare Programming Language Project,[7] described as an "annotative programming language" with features inspired by Python, Java, C++, Eiffel, Common Lisp, Scheme, Perl, Haskell, and others.[8] The specification was designed with the complex challenges of seed AI in mind. The project is no longer active, however, as the organization shifted its focus to the problem of Friendly AI.

In 2002, SIAI published on its website the paper Levels of Organization in General Intelligence,[9] a preprint of a book chapter later included in a compilation of general AI theories, entitled "Artificial General Intelligence" (Ben Goertzel and Cassio Pennachin, eds.). Later that year, SIAI released its two main introductory pieces, "What is the Singularity"[10] and "Why Work Toward the Singularity".[11]

In 2003, the Singularity Institute appeared at the Foresight Senior Associates Gathering, where co-founder Eliezer Yudkowsky presented a talk titled "Foundations of Order". The Institute also appeared at the Transvision 2003 conference[12] at Yale University, with a talk by SIAI volunteer Michael Anissimov.

In 2004, SIAI released AsimovLaws.com,[13] a website examining AI morality in the context of the film "I, Robot", starring Will Smith, which opened in theaters two days later. From July to October, SIAI ran a Fellowship Challenge Grant that raised $35,000 over three months. Early the next year, the Singularity Institute relocated from Atlanta, Georgia, to Silicon Valley.

In February 2006, the Singularity Institute completed a $200,000 Singularity Challenge fundraising drive,[14] in which every donation up to $100,000 was matched by Clarium Capital president, PayPal co-founder, and SIAI advisor Peter Thiel.[15] The stated uses of the funds included hiring additional full-time staff, funding an additional full-time research fellow position, and organizing the Singularity Summit at Stanford.

In 2006, the Singularity Institute, along with the Symbolic Systems Program at Stanford, the Center for the Study of Language and Information, KurzweilAI.net, and Peter Thiel, co-sponsored the Singularity Summit at Stanford.[2] The summit took place on May 13, 2006, at Stanford University, with Thiel moderating and 1,300 in attendance. The keynote speaker was Ray Kurzweil,[16] followed by eleven others: Nick Bostrom, Cory Doctorow, K. Eric Drexler, Douglas Hofstadter, Steve Jurvetson, Bill McKibben, Max More, Christine Peterson, John Smart, Sebastian Thrun, and Eliezer Yudkowsky.

The 2007 Singularity Summit took place on September 8–9, 2007, at the Palace of Fine Arts Theatre in San Francisco. A third Singularity Summit took place on October 25, 2008, at the Montgomery Theater in San Jose. The 2009 Summit took place on October 3, 2009, at the 92nd Street Y in New York City. The 2010 Summit was held on August 14–15, 2010, at the Hyatt Regency in San Francisco.[17]

References

  1. Stefanie Olsen, "Coming to grips with intelligent machines", CNET, September 7, 2007.
  2. "Smarter than thou?", San Francisco Chronicle, May 12, 2006.
  3. Harry Henderson, Encyclopedia of Computer Science and Technology, p. 433.
  4. "I, Rodney Brooks, Am a Robot".
  5. "Artificial Intelligence Conference in S.J. this week", San Jose Mercury News, October 24, 2008, p. 3E (Business).
  6. "Scientists Fear Day Computers Become Smarter Than Humans", September 12, 2007.
  7. Flare Programming Language Project.
  8. "Features", flarelang.sourceforge.net.
  9. Levels of Organization in General Intelligence.
  10. "What is the Singularity".
  11. "Why Work Toward the Singularity".
  12. "Humanity 2.0: transhumanists believe that human nature's a phase we'll outgrow, like adolescence. Someday we'll be full-fledged adult posthumans, with physical and intellectual powers of which we can now only dream. But will progress really make perfect?"
  13. AsimovLaws.com.
  14. Singularity Challenge.
  15. Martin Kaste, "The Singularity: Humanity's Last Invention?", National Public Radio.
  16. Tom Abate, "Public meeting will re-examine future of artificial intelligence", SFGate.com, September 7, 2007.
  17. "Silicon Valley tycoon embraces sci-fi future", MSNBC Tech & Science.
