Technological singularity

The technological singularity is a hypothesized point in the future variously characterized by the technological creation of self-improving intelligence, unprecedentedly rapid technological progress, or some combination of the two.[1]

Statistician I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unseen by their designers, and thus recursively augment themselves into far greater intelligences. Vernor Vinge later called this event "the Singularity" as an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would occur following an intelligence explosion. In the 1980s, Vinge popularized the Singularity in lectures, essays, and science fiction. More recently, some AI researchers have voiced concern over the potential dangers of Vinge's Singularity.

Others, most prominently Ray Kurzweil, define the Singularity as a period of extremely rapid technological progress. Kurzweil argues such an event is implied by a long-term pattern of accelerating change that generalizes Moore's Law to technologies predating the integrated circuit, and which he argues will extend to technologies not yet invented.
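
As an illustration of this kind of extrapolation (a minimal sketch with invented numbers, not Kurzweil's actual model), a capability that doubles at a fixed interval projects forward as a simple exponential:

```python
# Minimal sketch of a Moore's-Law-style extrapolation. The starting value and
# doubling period are illustrative placeholders, not measured data.

def extrapolate(value_now: float, doubling_years: float, years_ahead: float) -> float:
    """Project a quantity that doubles every `doubling_years` years."""
    return value_now * 2 ** (years_ahead / doubling_years)

# A hypothetical capability at 1e10 units, doubling every 2 years:
for years in (10, 20, 40):
    print(years, extrapolate(1e10, 2.0, years))
```

The dispute between Kurzweil and his critics is not over this arithmetic but over whether the doubling period itself stays fixed, shrinks, or grows.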

Critics of Kurzweil's interpretation consider it an example of static analysis, citing particular failures of the predictions of Moore's Law. The Singularity also draws criticism from advocates of anarcho-primitivism and environmentalism.

Following its introduction in Vinge's stories, the Singularity has also become a common plot element throughout science fiction.

Intelligence explosion

I. J. Good (1965) speculated on the consequences of machines smarter than humans:

"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion', and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make."

Mathematician and author Vernor Vinge greatly popularized Good's notion of an intelligence explosion in the 1980s, calling the creation of the first ultraintelligent machine the Singularity. Vinge first addressed the topic in print in the January 1983 issue of Omni magazine. His 1993 essay "The Coming Technological Singularity" contains the oft-quoted statement, "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly thereafter, the human era will be ended." Vinge clarifies his estimate of the time scales involved, adding, "I'll be surprised if this event occurs before 2005 or after 2030."

Vinge continues by predicting that superhuman intelligences, however created, will be able to enhance their own minds faster than the humans who created them. "When greater-than-human intelligence drives progress," Vinge writes, "that progress will be much more rapid." This feedback loop of self-improving intelligence, he predicts, will cause large amounts of technological progress within a short period of time.
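
Vinge's feedback loop can be made concrete with a toy simulation (a sketch built on arbitrary assumptions, not a model of any real AI system): if each generation of designer improves itself in proportion to its own intelligence and also finishes its successor faster, capability diverges while the time between generations shrinks toward zero, which is the "much more rapid" progress Vinge describes.

```python
# Toy model of recursive self-improvement. The growth rule and constants are
# arbitrary illustrations: each generation improves in proportion to its own
# intelligence and designs its successor in time 1/intelligence.

def intelligence_explosion(i0=1.0, gain=0.1, cap=1e12):
    """Yield (elapsed_time, intelligence) for successive designer generations."""
    intelligence, elapsed = i0, 0.0
    while intelligence < cap:
        elapsed += 1.0 / intelligence              # smarter designers finish sooner
        intelligence *= 1.0 + gain * intelligence  # ...and improve themselves more
        yield elapsed, intelligence

for t, i in intelligence_explosion():
    print(f"t = {t:7.3f}   intelligence = {i:.3g}")
```

With these (arbitrary) parameters, total elapsed time converges toward a finite value even as intelligence grows without bound: a discrete analogue of a finite-time singularity.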

Most proposed methods for creating smarter-than-human or transhuman minds fall into one of two categories: intelligence amplification of human brains and artificial intelligence. The means speculated to produce intelligence augmentation are numerous, and include bio- and genetic engineering, nootropic drugs, AI assistants, direct brain-computer interfaces, and mind transfer. Despite the numerous speculated means for amplifying human intelligence, non-human artificial intelligence (specifically seed AI) is the most popular option for organizations trying to directly initiate the Singularity, a choice addressed by the Singularity Institute for Artificial Intelligence. Robin Hanson is also skeptical of human intelligence augmentation, writing that once one has exhausted the "low-hanging fruit" of easy methods for increasing human intelligence, further improvements will become increasingly difficult to find.

Potential dangers

Some speculate superhuman intelligences may have goals inconsistent with human survival and prosperity. AI researcher Hugo de Garis suggests AIs may simply eliminate the human race, and humans would be powerless to stop them. Other oft-cited dangers include those commonly associated with molecular nanotechnology and genetic engineering. These threats are major issues for both singularity advocates and critics, and were the subject of Bill Joy's 2000 Wired magazine article "Why the future doesn't need us".

Bostrom (2002) discusses human extinction scenarios, and lists superintelligence as a possible cause:

"When we create the first superintelligent entity, we might make a mistake and give it goals that lead it to annihilate humankind, assuming its enormous intellectual advantage gives it the power to do so. For example, we could mistakenly elevate a subgoal to the status of a supergoal. We tell it to solve a mathematical problem, and it complies by turning all the matter in the solar system into a giant calculating device, in the process killing the person who asked the question."

Moravec (1992) argues that although super-intelligence in the form of machines may make humans in some sense obsolete as the top intelligence, there will still be room in the ecology for humans.

Some AI researchers have made efforts to diminish what they view as potential dangers associated with the singularity. The Singularity Institute for Artificial Intelligence is a nonprofit research institute for the study and advancement of Friendly Artificial Intelligence, a method proposed by SIAI research fellow Eliezer Yudkowsky for ensuring the stability and safety of AIs that experience Good's "intelligence explosion". AI researcher Bill Hibbard also addresses issues of AI safety and morality in his book Super-Intelligent Machines.

Isaac Asimov's Three Laws of Robotics are one of the earliest examples of proposed safety measures for AI. The laws are intended to prevent artificially intelligent robots from harming humans. In Asimov's stories, any perceived problems with the laws tend to arise as a result of a misunderstanding on the part of some human operator; the robots themselves shut down in the case of a real conflict. On the other hand, in works such as the film I, Robot, which was based very loosely on Asimov's stories, a possibility is explored in which AI take complete control over humanity for the purpose of protecting humanity from itself. In 2004, the Singularity Institute launched an Internet campaign called 3 Laws Unsafe to raise awareness of AI safety issues and the inadequacy of Asimov's laws in particular.

Many Singularitarians consider nanotechnology to be one of the greatest dangers facing humanity. For this reason, they often believe seed AI (an AI capable of making itself smarter) should precede nanotechnology. Others, such as the Foresight Institute, advocate efforts to create molecular nanotechnology, claiming nanotechnology can be made safe for pre-Singularity use or can expedite the arrival of a beneficial Singularity.

Accelerating change

The Acceleration Studies Foundation, a nonprofit educational organization founded by John Smart, engages in outreach, education, research and advocacy concerning accelerating change. It produces the Accelerating Change conference at Stanford University, and maintains the educational site Acceleration Watch.

Criticism

Some critics assert that no computer or machine will ever achieve human intelligence, while others do not rule out the possibility.[2] Theodore Modis and Jonathan Huebner argue that the rate of technological innovation has not only ceased to rise but is actually now declining. John Smart has criticized Huebner's analysis. One piece of evidence cited for a declining rate is the slowing rise in computer clock speeds, although the number of cores per chip is increasing and the cost of chips continues to fall.
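
The clock-versus-cores point is easiest to see with back-of-the-envelope arithmetic (the chips and figures below are hypothetical, not benchmarks of real hardware): peak throughput scales with clock rate times core count, so aggregate performance can keep climbing even while clock speeds stall.

```python
# Illustrative peak-throughput arithmetic; the chips and numbers are hypothetical.

def peak_ops_per_sec(clock_ghz: float, cores: int, ops_per_cycle: int = 4) -> float:
    """Peak operations/second = clock rate x core count x operations per cycle."""
    return clock_ghz * 1e9 * cores * ops_per_cycle

print(peak_ops_per_sec(3.0, 4))    # hypothetical older chip: few cores
print(peak_ops_per_sec(3.5, 64))   # hypothetical newer chip: clock barely higher,
                                   # but 16x the cores gives ~18x the peak throughput
```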

Others propose that additional "singularities" can be found through analysis of trends in world population, world GDP, and other indices. Andrey Korotayev and others argue that historical hyperbolic growth curves can be attributed to feedback loops that ceased to affect global trends in the 1970s, and thus hyperbolic growth should not be expected in the future.
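
The hyperbolic curves at issue have a simple closed form, sketched below with illustrative constants rather than fitted demographic data: growth proportional to the square of the current level diverges at a finite date, which is why naive extrapolation of such a trend produces a "singularity".

```python
# Sketch of the hyperbolic-growth model behind these trend analyses.
# dN/dt = a * N**2 integrates to N(t) = 1 / (a * (t0 - t)), which blows up
# at the finite time t0. The constants below are illustrative, not fitted.

def hyperbolic(t: float, a: float = 5.6e-12, t0: float = 2026.0) -> float:
    """Level N(t) under dN/dt = a * N**2, diverging as t approaches t0."""
    return 1.0 / (a * (t0 - t))

for year in (1900, 1950, 2000, 2020, 2025):
    print(year, f"{hyperbolic(year):.3e}")
```

Korotayev's argument, on this reading, is that the feedback driving the quadratic term weakened after the 1970s, so the divergence never arrives.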

In "The Progress of Computing", William Nordhaus argues that prior to 1940, computers followed the much slower growth of a traditional industrial economy, thus rejecting extrapolations of Moore's Law to 19th century computers. Template:Harvtxt suggests differences in memory of recent and distant events create an illusion of accelerating change, and that such phenomena may be responsible for past apocalyptic predictions.

Some advocates of anarcho-primitivism and eco-anarchism, such as John Zerzan and Derrick Jensen, see the Singularity as an orgy of machine control and a loss of free existence outside of civilization. Others express a cautionary environmentalist perspective on the Singularity.

Popular culture

While discussing the Singularity's growing recognition, Vinge (1993) writes that "it was the science-fiction writers who felt the first concrete impact." In addition to his own short story "Bookworm, Run!", whose protagonist is a chimpanzee with intelligence augmented by a government experiment, he cites Greg Bear's novel Blood Music as an example of the Singularity in fiction. In William Gibson's 1984 novel Neuromancer, AIs capable of improving their own programs are strictly regulated by special "Turing police" to ensure they never exceed human intelligence, and the plot centers on the efforts of one such AI to circumvent their control. The 1994 novel The Metamorphosis of Prime Intellect features an AI that augments itself so quickly as to gain low-level control of all matter in the universe in a matter of hours. A more malevolent AI achieves similar levels of omnipotence in Harlan Ellison's short story "I Have No Mouth, and I Must Scream". William Thomas Quick's novels Dreams of Flesh and Sand, Dreams of Gods and Men, and Singularities present an account of the transition through the Singularity; in the last of these, a character states that mankind must integrate with the emerging machine intelligences or be crushed under their dominance, presented as the greatest risk to the survival of a species reaching this point. Large numbers of other species are said to have either passed or failed this test, although no actual contact with alien species occurs in the novels.

The Singularity is sometimes addressed in fictional works to explain the event's absence. Neal Asher's Gridlinked series features a future where humans living in the Polity are governed by AIs; while some are resentful, most believe the AIs are far better governors than any human. In the fourth novel, Polity Agent, it is mentioned that the Singularity is far overdue, yet most AIs have decided not to partake in it for reasons that only they know. A character in Ken MacLeod's 1998 novel The Cassini Division dismissively refers to the Singularity as "the Rapture for nerds".

Popular movies in which computers become intelligent and overpower the human race include Colossus: The Forbin Project, the Terminator series, I, Robot, and The Matrix. The television series Battlestar Galactica also explores these themes.

Isaac Asimov expressed ideas similar to a post-Kurzweilian Singularity in his short story "The Last Question". Asimov's future envisions a reality where a combination of strong artificial intelligence and post-humans consume the cosmos, during a time Kurzweil describes as when "the universe wakes up", the last of his six stages of cosmic evolution as described in The Singularity Is Near. Post-human entities throughout various time periods of the story ask the artificial intelligence how entropy death can be avoided. The AI responds that it lacks sufficient information to come to a conclusion, until the end of the story, when it does indeed arrive at a solution and demonstrates it, in godlike speech and fashion, by re-creating the universe from scratch. Notably, it does so in order to fulfill its duty to answer the humans' question.

St. Edward's University chemist Eamonn Healy discusses accelerating change in the film Waking Life. He divides history into increasingly shorter periods, estimating "two billion years for life, six million years for the hominid, a hundred thousand years for mankind as we know it". He proceeds to human cultural evolution, giving time scales of ten thousand years for agriculture, four hundred years for the scientific revolution, and one hundred fifty years for the industrial revolution. Information is emphasized as providing the basis for the new evolutionary paradigm, with artificial intelligence its culmination. He concludes we will eventually create "neohumans" which will usurp humanity's present role in scientific and technological progress and allow the exponential trend of accelerating change to continue past the limits of human ability.
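
The compression Healy describes can be checked directly from the durations he cites; the short tabulation below, using only the figures quoted above, shows each epoch as a multiple of the one that follows it.

```python
# Time scales Healy cites in Waking Life, in years, with the compression
# ratio of each epoch relative to the one that follows it.

epochs = [
    ("life", 2_000_000_000),
    ("hominids", 6_000_000),
    ("modern humans", 100_000),
    ("agriculture", 10_000),
    ("scientific revolution", 400),
    ("industrial revolution", 150),
]

for (name, years), (_, next_years) in zip(epochs, epochs[1:]):
    print(f"{name:>22}: {years:>13,} years  ({years / next_years:6.1f}x the next epoch)")
```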

Accelerating progress features in some science fiction works, and is a central theme in Charles Stross's Accelerando. Other notable authors that address Singularity-related issues include Karl Schroeder, Greg Egan, David Brin, Iain M. Banks, Neal Stephenson, Tony Ballantyne, Bruce Sterling, Dan Simmons, Damien Broderick, Fredric Brown, Jacek Dukaj, Nagaru Tanigawa and Cory Doctorow. Another relevant work is Warren Ellis’ ongoing comic book series newuniversal.

In the episode "The Turk" of Terminator: The Sarah Connor Chronicles, John Connor mentions "the singularity", which he defines as the moment when machines become smart enough to build smarter, more powerful versions of themselves without the help of humans.

In the film Screamers, based on Philip K. Dick's short story "Second Variety", mankind's own weapons begin to design and assemble themselves. Self-replicating machines are often considered a significant prerequisite "final phase", almost a catalyst for the accelerating progress leading to a Singularity. Interestingly, the screamers develop to a level where they will kill each other, and one even professes her love for a human. This idea is common to many of Dick's stories, which explore beyond the simplistic "man versus machine" scenario in which humanity's creations regard it as a threat to be eliminated.

Notes

  1. The Singularity Institute for Artificial Intelligence states: "The Singularity is the technological creation of smarter-than-human intelligence." Kurzweil, however, defines the Singularity as a period of extremely rapid technological progress.
  2. Dreyfus & Dreyfus (2000) argue that human intelligence can never be replaced with machine intelligence because humans are not themselves "thinking machines"; Hawking (1998) counters that if complicated chemical molecules can operate in humans to make them intelligent, then equally complicated electronic circuits can also make computers act in an intelligent way.

External links

Fiction

  • After Life by Simon Funk uses a complex narrative structure to explore the relationships among uploaded minds in a technological singularity.
