I have just finished reading Infinite Reality by Jim Blascovich and Jeremy Bailenson. It is very easy to read and full of firsthand lab experiments. I have put the book cover info here. On page 7 they write, “This book aims to indulge the reader’s curiosity not only for whatever is just around the corner virtually, but also for the distant future.” Strangely, there is nothing about self-programming artificial intelligence, which will shape the future of virtual and physical reality. I have put a little info about this here. Nor is there anything in their book about the universal/molecular assembler, which is quite possible in the distant future. Self-programming artificial intelligence will also be programming molecules. Take this to the scale of the entire cosmic universe. I am putting a little info about Programming the Universe here too. As you can see from all these discussions, we really need an automatic search engine working as a virtual scientist/writer to overcome these human failures. This is my automatic internet search engine, which I call AutoSEN. I am seeking technical and financial partners for my AutoSEN project. Please follow me on Facebook, Twitter, YouTube, and my blog at watson4president.wordpress.com
Self-Programming in AGI Systems
Call for Papers
The behavior of a computer system consists of a sequence of operations. A major difference between conventional systems and intelligent systems is that the former follow predetermined programs provided by human programmers, while the latter are capable of “self-programming,” in the sense that their behavior is not always explicitly specified by a human but is instead “decided by the system itself” to various degrees. In a broad sense, many existing AI techniques can be considered capable of self-programming, including, for example, search, planning, production systems, genetic programming, inductive logic programming, reinforcement learning, and reactive or adaptive agents and robots. Therefore, self-programming is often achieved via learning, though there are other possibilities.
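A toy illustration of self-programming in the broad sense described above: the system searches a space of candidate programs and adopts one whose behavior it chose itself from data, rather than one fixed in advance by a human programmer. This is only a sketch; the tiny expression grammar and the search-by-enumeration strategy are invented for the example.

```python
# A minimal program-synthesis sketch: the system "writes" a small program
# (an arithmetic expression in x) by searching for one that fits examples.
import itertools

def synthesize(examples):
    """Find an expression f(x) matching (input, output) examples by exhaustive search."""
    templates = ["x + {c}", "x * {c}", "x ** 2 + {c}"]   # invented toy grammar
    for template, c in itertools.product(templates, range(-5, 6)):
        expr = template.format(c=c)
        f = eval(f"lambda x: {expr}")
        if all(f(x) == y for x, y in examples):
            return expr             # the program the system selected for itself
    return None

# The system discovers a program equivalent to "x * 3" from the data alone:
print(synthesize([(1, 3), (2, 6), (4, 12)]))
```

Search, genetic programming, and inductive logic programming can all be viewed as more sophisticated versions of this loop: generate candidate behavior, test it, keep what works.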
The AGI-11 Workshop on “Self-Programming in AGI Systems” will provide an opportunity for AGI researchers to discuss these questions.
The workshop is open to submissions that address the above and related questions. A submission can be in the form of a full-length paper or a short position statement, as specified in the CFP of AGI-11.
Submissions should be emailed to deon@iiim.is by June 30, 2011. Acceptance notifications will be sent out by July 15, 2011.
All accepted submissions will be posted on a webpage dedicated to the workshop, and the possibility of future publication (such as a special issue of a journal) will be explored.
Each accepted submission will be presented orally at the workshop, followed by general discussions.
Kristinn R. Thórisson (Reykjavik University, thorisson@ru.is)
Thursday, August 4th, 09:00–13:00. Seven accepted papers: five full papers and two short papers. Presentations will run about 20 minutes for full papers and 10 minutes for short or position papers.
A molecular assembler is a “proposed device able to guide chemical reactions by positioning reactive molecules with atomic precision”, as defined by K. Eric Drexler. Some biological molecules such as ribosomes fit this definition, since they receive instructions from messenger RNA and then assemble specific sequences of amino acids to construct protein molecules. However, the term “molecular assembler” usually refers to theoretical human-made devices.
Beginning in 2007, the British Engineering and Physical Sciences Research Council has funded development of ribosome-like molecular assemblers. Clearly, molecular assemblers are possible in this limited sense. A technology roadmap project, led by the Battelle Memorial Institute and hosted by several U.S. National Laboratories, has explored a range of atomically precise fabrication technologies, including both early-generation and longer-term prospects for programmable molecular assembly; the report was released in December 2007.
Likewise, the term “molecular assembler” has been used in science fiction and popular culture to refer to a wide range of fantastic atom-manipulating nanomachines, many of which may be physically impossible in reality. Much of the controversy regarding “molecular assemblers” results from the confusion in the use of the name for both technical concepts and popular fantasies. In 1992, Drexler introduced the related but better-understood term “molecular manufacturing,” which he defined as the programmed “chemical synthesis of complex structures by mechanically positioning reactive molecules, not by manipulating individual atoms.”
This article mostly discusses “molecular assemblers” in the popular sense. These include hypothetical machines that manipulate individual atoms and machines with organism-like self-replicating abilities, mobility, ability to consume food, and so forth. These are quite different from devices that merely (as defined above) “guide chemical reactions by positioning reactive molecules with atomic precision”.
Because synthetic molecular assemblers have never been constructed, and because of the confusion regarding the meaning of the term, there has been much controversy as to whether “molecular assemblers” are possible or simply science fiction. Confusion and controversy also stem from their classification as nanotechnology, an active area of laboratory research that has already been applied to the production of real products; however, until recently there had been no research efforts into the actual construction of “molecular assemblers”. A primary criticism of the computational research into products of advanced “molecular assemblers” is that the structures investigated are impossible to make today.
A nanofactory is a proposed system in which nanomachines (resembling molecular assemblers, or industrial robot arms) would combine reactive molecules via mechanosynthesis to build larger atomically precise parts. These, in turn, would be assembled by positioning mechanisms of assorted sizes to build macroscopic (visible) but still atomically-precise products.
A typical nanofactory would fit in a desktop box, in the vision of K. Eric Drexler published in Nanosystems: Molecular Machinery, Manufacturing and Computation (1992), a notable work of “exploratory engineering”. During the last decade, others have extended the nanofactory concept, including an analysis of nanofactory convergent assembly by Ralph Merkle, a systems design of a replicating nanofactory architecture by J. Storrs Hall, Forrest Bishop’s “Universal Assembler”, the patented exponential assembly process by Zyvex, and a top-level systems design for a “primitive nanofactory” by Chris Phoenix (Director of Research at the Center for Responsible Nanotechnology). All of these nanofactory designs (and more) are summarized in Chapter 4 of Kinematic Self-Replicating Machines (2004) by Robert Freitas and Ralph Merkle. The Nanofactory Collaboration, founded by Freitas and Merkle in 2000, is a focused ongoing effort involving 23 researchers from 10 organizations and 4 countries that is developing a practical research agenda specifically aimed at positionally-controlled diamond mechanosynthesis and diamondoid nanofactory development.
In 2005, a computer-animated short film of the nanofactory concept was produced by John Burch, in collaboration with Drexler. Such visions have been the subject of much debate, on several intellectual levels. No one has discovered an insurmountable problem with the underlying theories and no one has proved that the theories can be translated into practice. However, the debate continues, with some of it being summarized in the Molecular nanotechnology article.
If nanofactories could be built, severe disruption to the world economy would be one of many possible negative impacts, though it could be argued that this disruption would have little negative effect if everyone had such nanofactories. Great benefits also would be anticipated. Various works of science fiction have explored these and similar concepts. The potential for such devices was part of the mandate of a major UK study led by mechanical engineering professor Dame Ann Dowling. The report is now complete.
“Molecular assemblers” have been confused with self-replicating machines. Because a typical science-fiction universal molecular assembler is nanoscale, producing a practical quantity of a desired product would require an extremely large number of such devices. However, a single theoretical molecular assembler might be programmed to self-replicate, constructing many copies of itself and allowing an exponential rate of production. Once sufficient quantities of molecular assemblers were available, they would be re-programmed to produce the desired product. However, if the self-replication of molecular assemblers were not restrained, it might lead to competition with naturally occurring organisms. This has been called ecophagy, or the grey goo problem.
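The arithmetic behind this exponential strategy is easy to check. Assuming, purely for illustration, that each assembler builds one copy of itself per replication cycle, the number of doubling cycles needed to reach any target population is only logarithmic in that target:

```python
# Illustration of exponential self-replication: how many doubling cycles
# does one assembler need to reach a practical quantity of devices?
# The doubling model is an assumption made for this example.

def generations_needed(target_count: int) -> int:
    """Doubling cycles for one assembler to reach at least target_count copies."""
    count, generations = 1, 0
    while count < target_count:
        count *= 2          # each assembler builds one copy of itself per cycle
        generations += 1
    return generations

# Reaching a mole-scale batch (~6 x 10^23 devices) takes only 79 doublings:
print(generations_needed(6 * 10**23))
```

This is why a single self-replicating assembler, rather than mass fabrication of assemblers one by one, is the scenario usually discussed; it is also why unrestrained replication is treated as a serious safety concern.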
One method of building molecular assemblers is to mimic the evolutionary processes employed by biological systems. Biological evolution proceeds by random variation combined with culling of the less-successful variants and reproduction of the more-successful variants. Production of complex molecular assemblers might be evolved from simpler systems, since “A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and can not be patched up to make it work. You have to start over, beginning with a system that works.” However, most published safety guidelines include “recommendations against developing … replicator designs which permit surviving mutation or undergoing evolution”.
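The variation-plus-selection loop described above can be sketched in a few lines. This is only an abstract illustration: the “assembler” is reduced to a bit string scored against a target design, and the target, mutation rate, and population size are all invented for the example.

```python
# Minimal evolutionary loop: random variation, culling of less-successful
# variants, and reproduction of the more-successful ones.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical "design that works"

def fitness(candidate):
    """Count the bits that match the target design."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    """Random variation: flip each bit with a small probability."""
    return [1 - bit if random.random() < rate else bit for bit in candidate]

def evolve(pop_size=20, generations=200):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)   # cull the less successful
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(s) for s in survivors]  # reproduce with variation
    return max(population, key=fitness)

random.seed(0)                       # fixed seed so the sketch is repeatable
best = evolve()
```

Note that this is exactly the kind of open-ended variation the safety guidelines quoted above recommend against building into physical replicators.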
Most assembler designs keep the “source code” external to the physical assembler. At each step of a manufacturing process, that step is read from an ordinary computer file and “broadcast” to all the assemblers. If any assembler gets out of range of that computer, if the link between the computer and the assemblers is broken, or if the computer is unplugged, the assemblers stop replicating. Such a “broadcast architecture” is one of the safety features recommended by the “Foresight Guidelines on Molecular Nanotechnology”, and a map of the 137-dimensional replicator design space recently published by Freitas and Merkle provides numerous practical methods by which replicators can be safely controlled by good design.
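The safety property of the broadcast architecture can be sketched as a small simulation: an assembler that stores no program of its own executes only the steps it actively receives, so it halts the moment the broadcast disappears. The class and step names below are invented for illustration.

```python
# Sketch of the "broadcast architecture": assemblers hold no internal program
# and act only on steps delivered by a central controller.

class BroadcastAssembler:
    def __init__(self):
        self.log = []               # steps actually executed

    def receive(self, step):
        """Execute a step only while the broadcast link delivers one."""
        if step is None:            # out of range, link broken, or controller unplugged
            return False            # the assembler simply stops
        self.log.append(step)
        return True

def run(controller_steps, link_up):
    """Feed broadcast steps to an assembler; link_up[i] is False when the link drops."""
    assembler = BroadcastAssembler()
    for step, up in zip(controller_steps, link_up):
        if not assembler.receive(step if up else None):
            break                   # halts as soon as the broadcast disappears
    return assembler.log

# Steps after the link drops are never executed:
print(run(["bond A", "bond B", "bond C", "bond D"], [True, True, False, True]))
```

The design choice is that safety comes from what the device *lacks* (stored instructions), not from an added shutdown mechanism that could fail or mutate.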
Drexler and Smalley debate
Main article: Drexler–Smalley debate on molecular nanotechnology
One of the most outspoken critics of some concepts of “molecular assemblers” was Professor Richard Smalley (1943–2005), who won the 1996 Nobel Prize in Chemistry for the discovery of fullerenes. Smalley believed that such assemblers were not physically possible and raised scientific objections to them. His two principal technical objections were termed the “fat fingers problem” and the “sticky fingers problem”; he believed these would exclude the possibility of “molecular assemblers” that worked by precisely picking and placing individual atoms. Drexler and coworkers responded to these two issues in a 2001 publication.
Smalley also believed that Drexler’s speculations about the apocalyptic dangers of self-replicating machines equated with “molecular assemblers” would threaten public support for the development of nanotechnology. To address the debate between Drexler and Smalley regarding molecular assemblers, Chemical & Engineering News published a point-counterpoint consisting of an exchange of letters addressing the issues.
Speculation on the power of systems that have been called “molecular assemblers” has sparked a wider political discussion on the implications of nanotechnology. This is in part because nanotechnology is a very broad term that could include “molecular assemblers.” Discussion of the possible implications of fantastic molecular assemblers has prompted calls for regulation of current and future nanotechnology. There are very real concerns about the potential health and ecological impact of nanotechnology being integrated into manufactured products. Greenpeace, for instance, commissioned a report on nanotechnology expressing concern about the toxicity of nanomaterials introduced into the environment; however, it makes only passing references to “assembler” technology. The UK Royal Society and Royal Academy of Engineering also commissioned a report, “Nanoscience and nanotechnologies: opportunities and uncertainties,” regarding the larger social and ecological implications of nanotechnology. This report does not discuss the threat posed by potential so-called “molecular assemblers.”
Formal scientific review
In 2006, the U.S. National Academy of Sciences released a study of molecular manufacturing as part of a longer report, A Matter of Size: Triennial Review of the National Nanotechnology Initiative. The study committee reviewed the technical content of Nanosystems, and in its conclusion stated that no current theoretical analysis can be considered definitive regarding several questions of potential system performance, and that optimal paths for implementing high-performance systems cannot be predicted with confidence. It recommended experimental research to advance knowledge in this area:
“Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths that might lead to systems which greatly exceed the thermodynamic efficiencies and other capabilities of biological systems cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal.”
Main article: Grey goo
One potential scenario that has been envisioned is out-of-control self-replicating molecular assemblers in the form of grey goo which consumes carbon to continue its replication. If unchecked such mechanical replication could potentially consume whole ecoregions or the whole Earth (ecophagy), or it could simply outcompete natural lifeforms for necessary resources such as carbon, ATP, or UV light (which some nanomotor examples run on). It is worth noting that the ecophagy and ‘grey goo’ scenarios, like synthetic molecular assemblers, are based upon still-theoretical technologies that have not yet been demonstrated experimentally.
Main article: Nanotechnology in fiction
Molecular assemblers are a popular topic in science fiction, for example, the matter compiler in The Diamond Age and the cornucopia machine in Singularity Sky. The replicator in Star Trek might also be considered a molecular assembler. A molecular assembler is also a key element of the plot of the computer game Deus Ex (called a “universal constructor” in the game).
Programming the Universe
From Wikipedia, the free encyclopedia
Programming the Universe is a 2006 popular science book by Seth Lloyd, professor of mechanical engineering at the Massachusetts Institute of Technology. The book proposes that the universe is a quantum computer, and that advances in the understanding of physics may come from viewing entropy as a phenomenon of information rather than simply thermodynamics. Lloyd also postulates that the universe can be fully simulated using a quantum computer; however, in the absence of a theory of quantum gravity, such a simulation is not yet possible.
Reviewer Corey S. Powell of The New York Times writes:
In the space of 221 dense, frequently thrilling and occasionally exasperating pages, … tackles computer logic, thermodynamics, chaos theory, complexity, quantum mechanics, cosmology, consciousness, sex and the origin of life — throwing in, for good measure, a heartbreaking afterword that repaints the significance of all that has come before. The source of all this intellectual mayhem is the kind of Big Idea so prevalent in popular science books these days. Lloyd, a professor of mechanical engineering at M.I.T., takes as his topic the fundamental workings of the universe…, which he thinks has been horribly misunderstood. Scientists have looked at it as a ragtag collection of particles and fields while failing to see what it is as a majestic whole: an enormous computer.
In an interview with Wired magazine, Lloyd says:
everything in the universe is made of bits. Not chunks of stuff, but chunks of information — ones and zeros. … Atoms and electrons are bits. Atomic collisions are “ops.” Machine language is the laws of physics. The universe is a quantum computer.
[Lloyd] offers brilliantly clarifying explanations of the “bit,” the smallest unit of information; how bits change their state; and how changes of state can be registered on atoms via quantum-mechanical qualities such as “spin” and “superposition.” Putting readers in the know about quantum computation, Lloyd then informs them that it may well be the answer to physicists’ search for a unified theory of everything. Exploring big questions in accessible, comprehensive fashion, Lloyd’s work is of vital importance to the general-science audience.