In a guest column, “Computers vs. Brains,” published on The New York Times’s Opinionator blog, Sandra Aamodt and Sam Wang analyzed some of the arguments made by Raymond Kurzweil, one of the leading inventors of our time, in his most recent futurist manifesto, “The Singularity Is Near: When Humans Transcend Biology” (2005). Kurzweil predicts that machines will inevitably surpass our thinking capabilities within a few decades, speculative reasoning that has been heavily debated and challenged.
In their article, Aamodt and Wang point out fundamental differences between our brains and computers that make Kurzweil’s predictions improbable. The purpose of this essay is to evaluate the arguments of both sides: Kurzweil’s book and Aamodt and Wang’s article. I will attempt to accomplish this by using various critical thinking methods, such as defining, clarifying, and explaining some of the history of the concepts and debates involved. To understand the debate, we must first clarify what a technological singularity is.
A technological singularity is the moment at which technological development becomes so rapid that the future beyond it is unpredictable. Writers on the singularity, such as Raymond Kurzweil, define the concept in terms of the technological creation of super-intelligence (Kurzweil, 2005). The article points out that any comparison of the brain and computers misses the messy truth about the fundamental differences between them.
The article provides various reasons why the brain is superior to computers and ways in which it is not. The debate focuses on differences in energy consumption, information-processing strategies and capacity, and the pros and cons of artificial versus biological design. The brain contains many systems that evolved through natural selection for one task and were later adapted for another.
It is more efficient for nature to adapt an old system than to build a new one. As a result, the brain is composed of the brain stem, the limbic system, and the cerebral cortex, which carry out a complex communication with each other that we have yet to decipher. Engineers, however, have the advantage of being able to start over and get the design just right. A persistent problem, not just with artificial intelligence but with all machines, is the tendency of components to fail. Our biological neurons and synapses also fail all the time, even under healthy conditions, but unlike a computer’s components, new neural connections can form as well as break throughout a lifetime, providing a nearly limitless potential for new pathways and brain activity. The human brain uses 12 watts, less than a typical refrigerator light, while the memory of an artificial brain would use nearly a gigawatt, the amount of power currently consumed by all of Washington, D.C. (Aamodt & Wang, 2009). To address this challenge, Kurzweil invokes Moore’s Law, named after Intel co-founder Gordon E. Moore, who originally described the trend in 1965. In his paper, Moore noted that the number of transistors in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and he predicted that the trend would continue “for at least ten years” (Moore, 1965).
His prediction has proved remarkably accurate, in part because most technology and electronics industries now use the principle to set targets for research and development (Disco & Meulen, 1998). For the last four decades, memory and chip capacity have doubled roughly every one to two years, a trend estimated to continue until sometime between 2025 and 2030. The article claims that Kurzweil overlooks the fact that power consumption per chip has also increased immensely since 1985 (Aamodt & Wang, 2009).
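Moore’s observation is simple compounding: a quantity that doubles every one to two years grows exponentially. A minimal Python sketch of this arithmetic (my own illustration, not from the sources; the baseline of roughly 2,300 transistors for the 1971 Intel 4004 is a commonly cited figure):

```python
# Illustrative projection under Moore's Law: the count doubles once
# every `doubling_period` years from a given baseline.
def moores_law(start_count, start_year, target_year, doubling_period=2):
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Starting from ~2,300 transistors in 1971 (roughly the Intel 4004),
# 40 years of two-year doublings gives 20 doublings:
print(f"{moores_law(2_300, 1971, 2011):,.0f}")  # ~2.4 billion transistors
```

Twenty doublings multiply the baseline by about a million, which is why even a slow-sounding doubling period produces such dramatic growth over a few decades.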
While this is true, it implies that electrical power consumption will continue to grow while power-generation and storage technologies such as batteries, fuel cells, and renewable energy remain stagnant. The problem with this logic is that history and technological advancement do not evolve at a constant pace; in fact, the capabilities of many digital electronic devices, including their energy efficiency, are also linked to Moore’s Law. Revolutionary new technologies are emerging quickly to address the growing need for electrical power generation and storage.
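To put the article’s power figures in perspective, a back-of-the-envelope calculation (my own arithmetic, using the 12-watt and near-gigawatt figures cited above):

```python
# Illustrative arithmetic on the article's power figures.
brain_watts = 12        # human brain (Aamodt & Wang, 2009)
machine_watts = 1e9     # "nearly a gigawatt" for an artificial brain's memory

ratio = machine_watts / brain_watts
print(f"about {ratio:,.0f} times the brain's power budget")  # about 83,333,333 times
```

A gap of roughly eight orders of magnitude is exactly the scale of challenge that the emerging energy technologies discussed next would need to help close.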
One example comes from teams of MIT scientists who have created a synthetic, self-assembling chloroplast that can break apart and reassemble repeatedly, a self-restoring solar cell (Dillow, 2010). Another example is a company called Bloom Energy, which produces small fuel cells called “Bloom Boxes.” Two of these can power an average U.S. home. Each device is about the size of a standard brick; although they must be housed in a larger unit that takes in an energy source, the complete system is still only about the size of a refrigerator.
This alternative is already being tested by companies such as Google and eBay (Siegler, 2010). One striking feature of brain tissue is its compactness. The memory capacity in this small volume is potentially immense.
For computer capacity to begin to approach that of a human brain, manufacturers will likely need to experiment with new production techniques and materials as we near the physical limits of silicon and the other materials used in current chips and memory. As humans have evolved, we have developed the ability to make fast inferences in very complex situations. We can make logical approximations and find “good enough” solutions. This type of decision-making will undoubtedly be hard to match, yet there are already robots autonomously accomplishing seemingly simple tasks, such as matching socks and playing soccer, that are actually very complicated.
One good example is ASIMO, currently the world’s most advanced humanoid robot and the culmination of two decades of research by Honda engineers. ASIMO can run, walk on uneven slopes and surfaces, turn smoothly, climb stairs, reach for and grasp objects, and comprehend and respond to simple voice commands. ASIMO also has some face-recognition capability, can map its environment and register stationary objects, and can avoid moving obstacles as it moves through its environment (Honda.com). However, humans bring a large amount of background information to bear on simple tasks, allowing us to make inferences that are difficult for machines. Even the most advanced computer has trouble telling a dog and a cat apart, something a toddler can do without any effort.
Regarding the claim that engineers can learn from brain strategies (Aamodt & Wang, 2009), it should be noted that researchers and engineers are constantly trying to mimic the tricks that millions of years of evolution and development have taught biology. If engineers can recreate these tricks and shortcuts, we may end up with computers that share our imperfections. This may not be exactly what we want, but it could lead to better “soft” judgment, or so-called “fuzzy” reasoning, from computers (Aamodt & Wang, 2009). What determines a computer’s ability to make such inferences is the artificial-intelligence software, not the hardware.
In addition, several devices can share computing resources through wireless technologies and the implementation of cloud-computing strategies. Cloud computing refers to a shift toward internet-based computing, in which shared software or information is provided on demand. Why bother? The article concludes that we should not pursue the goal of sophisticated artificial intelligence, since human beings are already capable of fulfilling all those requirements; to create a new one, all it takes is a fertile couple with the resources to nurture their child. Although it will eventually be possible to design devices that imitate human behavior, we already have such an incredible device in humans. I do agree that humans have achieved an intellect and self-awareness that seems incomprehensible at times. Yet the article misses one of the most important arguments of Kurzweil’s discussion: that the functionality of the brain is computable by technology we can build in the near future.
We must take into account that preliminary research already suggests that prolonged use of such devices is affecting the way our minds work. “The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists. Our brains are evolving; she and other researchers compare digital stimulation less to drugs and alcohol than to food and sex, which are essential but counterproductive in excess (Richtel, 2010). We must also remember that Kurzweil describes the technological singularity as resulting from three different technologies and disciplines: genetics, nanotechnology, and robotics (which includes artificial intelligence). Genetic modification is the manipulation of an organism’s genetic material in a way that does not occur under natural conditions.
Nanotechnology is the study of controlling matter on a molecular scale to create new materials and devices with a wide range of applications, such as in medicine, electronics, and energy production. Robotics draws on electronics, mechanics, and software. Kurzweil argues that we are reaching new levels of refinement in our technology that will allow us to merge the biological and the artificial to create higher forms of life and intelligence. The article fails to make a persuasive argument against the inevitability of the technological singularity, or a persuasive case for not pursuing this goal. There are many reasons why people will aim to achieve it, such as vanity or medical and cognitive compensation and enhancement. We must not assume that the only goal in producing such technology is to emulate humans; the aim is also to augment and even transcend our biology, as the book’s title suggests.
References

Aamodt, S., & Wang, S. (2009, March). Guest column: Computers vs. brains. The New York Times. Retrieved from http://opinionator.blogs.nytimes.com/2009/03/31/guest-column-computers-vs-brains/

Dillow, C. (2010). MIT’s self-assembling solar cells recycle themselves repeatedly, just like plant cells.