Friday, October 17, 2014

This Not That in Determining Computer Consciousness

In Daniel Bor's "The Ravenous Brain," he begins with the philosophical concept of consciousness and notes that many neuroscientists view the brain as quite similar to a computer. With this idea in mind, he explains the Chinese Room argument: a randomly chosen man who does not know Mandarin must enter a room and answer questions that a group of Chinese tourists send him through an IN box; once he answers, he passes his responses back through the OUT box. He is able to complete this task with the help of a book supplied by a researcher, who has bet the tourists that the man can speak Mandarin. The book shows the man how to convert certain Chinese characters into others, allowing him to answer every question the tourists ask. Once this exchange is complete, the man stays in the room and the researcher has the same conversation with him in English, in which he participates much more enthusiastically.
The key difference that Bor points out is that when the man was answering in Mandarin, he did not understand the meaning and therefore had no conscious grasp of the exchange. When he was answering in English, he clearly had consciousness and understanding, because English is his native language. The book of Mandarin rules, much like a computer program, contained only guidelines to follow, which is why rule-following alone does not amount to consciousness in computers.
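The mechanical character of the rule book can be sketched as a simple lookup table. This is a hypothetical illustration, not an example from Bor's book, and the question and answer strings are placeholders:

```python
# A toy sketch of the Chinese Room rule book: a purely syntactic
# mapping from input symbols to output symbols. The "operator"
# produces correct-looking answers without any grasp of meaning.
# These question/answer pairs are hypothetical placeholders.

RULE_BOOK = {
    "你好吗?": "我很好。",          # "How are you?" -> "I am fine."
    "你叫什么名字?": "我叫李明。",  # "What is your name?" -> "My name is Li Ming."
}

def chinese_room(question: str) -> str:
    """Apply the rule book mechanically; no understanding is involved."""
    return RULE_BOOK.get(question, "对不起。")  # default reply: "Sorry."

print(chinese_room("你好吗?"))  # a fluent reply, with zero comprehension
```

The point of the sketch is that the function never represents meaning at all; it only shuffles symbols, exactly like the man with the book.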
A similar situation appears in the article "A Test for Consciousness" by Christof Koch and Giulio Tononi. In discussing how to test a computer for consciousness, they mention the Turing test, which Bor had also discussed. They recount how certain computer programs have carried on chats so effectively that people did not realize they were talking with a computer. This leads the authors to question whether computers can have consciousness, and they bring in the "integrated information theory of consciousness." It rests on two claims: that consciousness is both highly informative and highly integrated. By informative, they mean that the brain rules out a vast number of alternative possibilities depending on the signals it is exposed to at the time. It is integrated in that the whole picture is absorbed as a unified experience, not just bits and pieces of it.
With these properties in mind, they use the Greek letter phi to quantify what they call "the system's capacity for integrated information," which they treat as a measure of consciousness. Phi is low when a system's elements are independent of one another and high when there are multiple connections among those elements.
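The intuition that independent elements score low and densely interconnected ones score high can be illustrated with a crude connection count. This is emphatically not Tononi's actual phi computation, which involves partitioning a system and comparing information measures; it is only a toy proxy for the underlying idea:

```python
# A crude proxy for the intuition behind phi: count the directed links
# among a system's elements. Independent elements score zero; densely
# interconnected elements score high. This is NOT the real phi measure
# from integrated information theory, just an illustrative stand-in.

def connection_count(adjacency: dict) -> int:
    """Total number of directed links among the elements."""
    return sum(len(targets) for targets in adjacency.values())

# Independent elements: no links between them, so the proxy score is 0.
independent = {"a": set(), "b": set(), "c": set()}

# Integrated elements: every node links to every other node.
nodes = ["a", "b", "c"]
integrated = {n: {m for m in nodes if m != n} for n in nodes}

print(connection_count(independent))  # 0
print(connection_count(integrated))   # 6
```

In the brain, as the post goes on to note, each neuron connects to thousands of others, which on this intuition pushes the score very high.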
This parallels Bor's point that if the brain really worked like a conventional computer, one neuron would perform only one task and would connect to only one or a few other neurons. In reality, neurons have multiple functions and are connected to thousands of other neurons, which makes it far easier to process information, such as a picture whose contents are out of context. This is exactly where a computer fails: shown a picture of a computer, keyboard, and mouse, and then one of a computer, plant, and mouse, it is not capable of distinguishing what is wrong. The human brain, through its consciousness and vast network of connections, immediately recognizes that the plant is out of place in the picture.
Perhaps in the future, computers will be able to process the complex situations that humans face on a daily basis. Until then, computers will not function at the same capacity as the brain, with its many inputs and signals. Computers are coming ever closer to consciousness, but for now they remain tools of the human consciousness.

Bor, Daniel. The Ravenous Brain: How the New Science of Consciousness Explains Our Insatiable Search for Meaning. Basic, 2012. Print.

Koch, Christof, and Giulio Tononi. "A Test for Consciousness." Scientific American, 1 June 2011. Web. 1 Oct. 2014. <http://www.scientificamerican.com/article/a-test-for-consciousness/>.
