November 2, 2010
Besides physics, my other primary area of research is related to visual cognition. Have you guys ever seen the Honda robot Asimo? I’m sure you’ll recognize it from the picture below.
Asimo is one of the most advanced robots in existence. I found an awesome lecture on YouTube by one of the scientists involved in Asimo’s creation. He’s talking about its visual system and how it works. It’s absolutely incredible. If you want to know what the subject of “visual cognition” is all about, jump to 20 minutes into this video and you’ll actually watch Asimo’s systems in action. You get to see out of its camera eyes and watch it process the raw information, track objects, and comprehend the environment.
This is an artificial robot doing the same things your brain does for you automatically. The simplest, most everyday tasks you perform are actually mind-boggling in complexity. The thing is, your brain processes all the information for you without you having to think about it, leaving you with a false sense of simplicity.
I’ve also found some interesting lectures on the same material online. In this next lecture by Dr. Scott Murray, you’ll find out that your visual system heavily processes the raw sensory information from your eyes long before any of it reaches your conscious awareness. This is actually rather cutting-edge research, because his findings related to V1 differ from what my neuroscience textbooks tell me, and they’re only a few years old.
It’s amazing to learn how your brain judges the “size” of an object. Our brains also perform some rather strange feats as they process the color information they receive. By the time you see something, your brain has already greatly filtered and processed the information, fusing it together with the context around it. You’re only conscious of the end result, which is why this fascinates me so much. Understanding all those intermediary steps is a passion of mine.
You’ll be amazed as Dr. Murray moves colored squares in and out of pictures. He’ll hold the square outside of a picture and say, “What color is this?” You’ll say, “It’s gray.” Then when he places it in the picture, it’ll turn blue right before your eyes. Is the square gray? Blue? I don’t know.
As you take a stroll through your backyard, you’d be blown away if you knew all the things your brain processes for you. Say you see a tree several yards away from you. You walk toward it, then away from it, then move around it and view it from an angle, and so on and so forth. Throughout all of those complex changes in sensory impressions, your brain has computed and inferred that there is a tree, and that between your movements, what you’re viewing is the same tree. It’s judged its location relative to other objects. Even when changes take place, such as the limbs bending and the leaves falling, it still knows it’s the same tree. It logs memories of these experiences. It quietly performs and organizes all of this for you. It’s only when you have to design a robot to do all these things that you start to appreciate how complex you are.
I study the brain because it tells us who we really are. I believe the space we’re living within is way more complicated than what our brains are telling us. The space our brains work with is a product of natural selection, and is a rough approximation. Quantum mechanics and relativity point toward a deeper picture. I think to understand some of the more subtle mysteries, it’ll take a deep understanding of everything the brain does.
Physics is plagued by overly simplistic concepts such as “observers” and “objects”. There are models like the “many-worlds” interpretation, in which each decision splits the universe. But what are these “decisions” they’re talking about? If you study neuroscience and the brain, you see that such conceptions are way too simple compared to what’s really going on.
Our brains are almost hard-wired to decode sensory information in a certain way, and that system places constraints on our level of understanding. As Richard Dawkins is always saying, we’ve evolved to move at slow speeds and deal with medium-sized objects. Once you start dealing with the universe at the large scale, such as cosmology, or the small scale (quantum physics), the “virtual reality” our brains decode and immerse us within based on sensory impressions is inaccurate and has to be abandoned.
People talk about “free will” like it’s some ultimately simple thing. It’s not. Philosophers such as John Locke have felt that we are in control of our bodily movements. This simplicity is assumed by many physicists when they talk about our actions influencing and modifying various quantum mechanical experiments. But decisions are not so simple. Take this next lecture, for instance. When you look into your visual system closely, you find out that you actually have two visual cognition systems: one is related to conscious awareness of what you’re seeing, while the other plans bodily movements based on visual information. There are patients with brain damage who live with just one or the other, and it’s pretty strange.
When you do something as simple as picking up a pen off your desk, the calculations and computations your brain performs are almost unbelievable. Multiple brain systems go into operation and billions of neurons fire off. And if you think some simplistic “you” is in control of all that, all it takes is a little brain damage to set you back in your proper place.
I’ll let Professor Goodale explain it. He does a wonderful job.