All posts by Stephanie Baines

Stephanie Baines undertook a PhD in Cognitive Neuroscience at Oxford University. Now a Teaching Fellow at University College London (UCL), she lectures and researches matters such as how reward and emotion influence neural function and behaviour, how we use new information to learn, make decisions and behave, and how unconscious and conscious information influence cognition. A caffeine and chocolate addict, when not pondering brains she loves literature, photography and architecture. You can contact her via Twitter at @neuronerdSB

What is consciousness?

Since time immemorial, one important question has perplexed the greatest minds: what is consciousness? It was once a question that only philosophers would dare try to answer, but scientific advances – specifically in the field of cognitive neuroscience – are allowing us to take a scientific look inside the ‘black box’ of consciousness.

Modern techniques such as functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) now let us examine the brain as we process and interact with our surroundings. Each event triggers a cascade of electrical and chemical processes, yet we are unaware of much of what goes on in our brain. For example, when you look at a picture of a landscape, are you aware of the computations being performed by your brain to determine where the horizon is, what the shapes represent and how you identify birds, trees and sky?

We have both conscious and unconscious processing, but in reality only a tiny proportion of the brain’s activity ever reaches our conscious awareness. One line of evidence suggests that conscious and unconscious thoughts are actually very similar, just at different ‘volumes’. We may hear something, for example, and there is activity in the region of the brain responsible for processing sound. If we become aware of that sound, the activity in that region becomes stronger. It is a bit like a light with a dimmer switch – the light is on for both unconscious and consciously detected items, but it sits on a low setting for the former and is turned up to a brighter setting for the latter.

Other research suggests that different brain regions, or different groups of brain cells, are responsible for unconscious and conscious processing. One team of researchers has shown that in the first 275 milliseconds of being shown an image, only brain cells in the visual cortex (at the back of the brain) are activated. The visual cortex must then send the information forward to the ‘thinking’ regions at the front of the brain (the prefrontal cortex) before the image enters conscious thought.

But if so many processes can run unconsciously, why do we even need consciousness? After all, conscious thoughts are costly: they are slow and demand a lot of the brain’s processing resources. Unconscious ‘zombie systems’ handle stereotyped, fixed and simple responses quickly and cheaply – such as snatching your hand away from a hot element on the stove. The answer is that consciousness offers us much more: a flexible repertoire of responses that allows us to learn from our experiences and adapt to whatever comes our way.

We receive a huge amount of sensory information at any given moment. If we were to process every piece of information, our brain’s processing resources would be so clogged up that by the time we had conscious thoughts about the tiger running towards us, it would have eaten us! Despite this, we have the illusion of perceiving everything around us in a completely unified way. Different pieces of information from a single sense – such as colour, shape, and location – are combined. The information from each sense is then combined to form a ‘multisensory representation’ that we are conscious of.

How these multisensory representations are bound into a single, coherent conscious experience is known as the ‘binding problem’. Many cognitive scientists believe that specific sets of neurons exist that ‘bind’ all these different sensations into one. However, the identity and even the existence of these sets of neurons remain a topic of great speculation. For example, Edelman and Tononi claim to have identified a ‘dynamic core’ – neurons connecting the thalamus (the brain’s relay station for perceptual information) and the primary sensory cortices (the main cerebral regions that receive input from the thalamus) – as the main ‘binding’ substrate. Lamme, on the other hand, suggests that binding occurs when different regions of the cortex ‘fire’ at the same time (i.e. they become ‘synchronously active’). Whatever the exact process, it is widely accepted that cross-talk between different parts of the brain is vital for binding to occur and consciousness to emerge. Or, as Lamme paraphrases Descartes: “I bind, therefore I am.”

By Stephanie Baines

McMahon, M. (2015). What is a spinal reflex? Wisegeek. http://www.wisegeek.org/what-is-a-spinal-reflex.htm (accessed 15 February 2015).

Sergent, C. et al. (2005). Timing of the brain events underlying access to consciousness during the attentional blink. Nature Neuroscience, 8: 1391–1400.

Edelman, G. and Tononi, G. (2000). A Universe of Consciousness: How Matter Becomes Imagination. New York: Basic Books.

Photo Credit: GreenFlames09 via Compfight cc

Article by Stephanie Baines

February 27, 2015


Why can I remember historical facts but not where I left my keys?

Why is it that I can remember dates in history – battles, reigns, birthdays, etc. – but I can’t remember where I put my keys or phone (or other things that I did recently)?

A familiar scenario in films and novels has a character who stumbles into the action with no knowledge of who they are (Memento, anyone?). Suffering from amnesia, the person has lost their memory of personal events and information from their past. Yet sit this person down at the dining table and they still know how to feed themselves, which hand to use for knife and fork, what the food items are, and so forth. Strange, isn’t it?

But it wouldn’t just be eating. Had they learned these facts before the amnesia struck, they would still be able to tell you that the Nile is in Africa, or that Abraham Lincoln was the sixteenth American president. This selective pattern of remembering comes down to the multifaceted nature of memory.

Your memory is made of many parts

Our memory is not a single, unitary thing, but rather consists of different divisions. Just as you might be better at one sport than another, or a good cook but a disastrous baker, you might be stronger in one form of memory than another. Your memory for dates, birthdays, and other stark facts and figures is what is known as your semantic memory. Items in semantic memory are divorced from context, so whilst you might be able to remember the names of all six of Henry VIII’s wives, you are unlikely to remember the context in which you learned them – where you were or how you felt at the time. Many people find rote learning and repetition an effective way to store and later retrieve this kind of information.

Semantic memory is different to our memory for the events that are personal to our own lives – the form of memory you might draw upon when you remember what you did last weekend or on a recent holiday. This is known as episodic memory. Rather than isolated facts, episodic memories are associated with a specific context or situation. So when you call up an item in episodic memory, it comes complete with a time and place, as well as associated emotions. You might relive the joy you felt upon receiving an award or the distress or fear of a negative event. Remembering where you left your keys requires conjuring up this richer form of memory.

Importantly, the more you think about an event stored in episodic memory – a process known as rehearsal – the more strongly that item becomes represented in your long-term episodic storage, i.e. the more you relive an event, the longer it will stay with you. There is much debate amongst researchers over whether this is due to creating more pathways in the brain to retrieve that item, more mental representations of that item, or some other factor(s). Whatever the reason may be, memories of events further in the past have usually been rehearsed more often. Think about your family’s anecdotes – there are always some stories everyone knows off by heart, simply because they are retold at every family gathering!

Time and tide wash away your memories

This all means that recent events are likely to be more weakly stored in your memory than those in the more distant past. But time also plays another role in what information you remember and what you forget. With the passage of time, information can be lost or altered – a process called ‘decay’. This decay is most prominent in the first few days after the acquisition of information; therefore details such as where you have put your keys are most easily lost immediately after the event. After this initial loss, there is a much smaller loss of information – a process termed transience. So once information has survived the decay of that initial ‘danger period’ and made it into your long-term memory store, it is much more likely to avoid being forgotten.
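If you like to see the shape of that pattern, here is a purely illustrative sketch in Python. The curve and its numbers are invented for this example rather than taken from any study, but they capture the idea above: steep loss in the first few days, then a long, slowly fading tail for whatever survives into long-term memory.

```python
def retention(days, rate=0.3):
    """Toy forgetting curve: fraction of a memory still retrievable
    `days` after the event. The power-law form and exponent are
    invented purely for illustration, not fitted to real data."""
    return (1 + days) ** -rate

# Forgetting is steepest immediately after the event ('decay') and
# then flattens out ('transience').
for d in [0, 1, 3, 7, 30, 365]:
    print(f"day {d:3d}: {retention(d):4.0%} retained")
```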

To explain a little further, the information you receive is processed and introduced, or ‘encoded’, into your memory store. Information and experiences are initially processed in short-term memory – a temporary store that is highly susceptible to disruption. From short-term memory, some information that is deemed important or relevant enough is transferred into long-term memory, which is far more stable. So while your long-term memories aren’t infallible (they can be altered by new information – but that’s a different story!), once information is stored in your long-term memory, you are far more likely to remember it in future.

Finally, and importantly, emotion can strengthen the representation and retrieval of episodic memories. Where you put your keys is unlikely to be associated with much, if any, emotion (even though trying to find them later on might be associated with mounting anxiety!). The recent and unemotional nature of putting your keys on the kitchen worktop conspires to make “where did I put my keys?” an easily forgettable thing, whereas all that repetition during your school days may just ensure that you never forget which year the Magna Carta was signed.

Now, when was that again…?

Answer by Stephanie Baines

Image credit: Kit, Dave King on flickr

Article by Stephanie Baines

July 31, 2014


How far away are we from a brain-computer interface?

Brain Computer Interface / g-tech medical engineering by Ars Electronica, on Flickr

Using your brain to control computers or objects may evoke images of Professor Xavier of the X-Men, Scanners, or creepy children causing others to burst into flames – like in Firestarter. Long thought to dwell only in the realm of science fiction, brain-computer interfaces (BCIs) are not as far away as you might think. In fact, researchers have already begun using our brain activity to do some remarkable things.

By placing electrodes on the surface of the scalp, or sometimes in the brain itself, researchers can detect the electrical impulses produced by firing neurons (brain cells). This technique, known as electroencephalography (EEG), can be harnessed for use in a BCI. Monkeys and humans can learn to control a cursor on a screen through ‘biofeedback’ – they receive feedback from the computer, so they can learn how it feels to induce cursor movement in a certain way. It’s a bit like learning to ride a bike: with considerable practice, individuals can come to control the cursor’s movement.
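To make the idea of biofeedback a little more concrete, here is a minimal sketch of such a training loop in Python. The function names are hypothetical placeholders rather than any real BCI library, and real systems filter and classify the signal far more carefully – but the closed loop of ‘brain activity in, feedback out’ is the essential ingredient.

```python
# Minimal sketch of a biofeedback training loop. `read_eeg_power` and
# `move_cursor` are hypothetical placeholders, not a real BCI library.
def biofeedback_session(read_eeg_power, move_cursor, baseline, n_trials=200):
    for _ in range(n_trials):
        power = read_eeg_power()       # strength of a chosen EEG rhythm right now
        move_cursor(power - baseline)  # above baseline -> cursor moves one way,
                                       # below -> the other; the user sees this
                                       # and gradually learns what 'works'
```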

More recently, neural signals from the motor area of the brain, which is responsible for our deliberate movements, have been used to control robotic arms. Tetraplegic patients unable to move their limbs were trained to imagine the movement of their own arm. A computer translated their patterns of brain activity across individual brain cells into movement commands – much like one might decode and translate text in a foreign language. These motor commands were then used to induce the imagined movements in a robotic arm, allowing one patient to grasp a thermos of coffee, move it to her mouth and take a drink. You can read about this study here. This is incredibly exciting, as it means that in future computers could help us recover the function of lost brain areas.
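As a rough illustration of what ‘translating patterns of brain activity into movement commands’ can mean, here is a sketch of the simplest possible decoder – a linear mapping from the firing rates of recorded neurons to a velocity for the robotic arm. The numbers and weights below are random placeholders; in a real system (and the study described used considerably more sophisticated methods) the weights are learned from data recorded while the patient imagines moving.

```python
import numpy as np

# Sketch of a linear decoder: the firing rates of N recorded neurons
# are mapped to a 3-D velocity command for a robotic arm. The weights
# here are random placeholders; a real decoder is fitted to recordings
# made while the patient imagines movements.
rng = np.random.default_rng(0)
n_neurons = 96                       # e.g. one implanted electrode array
W = rng.normal(size=(3, n_neurons))  # decoder weights (would be learned)

firing_rates = rng.poisson(10, size=n_neurons)  # spike counts in one time bin
velocity = W @ firing_rates                     # (vx, vy, vz) command for the arm
print(velocity)
```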

BCIs are also becoming commercially available. Companies such as Samsung are developing BCIs that allow users to control their smartphone or tablet through brain activity alone. Companies have also developed games that are controlled by BCI. These devices typically use signals that require gamers to concentrate in a certain way, or to imagine certain images or events. If you focus in the right way then you can chase zombies, dodge bullets or play Tetris! A helicopter that can be flown by BCI has also been developed; users steer it by imagining movements of their hands. This imagery produces distinct patterns of activity in the motor area of the brain, which a computer can pick up and translate into the helicopter’s flight direction.

Brain-brain interfaces? Controlling each other’s brains!

The realm of BCI has recently been expanded – brains could soon control each other through computers! Essentially, one person moves a cursor on a screen through mind control; the computer turns this back into a brain signal and transfers it to another computer via the internet. This second computer is connected to a brain stimulation device targeting the motor area of a second individual, causing them to twitch a finger and so press a button on a computer keyboard. Thus an intended action in one person is turned into a performed action in another – via the internet! You can read about this study here. (Professor Xavier would be proud! – Ed)
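Laid out as a pipeline, the chain of events looks roughly like the sketch below. Every function here is a hypothetical placeholder – this is only the logical sequence described above, not the actual software used in the study.

```python
# Hypothetical sketch of a brain-to-brain pipeline; none of these
# functions belong to a real library.
def sender_side(read_eeg, classify, send_over_internet):
    signal = read_eeg()              # sender imagines pressing the key
    if classify(signal) == "press":  # computer recognises the intention
        send_over_internet("press")  # ...and ships it across the internet

def receiver_side(receive_from_internet, stimulate_motor_cortex):
    if receive_from_internet() == "press":
        stimulate_motor_cortex()     # stimulation over the motor area makes
                                     # the receiver's finger twitch and press
                                     # the key
```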

We must keep in mind that communication with, and control of, physical devices by thought alone is still far from trouble-free. EEG signals are ‘noisy’ and difficult to detect without bulky caps, sticky gel on the scalp, or electrodes implanted in the brain itself. It also takes a considerable amount of training for individuals to learn how to control cursors and electronic devices with their mind – and some find the task impossible. Still, given the exciting potential of BCI, there are probably many more innovations to come.

Question from Mad Moules via Facebook

Answer by Stephanie Baines

Article by Stephanie Baines

October 11, 2013
