Award details

High-resolution multimodal imaging for multisensory interactions

Reference BB/E012604/1
Principal Investigator / Supervisor Professor Zoe Kourtzi
Co-Investigators / Co-Supervisors Professor Andrew Bagshaw, Dr Stuart Derbyshire, Professor Glyn Humphreys, Professor Rowland Miall, Professor Alan Wing
Institution University of Birmingham
Department School of Psychology
Funding type Research
Value (£) 65,691
Status Completed
TypeResearch Grant
Start date 01/04/2007
End date 31/03/2008
Duration 12 months

Abstract

Understanding the link between brain structure, function and behaviour is a key question in cognitive neuroscience. Recent advances in functional magnetic resonance imaging (fMRI) provide powerful tools for studying this question non-invasively in the human brain. However, the nature of the BOLD signal measured by fMRI imposes limitations on the spatial and temporal resolution of this technique. To overcome these limitations, we propose multimodal imaging methods (high spatial resolution fMRI combined with high temporal resolution EEG) that allow us to characterise the neural dynamics and function of fine neural ensembles within cortical areas. In particular, we aim to study the neural circuits that mediate multisensory (vision, audition, touch) and motor interactions in the normal, ageing and impaired human brain. First, we will develop high resolution fMRI methods using a flexible surface coil that will allow us to zoom into the neural processing of cortical regions of interest and characterise the distribution and function of different neural populations within these regions. Second, we will develop protocols for combined EEG-fMRI recordings in the scanner, so that electrical and haemodynamic responses can be acquired simultaneously during the same session. Third, we will exploit and advance analysis methods for combined EEG-fMRI measurements based on statistical learning approaches and single trial analyses. Fourth, we will combine these high resolution imaging methods (EEG-fMRI) with multimodal stimulation (visual, auditory, tactile) to examine the link between multisensory perception and action in the human brain in real time. These combined high resolution EEG-fMRI methods provide a unique tool for studying neurovascular coupling non-invasively in the human brain and for monitoring neural changes due to learning-based plasticity or cortical reorganisation in the intact or neuropsychologically impaired brain across the life span.
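The single-trial EEG-fMRI analysis mentioned above is commonly realised as an "EEG-informed" general linear model: a per-trial EEG amplitude is placed at each trial onset, convolved with a haemodynamic response function, and used as a parametric regressor for the BOLD time series. The sketch below illustrates this general idea only, under assumed names and simplified, illustrative HRF parameters; it is not the project's actual analysis pipeline.

```python
import numpy as np
from math import gamma as gamma_fn


def gamma_pdf(t, shape):
    # Gamma density with unit scale, used to build a double-gamma HRF.
    return t ** (shape - 1) * np.exp(-t) / gamma_fn(shape)


def canonical_hrf(tr, duration=30.0):
    # Double-gamma haemodynamic response sampled at the scan interval TR.
    # Shape parameters (6, 16) are illustrative textbook values.
    t = np.arange(0.0, duration, tr)
    h = gamma_pdf(t, 6.0) - gamma_pdf(t, 16.0) / 6.0
    return h / h.sum()


def eeg_informed_regressor(onsets_s, trial_amplitudes, n_scans, tr):
    # Place each trial's single-trial EEG amplitude (e.g. an ERP component
    # measured on that trial) at its onset, then convolve with the HRF to
    # obtain a BOLD-scale parametric regressor.
    stick = np.zeros(n_scans)
    for onset, amp in zip(onsets_s, trial_amplitudes):
        stick[int(round(onset / tr))] += amp
    return np.convolve(stick, canonical_hrf(tr))[:n_scans]


def fit_glm(bold, regressor):
    # Design matrix: EEG-informed regressor plus an intercept; ordinary
    # least squares gives the coupling between trial-wise EEG amplitude
    # and the BOLD response in a given voxel.
    X = np.column_stack([regressor, np.ones_like(regressor)])
    beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return beta
```

In practice such regressors are entered alongside a standard (unmodulated) stimulus regressor, so that the EEG-informed term captures only trial-to-trial variability beyond the average evoked response.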

Summary

Recent developments in functional magnetic resonance imaging (fMRI) have transformed our understanding of the human brain, its sensory processing and cognitive functions by providing a means of localising the cortical regions involved in a wide variety of tasks. Despite this progress, the link between brain structure, function and behaviour that guides human actions and interactions remains a key unresolved issue in cognitive neuroscience. To address this issue we propose multimodal imaging methods (high resolution fMRI combined with EEG) that have complementary high spatial and temporal resolution and will allow us to study the human brain in action in real time. In particular, the proposed research has the following main objectives. First, we will take advantage of the capabilities of the state-of-the-art MR scanner in our imaging centre (3T Philips Achieva) and develop high resolution fMRI protocols using a surface coil that will allow us to study the human brain at a higher spatial resolution, that is, at the finer scale of neural ensembles within cortical areas. Second, we will combine these high spatial resolution fMRI measurements with EEG recordings in the scanner, whose high temporal resolution is critical for studying the fast cognitive processes that guide our actions and interactions. Third, we will use these combined high resolution imaging methods (EEG-fMRI) to study the human brain in action in real time. Our aim is to understand how the human brain integrates multiple sources of information from different sensory modalities (vision, audition, touch) and translates them into fast and successful actions. The existing MR compatible equipment allows visual and tactile stimulation and recordings of eye and limb movements.
We will integrate these devices with the proposed equipment for auditory and heat stimulation to investigate the neural circuits that mediate the integration of sensory, proprioceptive and motor signals in the human brain, which guide our categorical perception, attention, learning and actions in the complex environments we inhabit. Finally, we will extend the application of these methods and protocols to the study of multisensory and motor integration across the life span and in the adverse event of brain damage. We aim to examine the mechanisms that lead to rapid cognitive decline in some older adults while others maintain high levels of cognitive performance, using sensitive tools for the measurement and analysis of age-related changes in behaviour, brain structure and neural function. Further, we will conduct high resolution imaging studies in patients with neuropsychological deficits, which offer the potential of determining which areas are necessary for behaviour. We will examine the effects of brain lesions on activation in structurally intact regions of cortex, in order to gain new information about the functional connectivity between brain areas and the potential cortical reorganisation that may support recovery of function in the impaired brain. As such, this work has the potential to provide novel methods and findings that advance our understanding of the link between brain and behaviour, and thus to contribute to general health and wealth as well as to the development of interdisciplinary and internationally competitive research in the UK.
Committee Closed Committee - Animal Sciences (AS)
Research Topics X – not assigned to a current Research Topic
Research Priority X – Research Priority information not available
Research Initiative Research Equipment Initiative 2006 (RE6) [2006]
Funding Scheme X – not Funded via a specific Funding Scheme