Synthetic brain imaging: grasping, mirror neurons and imitation




    The article contributes to the quest to relate global data on brain and behavior (e.g. from PET, Positron Emission Tomography, and fMRI, functional Magnetic Resonance Imaging) to the underlying neural networks. Models tied to human brain imaging data often focus on a few “boxes” based on brain regions associated with exceptionally high blood flow, rather than analyzing the cooperative computation of multiple brain regions. For analysis directly at the level of such data, a schema-based model may be most appropriate. To further address neurophysiological data, the Synthetic PET imaging method uses computational models of biological neural circuitry based on animal data to predict and analyze the results of human PET studies. This technique rests on the hypothesis that rCBF (regional cerebral blood flow) is correlated with the integrated synaptic activity in a localized brain region. We also describe a possible extension of the Synthetic PET method to fMRI. The second half of the paper then exemplifies this general research program with two case studies: one on visuo-motor processing for the control of grasping (Section 3, in which the focus is on Synthetic PET), the other on the imitation of motor skills (Sections 4 and 5, with a focus on Synthetic fMRI). Our discussion of imitation pays particular attention to data on the mirror system in the monkey (neural circuitry which allows the brain to recognize actions as well as execute them). Finally, Section 6 outlines the immense challenges in integrating models of different portions of the nervous system which address detailed neurophysiological data from studies of primates and other species; summarizes key issues for developing the methodology of Synthetic Brain Imaging; and shows how comparative neuroscience and evolutionary arguments will allow us to extend Synthetic Brain Imaging even to language and other cognitive functions for which few or no animal data are available. © 2000 Published by Elsevier Science Ltd.
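To make the core hypothesis concrete, the following is a minimal sketch (not the authors' actual simulation code) of how a Synthetic PET value might be computed from a simulated neural network: the predicted rCBF proxy for a region is taken as the integrated absolute synaptic activity across that region's cells, and, as in real PET studies, what is reported is the relative change between two task conditions. The region sizes, activity distributions, and scaling factor are all hypothetical placeholders.

```python
import numpy as np

def synthetic_pet(synaptic_activity, dt=0.001):
    """rCBF proxy for one region under the Synthetic PET hypothesis:
    absolute synaptic activity integrated over time, summed across cells.
    `synaptic_activity` is a (cells x time-steps) array; `dt` is the
    simulation time step in seconds."""
    return float(np.sum(np.abs(synaptic_activity)) * dt)

rng = np.random.default_rng(0)
# Hypothetical simulated synaptic drive for one region under two task
# conditions (e.g. grasp execution vs. observation); the 1.2 scaling of
# condition B is an arbitrary illustrative choice.
activity_a = rng.gamma(2.0, 1.0, size=(100, 500))
activity_b = 1.2 * rng.gamma(2.0, 1.0, size=(100, 500))

pet_a = synthetic_pet(activity_a)
pet_b = synthetic_pet(activity_b)
# PET comparisons are expressed as relative change between conditions:
relative_change = (pet_b - pet_a) / pet_a
print(f"relative activation change: {relative_change:.2f}")
```

Note that the integration of the absolute value means that both excitatory and inhibitory synaptic activity contribute positively to the predicted blood-flow signal, a point central to interpreting regional activations in terms of underlying circuitry.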