In these short videos, researchers use a glass brain and pulses of light to show how information flows between different regions of the brain.
The Glass Brain is kind of like it sounds: a colorful, three-dimensional window into the myriad mysterious activities that light up the brain. Aided by virtual reality technology, users can journey through a person’s brain in real time.
Researchers have many fascinating reasons to take such a journey, but neuroscientists are especially focused on using the Glass Brain technology to study diseases, including Alzheimer’s, autism and multiple sclerosis.
The Glass Brain technology was developed by UCSF researcher Adam Gazzaley and colleagues at UC San Diego.
Jyoti Mishra, an assistant professor of neurology and psychiatry in the Gazzaley lab, uses this technology to develop therapeutics aimed at improving cognitive function and reducing symptoms of attention deficit disorder in children.
This is an anatomically realistic 3D brain visualization depicting real-time source-localized activity (power and “effective” connectivity) from EEG (electroencephalographic) signals. Each color represents source power and connectivity in a different frequency band (theta, alpha, beta, gamma), and the golden lines are white matter anatomical fiber tracts. Estimated information transfer between brain regions is visualized as pulses of light flowing along the fiber tracts connecting the regions.
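The per-band coloring rests on a standard signal-processing step: estimating how much power an EEG signal carries in each canonical frequency band. A minimal sketch of that idea, using Welch power spectral density on a synthetic signal (this is an illustration of band-power estimation in general, not the Glass Brain's actual code; the band boundaries below are common conventions and vary between labs):

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG frequency bands in Hz (boundaries vary between labs).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 50)}

def band_powers(signal, fs):
    """Estimate power in each band by integrating the Welch PSD over it."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    df = freqs[1] - freqs[0]
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = psd[mask].sum() * df  # approximate integral of PSD
    return powers

# Synthetic "EEG": a 10 Hz (alpha-range) oscillation plus noise, 4 s at 256 Hz.
np.random.seed(0)
fs = 256
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)

p = band_powers(eeg, fs)
print(max(p, key=p.get))  # the alpha band dominates, as expected
```

A real pipeline would run this per source and per time window so the colors can update in real time.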
The modeling pipeline combines several techniques. MRI (Magnetic Resonance Imaging) brain scanning generates a high-resolution 3D model of an individual’s brain, skull, and scalp tissue. DTI (Diffusion Tensor Imaging) reconstructs the white matter tracts. BCILAB (http://sccn.ucsd.edu/wiki/BCILAB) and SIFT (http://sccn.ucsd.edu/wiki/SIFT) remove artifacts and statistically reconstruct the locations and dynamics (amplitude and multivariate Granger-causal interactions) of multiple sources of activity inside the brain from signals measured at electrodes on the scalp (in this demo, a 64-channel “wet” mobile system by Cognionics/BrainVision).
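The "Granger-causal interactions" SIFT estimates answer a simple question: does the past of one signal improve prediction of another beyond that signal's own past? A toy pairwise sketch of that idea (SIFT itself uses multivariate models over all sources at once; the AR order and simulated coupling below are illustrative choices, not values from the demo):

```python
import numpy as np

def lagged(v, order):
    """Design matrix of past values: row t holds v[t-1], ..., v[t-order]."""
    n = len(v)
    return np.column_stack([v[order - k : n - k] for k in range(1, order + 1)])

def resid_var(X, y):
    """Residual variance of a least-squares linear fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def granger(x, y, order=5):
    """Log ratio of prediction-error variances for x: own past only
    vs. own past plus y's past. Clearly positive => y Granger-causes x."""
    target = x[order:]
    restricted = resid_var(lagged(x, order), target)
    full = resid_var(np.hstack([lagged(x, order), lagged(y, order)]), target)
    return np.log(restricted / full)

# Simulate a known direction of influence: y drives x with a 2-step delay.
rng = np.random.default_rng(0)
n = 2000
y = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 2] + 0.1 * rng.standard_normal()

print(granger(x, y) > granger(y, x))  # True: influence recovered as y -> x
```

In the visualization, estimates like these (computed across all reconstructed sources) determine which fiber tracts carry the flowing pulses of light and in which direction.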
The final visualization is built in Unity and allows the user to fly around and through the brain with a gamepad while watching live brain activity streamed in real time from someone wearing an EEG cap.
More Information: University of California San Francisco