Development of a high-throughput toolbox for the analysis and visualization of mesoscale longitudinal cortical imaging datasets - Year two

In recent years, a dramatic increase in computer processing power and storage has enabled the exploration of new scientific frontiers through the analysis of large, complex datasets. Neuroscience is evolving toward larger and more complex studies that generate vast amounts of data, which are difficult to interpret with classical analyses. The development of tools able to process such voluminous datasets is therefore necessary to advance our understanding of brain function.

Development of an oculomotor task integrated with neurophotonics approaches in mice

This project aims to develop an apparatus for training laboratory mice to make eye movements in different directions. The apparatus will be combined with functional imaging and used in neuroscience research to explore pathologies such as stroke and to develop neuroprostheses. It will rely on innovative, artificial-intelligence-based eye-tracking approaches, as well as immersive projection onto a hemispherical screen.

Development of a high-throughput toolbox for the analysis and visualization of mesoscale longitudinal cortical imaging datasets

The analysis of large, complex datasets helps scientists across fields (e.g., physics, chemistry, biology) gain new insights from their research, and neuroscience is no exception. Larger and more complex studies that generate huge amounts of data are increasingly common. The development of tools capable of processing such large datasets is therefore necessary to further advance our understanding of brain function. Despite some limited efforts to address this issue, software resources capable of handling these datasets remain lacking.