NWB conversion and testing functions #17

Merged · 17 commits · Jan 28, 2024
6 changes: 5 additions & 1 deletion docs/source/index.rst
@@ -23,17 +23,21 @@ Loading data is simple with the ``snub.io`` module. For example the following co
snub.io.add_spikeplot(project_directory, 'my_ephys_data', spike_times, spike_labels)


We also support automatic conversion of NWB files to SNUB projects for a limited set of NWB neurodata types.


SNUB Documentation
------------------


.. toctree::
:maxdepth: 2

install

tutorials

nwb

snub.io<api>


131 changes: 131 additions & 0 deletions docs/source/nwb.rst
@@ -0,0 +1,131 @@
Neurodata Without Borders
=========================

We provide a rudimentary tool for automatically generating a SNUB project from NWB files, which contain raw and processed data from neuroscience recordings. The data are stored hierarchically, and each component of the hierarchy has a specific neurodata type that reflects the measurement modality (e.g. ``Units`` for spike trains, ``ImageSeries`` for video, etc.). Our conversion tool generates a SNUB subplot for each supported neurodata type. Users can optionally restrict this process to a subset of the NWB hierarchy (e.g. include pose tracking while excluding electrophysiology, or include just a subset of electrophysiology measurements).


Neurodata types
---------------

The following neurodata types are supported:

- ``IntervalSeries``
Contains start and end times for (possibly labeled) intervals. A SNUB trace plot is generated containing one trace per interval type.

- ``RoiResponseSeries``
Contains fluorescence traces for regions of interest (ROIs). A SNUB heatmap is generated containing one row per ROI. Metadata associated with each ROI is not linked in the SNUB plot.

- ``TimeSeries``
Contains time series in one or more dimensions. A SNUB heatmap is generated for 15 or more dimensions, and a SNUB trace plot is generated for fewer than 15 dimensions.

- ``PoseEstimation``
Contains pose tracking data (available via the ``ndx-pose`` extension). A SNUB trace plot is generated for each tracked body part and spatial dimension. For 3D data, a 3D pose plot is also generated.

- ``ImageSeries``
Contains video data. We assume that the video is stored as a separate file and that the ``ImageSeries`` object contains frame timestamps and a relative path to that file. A SNUB video plot is then generated.

- ``LabelSeries``
Contains discrete label time series in the form of a binary matrix with one column per label and one row per time bin (available via the ``ndx-labels`` extension). A SNUB heatmap is generated directly from this matrix.

- ``TimeIntervals``
Contains annotated intervals. Each interval has a start time, a stop time, and an arbitrary number of additional metadata fields. A SNUB trace plot is generated with one trace showing the start and stop times of each interval. All other metadata is ignored since it cannot be canonically represented using the currently available SNUB plot types.

- ``Position``
Contains position data in the form of one or more ``SpatialSeries`` objects. A SNUB trace plot is generated with a trace for each spatial dimension of each constituent spatial series.

- ``SpatialSeries``
Contains spatial data in the form of a time series with one or more dimensions. A standalone SNUB trace plot is generated for the spatial series if it is not contained within a ``Position`` object.

- ``Units``
Contains spike trains for one or more units. A corresponding SNUB spike plot is generated.

- ``Events``
Contains a sequence of unlabeled event times (available via the ``ndx-events`` extension). A SNUB trace plot is generated with a single trace that spikes at each event time.

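The supported types above determine which parts of an NWB file SNUB can convert. As a rough sketch of how one might screen a file before choosing the ``branches`` argument (``convertible_branches``, the example paths, and the path-to-type mapping are hypothetical, not part of ``snub.io``; with ``pynwb``, such a mapping could be built by traversing the file's contents):

```python
# Neurodata types SNUB can convert, mirroring the list above.
SUPPORTED = {
    "IntervalSeries", "RoiResponseSeries", "TimeSeries", "PoseEstimation",
    "ImageSeries", "LabelSeries", "TimeIntervals", "Position",
    "SpatialSeries", "Units", "Events",
}

def convertible_branches(objects_by_path):
    """Given {path: neurodata type name}, return the convertible paths."""
    return sorted(p for p, t in objects_by_path.items() if t in SUPPORTED)

# A toy hierarchy standing in for a real NWB file's contents.
example = {
    "processing/behavior/torso_dlc": "PoseEstimation",  # supported
    "acquisition/raw_video": "ImageSeries",             # supported
    "acquisition/lfp": "ElectricalSeries",              # unsupported, excluded
}
print(convertible_branches(example))
```

The returned paths could then be passed (in whole or in part) via the ``branches`` argument of ``snub.io.create_project_from_nwb``, as in the examples below.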

Examples
--------

For each example, run the first code block in a terminal and the second in a Python console or notebook.


A change in behavioral state switches the pattern of motor output that underlies rhythmic head and orofacial movements
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

*Liao, Song-Mao; Kleinfeld, David; Rinehart, Duane; University of California San Diego (2023) Dataset for: A change in behavioral state switches the pattern of motor output that underlies rhythmic head and orofacial movements (Version 0.230515.0530) [Data set]. DANDI archive. https://doi.org/10.48324/dandi.000540/0.230515.0530*

Includes the following SNUB-compatible neurodata types: ``TimeSeries``, ``ImageSeries``

.. code-block:: bash

# Download NWB file
dandi download https://api.dandiarchive.org/api/dandisets/000540/versions/0.230515.0530/assets/94307bee-459c-424e-b3a0-1e86b23f04b2/download/

# Download associated video and create directory for it
dandi download https://api.dandiarchive.org/api/dandisets/000540/versions/0.230515.0530/assets/942b0806-2c8b-4289-a072-9e965884fcb6/download/
mkdir sub-SLR087_ses-20180706_obj-14ua2bs_behavior+image
mv 9557b48e-46f0-45f2-a700-a2e15318c5bc_external_file_0.avi sub-SLR087_ses-20180706_obj-14ua2bs_behavior+image/


.. code-block:: python

import os, snub

# Define paths
nwb_file = "sub-SLR087_ses-20180706_obj-14ua2bs_behavior+image.nwb"
name = os.path.splitext(os.path.basename(nwb_file))[0]
project_directory = os.path.join(os.path.dirname(nwb_file), f"SNUB-{name}")

# Make SNUB plot that includes video and torso tracking
snub.io.create_project_from_nwb(project_directory, nwb_file, branches=['torso_dlc', 'ImageSeries'])


A Unified Framework for Dopamine Signals across Timescales
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

*Kim, HyungGoo; Malik, Athar; Mikhael, John; Bech, Pol; Tsutsui-Kimura, Iku; Sun, Fangmiao; Zhang, Yajun; Li, Yulong; Watabe-Uchida, Mitsuko; Gershman, Samuel; Uchida, Naoshige (2023) A Unified Framework for Dopamine Signals across Timescales (Version draft) [Data set]. DANDI archive. https://dandiarchive.org/dandiset/000251/draft*

Includes the following SNUB-compatible neurodata types: ``TimeSeries``, ``TimeIntervals``, ``SpatialSeries``, ``Events``

.. code-block:: bash

# Download NWB file
dandi download https://api.dandiarchive.org/api/dandisets/000251/versions/draft/assets/b28fcb84-2e23-472c-913c-383151bc58ef/download/


.. code-block:: python

import os, snub

# Define paths
nwb_file = "sub-108_ses-Ca-VS-VR-2.nwb"
name = os.path.splitext(os.path.basename(nwb_file))[0]
project_directory = os.path.join(os.path.dirname(nwb_file), f"SNUB-{name}")

# Make SNUB plot
snub.io.create_project_from_nwb(project_directory, nwb_file)


Neural population dynamics during reaching
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

*Churchland, Mark; Cunningham, John P.; Kaufman, Matthew T.; Foster, Justin D.; Nuyujukian, Paul; Ryu, Stephen I.; Shenoy, Krishna V. (2022) Neural population dynamics during reaching (Version draft) [Data set]. DANDI archive. https://dandiarchive.org/dandiset/000070/draft*

Includes the following SNUB-compatible neurodata types: ``Units``, ``TimeIntervals``, ``Position``

.. code-block:: bash

# Download NWB file
dandi download https://api.dandiarchive.org/api/dandisets/000070/versions/draft/assets/7b95fe3a-c859-4406-b80d-e50bad775d01/download/

.. code-block:: python

import os, snub

# Define paths
nwb_file = "sub-Jenkins_ses-20090912_behavior+ecephys.nwb"
name = os.path.splitext(os.path.basename(nwb_file))[0]
project_directory = os.path.join(os.path.dirname(nwb_file), f"SNUB-{name}")

# Make SNUB plot
snub.io.create_project_from_nwb(project_directory, nwb_file)
173 changes: 102 additions & 71 deletions paper.bib
@@ -1,93 +1,124 @@
@software{vispy,
author = {Luke Campagnola and
Eric Larson and
Almar Klein and
David Hoese and
Siddharth and
Cyrille Rossant and
Adam Griffiths and
Nicolas P. Rougier and
asnt and
Kai Mühlbauer and
Alexander Taylor and
MSS and
Talley Lambert and
sylm21 and
Alex J. Champandard and
Max Hunter and
Thomas Robitaille and
Mustafa Furkan Kaptan and
Elliott Sales de Andrade and
Karl Czajkowski and
Lorenzo Gaifas and
Alessandro Bacchini and
Guillaume Favelier and
Etienne Combrisson and
ThenTech and
fschill and
Mark Harfouche and
Michael Aye and
Casper van Elteren and
Cedric GESTES},
title = {vispy/vispy: Version 0.11.0},
month = jul,
year = 2022,
publisher = {Zenodo},
version = {v0.11.0},
doi = {10.5281/zenodo.6795163},
url = {https://doi.org/10.5281/zenodo.6795163}
}



@article{bento,
abstract = {The study of naturalistic social behavior requires quantification of animals' interactions. This is generally done through manual annotation---a highly time-consuming and tedious process. Recent advances in computer vision enable tracking the pose (posture) of freely behaving animals. However, automatically and accurately classifying complex social behaviors remains technically challenging. We introduce the Mouse Action Recognition System (MARS), an automated pipeline for pose estimation and behavior quantification in pairs of freely interacting mice. We compare MARS's annotations to human annotations and find that MARS's pose estimation and behavior classification achieve human-level performance. We also release the pose and annotation datasets used to train MARS to serve as community benchmarks and resources. Finally, we introduce the Behavior Ensemble and Neural Trajectory Observatory (BENTO), a graphical user interface for analysis of multimodal neuroscience datasets. Together, MARS and BENTO provide an end-to-end pipeline for behavior data extraction and analysis in a package that is user-friendly and easily modifiable.},
article_type = {journal},
author = {Segalin, Cristina and Williams, Jalani and Karigo, Tomomi and Hui, May and Zelikowsky, Moriel and Sun, Jennifer J and Perona, Pietro and Anderson, David J and Kennedy, Ann},
citation = {eLife 2021;10:e63720},
date-modified = {2022-10-05 13:18:49 -0400},
doi = {10.7554/eLife.63720},
editor = {Berman, Gordon J and Wassum, Kate M and Gal, Asaf},
issn = {2050-084X},
journal = {eLife},
keywords = {social behavior, pose estimation, machine learning, computer vision, microendoscopic imaging, software},
month = {nov},
pages = {e63720},
pub_date = {2021-11-30},
publisher = {eLife Sciences Publications, Ltd},
title = {The Mouse Action Recognition System (MARS) software pipeline for automated analysis of social behaviors in mice},
url = {https://doi.org/10.7554/eLife.63720},
volume = 10,
year = 2021,
bdsk-url-1 = {https://doi.org/10.7554/eLife.63720}
}


@misc{rastermap,
author = {C. Stringer and M. Pachitariu},
title = {rastermap},
year = {2020},
publisher = {GitHub},
journal = {GitHub repository},
url = {https://github.com/MouseLand/rastermap}
}

@misc{vidio,
author = {J. Bohnslav},
title = {VidIO: simple, performant video reading and writing in python},
year = {2020},
publisher = {GitHub},
journal = {GitHub repository},
url = {https://github.com/jbohnslav/vidio}
}

@misc{umap,
doi = {10.48550/ARXIV.1802.03426},
url = {https://arxiv.org/abs/1802.03426},
author = {McInnes, Leland and Healy, John and Melville, James},
keywords = {Machine Learning (stat.ML), Computational Geometry (cs.CG), Machine Learning (cs.LG), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction},
publisher = {arXiv},
year = {2018},
copyright = {arXiv.org perpetual, non-exclusive license}
}

@misc{petrucco_2020_3925903,
author = {Petrucco, Luigi},
title = {Mouse head schema},
month = jul,
year = 2020,
publisher = {Zenodo},
doi = {10.5281/zenodo.3925903},
url = {https://doi.org/10.5281/zenodo.3925903}
}


@article{NWB,
article_type = {journal},
title = {The Neurodata Without Borders ecosystem for neurophysiological data science},
author = {Rübel, Oliver and Tritt, Andrew and Ly, Ryan and Dichter, Benjamin K and Ghosh, Satrajit and Niu, Lawrence and Baker, Pamela and Soltesz, Ivan and Ng, Lydia and Svoboda, Karel and Frank, Loren and Bouchard, Kristofer E},
editor = {Colgin, Laura L and Jadhav, Shantanu P},
volume = 11,
year = 2022,
month = {oct},
pub_date = {2022-10-04},
pages = {e78362},
citation = {eLife 2022;11:e78362},
doi = {10.7554/eLife.78362},
url = {https://doi.org/10.7554/eLife.78362},
abstract = {The neurophysiology of cells and tissues are monitored electrophysiologically and optically in diverse experiments and species, ranging from flies to humans. Understanding the brain requires integration of data across this diversity, and thus these data must be findable, accessible, interoperable, and reusable (FAIR). This requires a standard language for data and metadata that can coevolve with neuroscience. We describe design and implementation principles for a language for neurophysiology data. Our open-source software (Neurodata Without Borders, NWB) defines and modularizes the interdependent, yet separable, components of a data language. We demonstrate NWB’s impact through unified description of neurophysiology data across diverse modalities and species. NWB exists in an ecosystem, which includes data management, analysis, visualization, and archive tools. Thus, the NWB data language enables reproduction, interchange, and reuse of diverse neurophysiology data. More broadly, the design principles of NWB are generally applicable to enhance discovery across biology through data FAIRness.},
keywords = {Neurophysiology, data ecosystem, data language, data standard, FAIR data, archive},
journal = {eLife},
issn = {2050-084X},
publisher = {eLife Sciences Publications, Ltd}
}