The data repository for Sehara et al., 2021 eNeuro, real-time DeepLabCut (Pose-Trigger) project.

Repository contents:

01_RawVideos
02_Spike2Recordings
03_DLCProjects
04_PostHocEstimation
05_PerformanceProfiling
LICENSE
README.md
datacite.yml
setup.png

README.md

Data for Sehara et al., 2021 eNeuro (the real-time DeepLabCut project)

The data repository for: Sehara K, Zimmer-Harwood P, Larkum ME, Sachdev RNS (2021) Real-time closed-loop feedback in behavioral time scales using DeepLabCut. ENEURO.0415-20.2021.

Acquisition setup

(The image is licensed under CC-BY 4.0, 2020 Keisuke Sehara)

The head-fixed mouse was allowed to whisk freely under infrared illumination. The behavior of the mouse was captured from above, using an ImagingSource DMK37BUX287 camera.

The Pose-Trigger program was used to capture images and to generate output triggers.

Datasets

You can find more details about each dataset in the README file of the corresponding subfolder.

  1. Raw videos: the raw videos acquired using Pose-Trigger. The files have been converted to HDF5 format (from the original NumPy files).
  2. Spike2 recordings: recordings of the frame and trigger signals during Pose-Trigger acquisition, made using Spike2. The files have been converted to HDF5 format (from the original .smrx files).
  3. DeepLabCut projects: the DeepLabCut (v2.1) projects used in the study.
  4. Post hoc pose estimation: the post-hoc pose-estimation data, for comparison with the real-time data. The files are in HDF5 format (in a structure different from the "original" PyTables format that DeepLabCut generates).
  5. Performance profiling: the data and analytical procedures (and some figures) used to profile the speed and accuracy of Pose-Trigger.
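Since several of the datasets above are distributed as HDF5 files whose internal layout differs from the formats the original tools produce, a generic way to explore them is to walk the HDF5 tree. The sketch below uses h5py to list every dataset in a file; the file path and any group/dataset names are placeholders, not names taken from this repository — consult each subfolder's README for the actual layout.

```python
# Minimal sketch for inspecting the HDF5 files in this repository.
# Requires h5py (pip install h5py). The example path below is a
# placeholder; substitute a real file from one of the subfolders.
import h5py


def list_hdf5_contents(path):
    """Return (name, shape, dtype) for every dataset in an HDF5 file."""
    entries = []

    def visit(name, obj):
        # visititems walks the whole tree; keep only datasets, not groups.
        if isinstance(obj, h5py.Dataset):
            entries.append((name, obj.shape, obj.dtype))

    with h5py.File(path, "r") as f:
        f.visititems(visit)
    return entries


if __name__ == "__main__":
    for name, shape, dtype in list_hdf5_contents("example.h5"):
        print(f"{name}: shape={shape}, dtype={dtype}")
```

Once the dataset names are known, individual arrays can be read with `f["name"][...]` and compared across the real-time and post hoc recordings.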

License

Copyright (c) 2020 Keisuke Sehara, Paul Zimmer-Harwood, Matthew E. Larkum, and Robert N.S. Sachdev, Creative Commons Attribution 4.0 International (CC-BY 4.0).

datacite.yml
Title Data for Sehara et al., 2021 eNeuro (the real-time DeepLabCut project)
Authors Sehara,Keisuke;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0003-4368-8143
Zimmer-Harwood,Paul;Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3PT, United Kingdom.
Colomb,Julien;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-3127-5520
Larkum,Matthew E.;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-6627-0199
Sachdev,Robert N.S.;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-3127-5520
Description Computer vision approaches have made significant inroads into offline tracking of behavior and estimating animal poses. In particular, because of their versatility, deep-learning approaches have been gaining attention in marker-free behavioral tracking. Here we developed an approach using DeepLabCut for real-time estimation of movement. We trained a deep neural network offline with high-speed video data of a mouse whisking, then transferred the trained network to work with the same mouse, whisking in real time. With this approach, we tracked the tips of three whiskers in an arc and converted positions into a TTL output within behavioral time scales, i.e., 10.5 milliseconds. With this approach it is possible to trigger output based on the movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one we have deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity.
License Creative Commons Attribution 4.0 International (https://creativecommons.org/licenses/by/4.0/)
References Sehara K, Zimmer-Harwood P, Larkum ME, Sachdev RNS (2021) Real-time closed-loop feedback in behavioral time scales using DeepLabCut. [doi:10.1523/eneuro.0415-20.2021] (IsSupplementTo)
Funding EU, EU.670118
EU, EU.327654276
EU, EU.720270
EU, EU.785907
EU, EU.945539
DFG, DFG.250048060
DFG, DFG.246731133
DFG, DFG.267823436
Keywords Neuroscience
Behavioral tracking
Closed-loop experiment system
Resource Type Dataset