Keisuke Sehara 8675bebe9e add raw videos back | 3 years ago

| Name | Last commit |
|---|---|
| MLA-041630 | 3 years ago |
| S005-19 | 3 years ago |
| S006-19 | 3 years ago |
| SNA-079258 | 3 years ago |
| SNA-079259 | 3 years ago |
| SNA-079260 | 3 years ago |
| every10 | 3 years ago |
| LICENSE | 3 years ago |
| README.md | 3 years ago |
Raw videos, stored in the HDF5 format, for the real-time DeepLabCut paper.

Each directory (except for `every10`) corresponds to the ID of an animal (all are male C57BL/6 adults). `train` sessions were acquired at 100 Hz without body-part estimation by DeepLabCut; the frames from these videos were used to train deep neural network models. Body-part positions were estimated during `test` sessions, using the model specifically trained for that animal.

How body-part positions were evaluated for output generation is noted as the `evaluation` attribute of the root entry of each HDF5 file.
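The root attribute can be read with `h5py`. A minimal sketch, assuming only the layout described above (the file name is illustrative, not an actual file in this repository); the round-trip on a scratch file just demonstrates where the attribute lives:

```python
import h5py

def read_evaluation(path):
    """Return the 'evaluation' attribute of the root entry of an HDF5 file."""
    with h5py.File(path, "r") as f:
        value = f.attrs.get("evaluation")
        # depending on how the attribute was written, h5py may return bytes
        return value.decode() if isinstance(value, bytes) else value

# Illustrative round-trip with a scratch file; the real files live in the
# animal directories (e.g. MLA-041630), with names not shown here.
with h5py.File("scratch.h5", "w") as f:
    f.attrs["evaluation"] = "example-mode"

print(read_evaluation("scratch.h5"))
```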
The `every10` directory corresponds to a condition in which acquisition ran without any animal and the trigger output was flipped after every 10 frames. This condition was used to measure the timestamp-based inter-frame intervals during acquisition with DeepLabCut-based pose estimation; it can also be used to validate trigger-output latency.
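Given per-frame timestamps, the inter-frame intervals are simply the first differences. A hedged sketch with made-up timestamp values (the actual location of timestamps inside these HDF5 files is not specified here):

```python
import numpy as np

# Illustrative per-frame timestamps in seconds (hypothetical values, not
# taken from the dataset); roughly a 100 Hz acquisition with jitter.
timestamps = np.array([0.0, 0.0105, 0.0209, 0.0316])

# Inter-frame intervals in milliseconds: first differences of the timestamps.
intervals_ms = np.diff(timestamps) * 1000.0
print(intervals_ms)
```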
datacite.yml

| Field | Value |
|---|---|
| Title | Data for Sehara et al., 2021 eNeuro (the real-time DeepLabCut project) |
| Authors | Sehara, Keisuke (Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany; ORCID: 0000-0003-4368-8143); Zimmer-Harwood, Paul (Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3PT, United Kingdom); Colomb, Julien (Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany; ORCID: 0000-0002-3127-5520); Larkum, Matthew E. (Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany; ORCID: 0000-0002-6627-0199); Sachdev, Robert N.S. (Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany; ORCID: 0000-0002-3127-5520) |
| Description | Computer vision approaches have made significant inroads into offline tracking of behavior and estimating animal poses. In particular, because of their versatility, deep-learning approaches have been gaining attention in behavioral tracking without any markers. Here we developed an approach using DeepLabCut for real-time estimation of movement. We trained a deep neural network offline with high-speed video data of a mouse whisking, then transferred the trained network to work with the same mouse, whisking in real-time. With this approach, we tracked the tips of three whiskers in an arc and converted positions into a TTL output within behavioral time scales, i.e., 10.5 milliseconds. With this approach it is possible to trigger output based on movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one we have deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity. |
| License | Creative Commons 4.0 Attribution (https://creativecommons.org/licenses/by/4.0/) |
| References | Sehara K, Zimmer-Harwood P, Larkum ME, Sachdev RNS (2021) Real-time closed-loop feedback in behavioral time scales using DeepLabCut. doi:10.1523/eneuro.0415-20.2021 (IsSupplementTo) |
| Funding | EU, EU.670118; EU, EU.327654276; EU, EU.720270; EU, EU.785907; EU, EU.945539; DFG, DFG.250048060; DFG, DFG.246731133; DFG, DFG.267823436 |
| Keywords | Neuroscience; Behavioral tracking; Closed-loop experiment system |
| Resource Type | Dataset |