README.md

Profiling of timings during real-time behavioral sessions

Using a CED Power1401 and Spike2, we recorded the timing of each frame capture and of each trigger output.

1. Analyzed data

The latency_data.h5 HDF5 file contains information on each video.

1-1. Code

The code used to generate this dataset is found in 01_data-extraction.ipynb, and the panels generated during this procedure are in the figures subdirectory.

1-2. Entries in latency_data.h5

There are three subgroups, frame_intervals, on_latency and off_latency, corresponding to the quantities analyzed. All the subgroups have the same structure.

Each video comprises a list of values and forms one dataset entry under each subgroup; entries are numbered 001, 002, etc.

The attributes of each entry contain information about the video, such as the following (a minimal reading sketch follows the list):

  • subject: name of the animal.
  • session: name of the behavioral session.
  • run: the index corresponding to the run of Spike2 recording.
  • epoch: the index corresponding to the run of video recording during the Spike2 recording session.
  • has_trigger: whether or not this session involved real-time trigger-output generation.
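
For example, the per-video entries can be read along the following lines. This is a minimal sketch only: h5py is one common way to read HDF5 files and is not prescribed by this repository; the group, entry and attribute names are taken from the description above.

```python
# Minimal reading sketch for latency_data.h5 (layout as described in
# this README; the use of h5py is an assumption).
import h5py

with h5py.File("latency_data.h5", "r") as f:
    for quantity in ("frame_intervals", "on_latency", "off_latency"):
        for name, entry in f[quantity].items():  # entries "001", "002", ...
            values = entry[()]                   # list of values for one video
            meta = dict(entry.attrs)             # subject, session, run, epoch, has_trigger
            print(quantity, name, meta.get("subject"), values.shape)
```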

2. Summary data

Several types of summary data are found here (a loading sketch follows the list):

  • 02_summary.ipynb is the Jupyter notebook used to summarize the data in latency_data.h5.
  • latency_summary_26sessions.json contains the distribution of values during each video acquisition.
  • latency_stats.tsv contains the distribution of each statistic across sessions.
  • summary.tsv should be identical to the table we used in the paper.
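
For example, these files can be loaded as follows. This is a minimal sketch only: the use of pandas is an assumption, and the column names are not specified in this README.

```python
# Minimal loading sketch for the summary files (pandas usage is an
# assumption; column names are not given in this README).
import json

import pandas as pd

stats = pd.read_csv("latency_stats.tsv", sep="\t")  # per-statistic distributions across sessions
table = pd.read_csv("summary.tsv", sep="\t")        # the table as used in the paper

with open("latency_summary_26sessions.json") as src:
    per_video = json.load(src)                      # value distributions per video acquisition
```
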
datacite.yml
Title Data for Sehara et al., the real-time DeepLabCut project
Authors Sehara,Keisuke;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0003-4368-8143
Zimmer-Harwood,Paul;Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3PT, United Kingdom.
Colomb,Julien;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-3127-5520
Larkum,Matthew E.;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-6627-0199
Sachdev,Robert N.S.;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-3127-5520
Description Computer vision approaches have made significant inroads into offline tracking of behavior and estimation of animal poses. In particular, because of their versatility, deep-learning approaches have been gaining attention for markerless behavioral tracking. Here we developed an approach using DeepLabCut for real-time estimation of movement. We trained a deep neural network offline with high-speed video data of a mouse whisking, then transferred the trained network to work with the same mouse, whisking, in real time. With this approach, we tracked the tips of three whiskers in an arc and converted their positions into a TTL output within behavioral time scales, i.e., 10.5 milliseconds. It is thus possible to trigger output based on the movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one we have deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity.
License Creative Commons 4.0 Attribution (https://creativecommons.org/licenses/by/4.0/)
References Sehara K, Zimmer-Harwood P, Larkum ME, Sachdev RNS (2021) Real-time closed-loop feedback in behavioral time scales using DeepLabCut. [] (IsSupplementTo)
Funding EU, EU.670118
EU, EU.327654276
EU, EU.720270
EU, EU.785907
EU, EU.945539
DFG, DFG.250048060
DFG, DFG.246731133
DFG, DFG.267823436
Keywords Neuroscience
Behavioral tracking
Closed-loop experiment system
Resource Type Dataset