
README.md

Real-time accuracy of trigger output generation

For each behavioral session with real-time trigger generation, we created a single DeepLabCut project dedicated to post hoc estimation of whisker-tip positions, which served as the "ground-truth" data.

Based on this ground-truth dataset, we computed the conditional probability of trigger generation given the whisker position, in order to characterize the accuracy of the trigger output signals.
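As a rough illustration of this computation (the exact procedure is in 01_data-formatting.ipynb; the numbers below are made up), the conditional probability at a position can be estimated as the fraction of the dwell time at that position during which the trigger was active:

```python
import numpy as np

# Hypothetical example data: per-frame whisker-tip position (post hoc estimate)
# and a boolean trace marking frames where the real-time trigger was active.
positions = np.random.normal(loc=120.0, scale=15.0, size=10_000)  # px, made up
trigger = positions > 130.0                                       # made-up trigger rule

bins = np.linspace(positions.min(), positions.max(), 50)

# Dwell-time histograms: total time at each position, and time with trigger active.
all_counts, _ = np.histogram(positions, bins=bins)
triggered_counts, _ = np.histogram(positions[trigger], bins=bins)

# Conditional probability of trigger generation, given the whisker position.
with np.errstate(invalid="ignore", divide="ignore"):
    p_trigger_given_position = triggered_counts / all_counts
```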

1. Analyzed data

The analyzed-data.h5 HDF5 file contains information on each video.

1-1. Code

The code to generate this dataset is found in 01_data-formatting.ipynb, and the panels generated during the procedure are in:

  • F01_trace-comparison: comparison of traces between real-time and post-hoc estimation.
  • F02_densities: plots of dwell-time histograms used in the paper.
  • F03_conditional-probability: plots of conditional probability densities used in the paper.

1-2. Entries in analyzed-data.h5

Each video corresponds to one entry under the root, numbered 001, 002, etc.

Attributes

The attributes of each entry contain information about the video, such as:

  • subject: name of the animal.
  • session: name of the behavioral session.
  • run: the index / starting time stamp of the video.
  • expression: the expression used to turn the positions of whisker tips into the status of trigger output.
  • px_per_mm: the scale information computed from the video.
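A minimal sketch of reading these attributes with h5py is shown below (entry and attribute names follow the description above; this is an illustration, not the notebook's exact code):

```python
import h5py

with h5py.File("analyzed-data.h5", "r") as datafile:
    for name, entry in datafile.items():  # entries "001", "002", ...
        print(name,
              entry.attrs["subject"],
              entry.attrs["session"],
              entry.attrs["run"],
              entry.attrs["expression"],
              entry.attrs["px_per_mm"])
```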

Subgroups

  1. "pose" group store the estimation of each whisker-tip positions. It contains two subgroups corresponding to realtime and posthoc estimations.
  2. "evaluation" group store the result of evaluation. Traces in realtime and posthoc store the values of expression at given time. The trigger data holds the output of the evaluation.
  3. "densities" group stores the dwell-time histograms generated using the kernel-density estimation method.
    • kernel: its attributes describe the Gaussian kernel (in pixels) used for kernel-density estimation.
    • positions: the list of positions (in pixels) used to estimate the dwell-time density.
    • all: the values of dwell-time densities, during the whole recording period of the video, based on the post hoc estimation.
    • triggered: the values of dwell-time densities, when the real-time trigger was active, based on the post hoc estimation.
  4. "conditional" group stores the information on conditional probability of trigger generation.
    • positions: the list of positions (in pixels) used to compute conditional probability.
    • probability: the values of the conditional probability at each position.
    • sigmoid: its attributes contain the parameters (in pixels) of the sigmoid curve (the cumulative Gaussian) fitted to the conditional-probability trace.
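A hedged sketch of navigating these subgroups with h5py, assuming the dataset names match the description above (whether kernel and sigmoid carry their information as attributes of sub-items, as assumed here, may differ in the actual file):

```python
import h5py

with h5py.File("analyzed-data.h5", "r") as datafile:
    entry = datafile["001"]  # first video

    # 1. pose: real-time vs. post hoc whisker-tip estimation (two subgroups)
    realtime_pose = entry["pose/realtime"]
    posthoc_pose = entry["pose/posthoc"]

    # 2. evaluation: expression values over time and the resulting trigger output
    realtime_values = entry["evaluation/realtime"][()]
    posthoc_values = entry["evaluation/posthoc"][()]
    trigger_output = entry["evaluation/trigger"][()]

    # 3. densities: dwell-time densities over the listed positions
    density_positions = entry["densities/positions"][()]
    density_all = entry["densities/all"][()]
    density_triggered = entry["densities/triggered"][()]
    kernel_info = dict(entry["densities/kernel"].attrs)  # Gaussian kernel, in px (assumed layout)

    # 4. conditional: conditional probability and the fitted sigmoid parameters
    conditional_positions = entry["conditional/positions"][()]
    conditional_probability = entry["conditional/probability"][()]
    sigmoid_fit = dict(entry["conditional/sigmoid"].attrs)  # fit parameters, in px (assumed layout)
```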

2. Summary data

Several types of summary data are found here:

  • 02_summary.ipynb is the Jupyter notebook used to summarize the data in analyzed-data.h5.
  • F04_summary contains the panels generated by the above notebook (and used in the paper).
  • stats.md stores the statistical information analyzed in the above notebook.
datacite.yml
Title Data for Sehara et al., the real-time DeepLabCut project
Authors Sehara,Keisuke;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0003-4368-8143
Zimmer-Harwood,Paul;Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford OX1 3PT, United Kingdom.
Colomb,Julien;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID: 0000-0002-3127-5520
Larkum,Matthew E.;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-6627-0199
Sachdev,Robert N.S.;Institut für Biologie, Humboldt Universität zu Berlin, Berlin, 10117 Germany.;ORCID:0000-0002-3127-5520
Description Computer vision approaches have made significant inroads into offline tracking of behavior and estimating animal poses. In particular, because of their versatility, deep-learning approaches have been gaining attention in behavioral tracking without any markers. Here we developed an approach using DeepLabCut for real-time estimation of movement. We trained a deep neural network offline with high-speed video data of a mouse whisking, then transferred the trained network to work with the same mouse, whisking in real-time. With this approach, we tracked the tips of three whiskers in an arc and converted positions into a TTL output within behavioral time scales, i.e., 10.5 milliseconds. With this approach it is possible to trigger output based on movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one we have deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity.
License Creative Commons 4.0 Attribution (https://creativecommons.org/licenses/by/4.0/)
References Sehara K, Zimmer-Harwood P, Larkum ME, Sachdev RNS (2021) Real-time closed-loop feedback in behavioral time scales using DeepLabCut. [] (IsSupplementTo)
Funding EU, EU.670118
EU, EU.327654276
EU, EU.720270
EU, EU.785907
EU, EU.945539
DFG, DFG.250048060
DFG, DFG.246731133
DFG, DFG.267823436
Keywords Neuroscience
Behavioral tracking
Closed-loop experiment system
Resource Type Dataset