# Raw data for: Bergmann, Sehara _et al._, 2022

The raw-data repository for the project "AirtrackMotorPlanning" (examining whether whisker and pupil positions predict the animal's motor plans).

The dataset stored here is part of the [root data repository](https://gin.g-node.org/larkumlab/Bergmann_2022_DataRepository).

Note that this dataset contains the un-aligned, unprocessed data. If you intend to look at the 'final' dataset (processed and aligned), refer to the [formatted dataset](https://gin.g-node.org/larkumlab/Ronny_Bergmann_AirtrackMotorPlanning_FormattedData).

### Contents

- [File organization](#file-organization)
- [Acquisition](#acquisition)
- [Behavioral state annotation](#behavioral-state-annotation)

## File organization

### File names

The files are organized as `<ANIMAL>/<SESSION>/<DOMAIN>/<ANIMAL>_<SESSION>_<DOMAIN>_run<TRIAL>`, where:

- `ANIMAL` is the ID of the animal (in the pattern `MLAxxxxxx`).
- `SESSION` is the name of the session (in the pattern `sessionYYYY-MM-DD-001`).
- `DOMAIN` is the domain of recording (see below).
- `TRIAL` is the trial index during the session.

### Domain names

| Domain name | Description |
| ----------- | ------------------------------------------------------------ |
| top | AVI files taken at 200 Hz using ZR View (a custom video acquisition software) under near-UV lighting, trial-based. See [acquisition](#acquisition) for more details. |
| states | CSV files containing manual annotation of behavioral states based on the top-view videos, trial-based. See [behavioral state annotation](#behavioral-state-annotation) for more details. |
| left-eye | AVI files of left-eye videos taken at 100 Hz using PylonPD (a custom video acquisition software) under IR lighting, trial-based. |
| right-eye | AVI files of right-eye videos taken at 100 Hz using PylonPD (a custom video acquisition software) under IR lighting, trial-based. |

## Acquisition

Below is a schematic of the acquisition. The "Start/Stop" signal from the Arduino Mega controls the timing of the beginning and the end of acquisition for each domain.

![schematics of trial-based acquisition](resources/trial-based-acquisition.png)

## Behavioral state annotation

### Names of states

The idea is to note the range of frames during which the subject performs a particular action:

1. Standing still
   - at the end of a corridor
     - before starting another trial (`AtEnd`)
     - waiting for the lick port to come (`Expect`)
   - at the junction between a corridor and the central arena (`AtCenter`)
   - midway along a corridor (`AtMidpoint`)
2. Moving backward in a corridor (`Backward`)
3. Moving forward in a corridor (`Forward`)
4. Turning in the central arena
   - to the left (`Left`)
   - to the right (`Right`)
5. Licking from the lick port (`Lick`)

### Criteria of transition

Transitions from `Backward` to `Left`, or vice versa, are among the most difficult decisions. Sina used the level of the subject's eyes as the reference point, and considered the animal to be "in the lane" (i.e. `Backward` or `Forward`) as long as its eyes were still inside the corridor.

Another difficulty is determining when the animal starts to move. You will have to decide case by case, based on e.g. the consistency of motion and position.
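When it is hard to call a transition from video playback alone, it can help to jump directly to the candidate frame. Below is a minimal Python sketch for doing so, assuming `opencv-python` is installed; the file path is a made-up example following the naming scheme above, and the frame index is arbitrary.

```python
# Inspect a specific frame of a top-view video when judging a transition.
# Requires opencv-python; the path below is a hypothetical example following
# the naming scheme described under "File organization".
import cv2

video_path = "MLA000001/session2022-01-01-001/top/MLA000001_session2022-01-01-001_top_run001.avi"  # hypothetical
frame_index = 4386  # 1-based, as used in the annotation CSVs

cap = cv2.VideoCapture(video_path)
if not cap.isOpened():
    raise IOError(f"could not open {video_path}")

# OpenCV counts frames from zero, whereas ZR View (and the annotations) count from one.
cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index - 1)
ok, frame = cap.read()
cap.release()

if not ok:
    raise IOError(f"could not read frame {frame_index}")

cv2.imwrite(f"frame_{frame_index:06d}.png", frame)  # save for visual inspection
```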
### Format

You can keep the annotation as a CSV file, consisting of entries like below:

| Trial | FromFrame | ToFrame | State | notes |
| ----- | --------- | ------- | -------- | ---------------------- |
| 1 | 1 | 4036 | AtEnd | |
| 1 | 4037 | 4385 | Backward | |
| 1 | 4386 | 4473 | Right | (ambiguous transition) |
| 1 | 4480 | 4702 | Forward | |
| ... | ... | ... | ... | ... |

1. You can enter multiple trials in one file, if you want. In any case, it is better to have a column holding the trial numbers.
2. Frames start from one. This is in accordance with how ZR View works.
3. Each entry must contain both the start (inclusive) and the stop (inclusive) frame indexes. Because it is sometimes hard to determine exact transitions, you may skip some ambiguous frames during annotation, leaving gaps between entries (as between frames 4473 and 4480 in the example above; a reading sketch at the end of this file shows one way to handle such gaps).

----

Copyright (c) 2022, [Ronny Bergmann](https://orcid.org/0000-0002-1477-7502), [Keisuke Sehara](https://orcid.org/0000-0003-4368-8143), Sina E. Dominiak, [Julien Colomb](https://orcid.org/0000-0002-3127-5520), [Jens Kremkow](https://orcid.org/0000-0001-7077-4528), [Matthew E. Larkum](https://orcid.org/0000-0001-9799-2656), [Robert N. S. Sachdev](https://orcid.org/0000-0002-6627-0199), [CC-BY 4.0](https://creativecommons.org/licenses/by/4.0/)
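For reference, here is a minimal sketch (not part of the repository tooling) of how a `states` CSV in the format above could be expanded into a per-frame label array with `pandas` and `numpy`. It assumes the column names shown in the example table; frames falling into unannotated gaps are left as `None`, and the file name is a hypothetical example.

```python
# Expand a trial's interval-based state annotation into one label per video frame.
# Assumes pandas/numpy are installed and the CSV uses the columns shown above
# (Trial, FromFrame, ToFrame, State); the file name is a hypothetical example.
import numpy as np
import pandas as pd

states = pd.read_csv("MLA000001_session2022-01-01-001_states_run001.csv")  # hypothetical
trial = states[states["Trial"] == 1]

n_frames = int(trial["ToFrame"].max())          # frames are 1-based and inclusive
labels = np.full(n_frames, None, dtype=object)  # skipped (unannotated) frames stay None

for row in trial.itertuples():
    # Convert the 1-based, inclusive [FromFrame, ToFrame] interval to 0-based slicing.
    labels[row.FromFrame - 1:row.ToFrame] = row.State

print(pd.Series(labels).value_counts(dropna=False))
```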