
try to fix broken links

Keisuke Sehara 4 years ago
parent
commit
1ed4bf912c
1 changed file with 13 additions and 27 deletions

+ 13 - 27
README.md

@@ -5,23 +5,21 @@ to the Human Brain Project Collab.
 
 ### Contents
 
-- [How this dataset was acquired](README.md#desc-root)
-  - [Setup](README.md#setup-desc)
-  - [Task](README.md#task-desc)
-  - [Acquisition and basic analysis](README.md#acquisition-desc)
-    - [Raw videos](README.md#raw)
-    - [Behavioral states](README.md#states)
-    - [Body-part tracking using UV paint](README.md#tracking)
-- [How this dataset can be read](README.md#walkthrough)
-- [Repository information](README.md#repository-root)
+- [How this dataset was acquired](#how-this-dataset-was-acquired)
+  - [Setup](#setup)
+  - [Task](#task)
+  - [Acquisition and basic analysis](#acquisition-and-basic-analysis)
+    - [Raw videos](#raw-videos)
+    - [Behavioral states](#behavioral-states)
+    - [Body-part tracking using UV paint](#body-part-tracking-using-uv-paint)
+- [How this dataset can be read](#how-this-dataset-can-be-read)
+- [Repository information](#repository-information)
   - [Authors](README.md#authors)
   - [License](README.md#license)
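
The fix above replaces explicit `<a link>` anchors with the anchors GitHub-flavored Markdown generates from the heading text itself (lowercased, punctuation dropped, spaces turned into hyphens). A rough sketch of that slug rule, which approximates but does not exactly reproduce GitHub's implementation:

```python
import re

def github_slug(heading):
    """Approximate GitHub's heading-to-anchor rule:
    lowercase, drop punctuation, replace whitespace with hyphens."""
    slug = heading.strip().lower()
    slug = re.sub(r"[^\w\s-]", "", slug)   # drop punctuation
    return re.sub(r"\s+", "-", slug)       # whitespace -> hyphens
```

For example, the heading "Body-part tracking using UV paint" yields the anchor `#body-part-tracking-using-uv-paint`, matching the links in the table of contents above.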
 
-<a link="#desc-root" />
 
-## How this dataset was acquired
 
-<a link="#setup-desc" />
+## How this dataset was acquired
 
 ### Setup
 
@@ -33,7 +31,7 @@ We used the Airtrack platform ([Nashaat et al., 2016](https://doi.org/10.1152/jn
 
 The position of the floating plus maze was monitored using an Arduino microcontroller, which also controlled the behavioral task. For more details on the setup, please refer to https://www.neuro-airtrack.com .
 
-<a link="#task-desc" />
+
 
 ### Task
 
@@ -52,22 +50,16 @@ The water-restricted, head-fixed mouse performed a simple plus-maze task:
 
 
 
-<a link="#acquisition-desc" />
-
 ### Acquisition and basic analysis
 
 <img src="images/acquisition.png" alt="Sample images from high-speed video acquisition" style="zoom:50%;" />
 
 A high-speed camera captured the whisking behavior of the animal.
 
-<a link="#raw" />
-
 #### Raw videos
 
 The raw videos may be found in the `videos` domain of the `raw` dataset.
 
-<a link="#states" />
-
 #### Behavioral states
 
 For some videos of the dataset, the behavioral states of the animal were manually annotated:
@@ -84,8 +76,6 @@ The data was stored as CSV files (the `states` domain of the `tracking` dataset)
 
 Note that the frame numbers in the videos are _one_-based, i.e. the first frame of the video is referred to as frame 1.
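
A minimal sketch of loading such an annotation file while converting the one-based frame numbers to zero-based indices for video/array access; the column names `frame` and `state` are illustrative assumptions, not a documented part of the CSV format:

```python
import csv

def load_states(path):
    """Read a behavioral-state CSV into a dict keyed by zero-based
    frame index (column names are assumed for illustration)."""
    states = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # frame 1 in the annotation corresponds to index 0 in the video
            states[int(row["frame"]) - 1] = row["state"]
    return states
```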
 
-<a link="#tracking" />
-
 #### Body-part tracking using UV paint
 
 To facilitate tracking, a small amount of UV paint (["UV glow"](http://www.uvglow.co.uk/), to be precise) was applied to the body parts of interest. Some mice received paint only on their whiskers, while others had their noses painted as well (**Left** in the figure above).
@@ -102,7 +92,7 @@ In brief, the procedures were as follows:
 3. Within each ROI in every frame, pixels that had the specified hue (i.e. color-balance) values were collected, and the luminance-weighted average of their positions was defined as the tracked position of the body part.
 4. The tracked positions were stored as a CSV file (see the `tracked` domain of the `tracking` dataset).
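
The hue-filtering and weighted-average step (step 3) can be sketched as follows; the HSV layout and hue thresholds here are illustrative assumptions, not the values used in the original analysis:

```python
import numpy as np

def weighted_centroid(roi_hsv, hue_lo, hue_hi):
    """Track a painted body part inside an ROI.

    roi_hsv: (H, W, 3) array of hue/saturation/value channels (assumed layout).
    hue_lo, hue_hi: hue range of the UV paint (illustrative thresholds).
    Returns (x, y) as the luminance-weighted mean position of the pixels
    whose hue falls in range, or None when no pixel matches.
    """
    hue = roi_hsv[..., 0]
    value = roi_hsv[..., 2]                    # luminance-like channel, used as weight
    mask = (hue >= hue_lo) & (hue <= hue_hi)   # pixels with the paint's hue
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    w = value[mask]
    return (np.average(xs, weights=w), np.average(ys, weights=w))
```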
 
-<a link="#walkthrough" />
+
 
 ## How this dataset can be read
 
@@ -110,18 +100,14 @@ We prepared the [helper.py](helper.py) python module for easing the dataset read
 
 General information about the Jupyter program can be found [here](https://jupyter.org/).
 
-<a link="#repository-root" />
 
-## Repository information
 
-<a link="#authors" />
+## Repository information
 
 ### Authors
 
 Please refer to [REPOSITORY.json](REPOSITORY.json).
 
-<a link="#license" />
-
 ### License
 
 [Creative Commons CC0 1.0 Universal](https://creativecommons.org/publicdomain/zero/1.0/)