@@ -14,11 +14,11 @@ Large-scale electrophysiological recordings from V1, V4 and IT in two macaques i

N.B. The THINGS stimuli from Martin Hebart et al. are not provided, but you can download them from [things-initiative.org](http://things-initiative.org)

-### Info
+## Info

A full description of the dataset is provided in the Neuron paper. A few additional details are needed to work with the data.

-**RAW**
+### RAW

Data for each monkey is provided in a folder containing both the RAW and MUA data. The RAW data is subdivided into different days of recordings, and into individual blocks/runs of ~20 minutes in length. We provide the MATLAB code to extract the MUA from the RAW data, aggregate all the trials across blocks and days, normalize it, and filter and chunk it for model training. These scripts can easily be changed to extract LFP, to aggregate the RAW data, or to look into non-completed images.

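The normalization itself is defined by the MATLAB scripts in "_code"; purely as an illustrative sketch of the kind of operations involved (baseline z-scoring and time-window averaging, with made-up array shapes and window size, not the paper's exact procedure):

```python
import numpy as np

# Illustrative only: the repository's MATLAB scripts define the real pipeline.
rng = np.random.default_rng(0)
mua = rng.random((100, 8, 300))   # assumed order: trials x channels x time-points
tb = np.arange(-100, 200)         # time in ms w.r.t. stimulus onset

# z-score each channel against its pre-stimulus baseline
base = mua[:, :, tb < 0]
mu = base.mean(axis=(0, 2), keepdims=True)
sd = base.std(axis=(0, 2), keepdims=True)
norm = (mua - mu) / sd

# average into non-overlapping 25 ms windows
n_win = norm.shape[-1] // 25
windowed = norm[:, :, : n_win * 25].reshape(norm.shape[0], norm.shape[1], n_win, 25).mean(-1)
print(windowed.shape)             # (100, 8, 12)
```
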
@@ -30,7 +30,8 @@ These scripts can be found in "_code" and are (in sequence):

The MUA data is provided both un-normalized ("THINGS_MUA_trials.mat") and normalized and averaged in time windows ("THINGS_normMUA.mat").

-**MUA**
+### MUA
+#### (if interested in the full time-course or decoding)

THINGS_MUA_trials.mat contains:

@@ -39,7 +40,8 @@ THINGS_MUA_trials.mat contains:

- tb: time, in ms, w.r.t. the stimulus onset, corresponding to the elements in "time-points" in "ALLMUA".

-**normalized MUA**
+### normalized MUA
+#### (if interested in modeling and tuning)

THINGS_normMUA.mat contains:

@@ -54,7 +56,7 @@ THINGS_normMUA.mat contains:

- tb: time, in ms, w.r.t. the stimulus onset, corresponding to the elements in "time-points" in "ALLMUA".

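In Python, a .mat file like these can be read with scipy.io.loadmat. A minimal sketch, using a tiny synthetic stand-in file (the file name, array sizes, and the trials x channels x time-points dimension order of ALLMUA are assumptions for illustration; check the shapes in the real files):

```python
import numpy as np
from scipy.io import loadmat, savemat

# Tiny synthetic stand-in for a MUA file (real fields include ALLMUA and tb)
savemat("demo_MUA.mat", {
    "ALLMUA": np.random.rand(10, 4, 300),  # assumed trials x channels x time-points
    "tb": np.arange(-100, 200),            # ms w.r.t. stimulus onset
})

d = loadmat("demo_MUA.mat")
mua, tb = d["ALLMUA"], d["tb"].ravel()

# average the response in a post-stimulus window, e.g. 25-125 ms
win = (tb >= 25) & (tb < 125)
evoked = mua[:, :, win].mean(axis=2)       # trials x channels
print(evoked.shape)                        # (10, 4)
```
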
-**Stimuli**
+### Stimuli

The scripts and data rely on logfiles hosted in "_logs". There, you can also find "things_imgs.mat", which is required to associate each stimulus from the THINGS initiative database with the specific trial (see below). things_imgs.mat contains:

@@ -63,7 +65,7 @@ The scripts and data rely on logfiles hosted in "_logs". There, you can also fin

N.B. the normalized data is already sorted according to the order of images in "train_imgs" and "test_imgs" from "things_imgs.mat"

-**Code**
+### Code

In addition to the scripts mentioned above, we provide the MATLAB APIs from Blackrock Neurotech, in the same version used for the paper. We also provide a few utility functions that are called by the main scripts or can be used to plot some of the results. Finally, we provide the Python code for the MEIs in "lucent-things", based on the lucent visualization library ([GitHub](https://github.com/greentfrapp/lucent)).

@@ -108,7 +110,13 @@ Detailed description of the gin client can be found at the [gin wiki](https://gi

### Using the web browser

-Download the files you want by clicking download in the gin web interface. Convenience summary tables of the data and sessions can be found in the summary above.
+Download the files you want by clicking download in the gin web interface.
+
+If you are interested in modeling/tuning, you can download the normalized MUA for each monkey directly:
+
+monkey N: https://gin.g-node.org/paolo_papale/TVSD/raw/master/monkeyN/THINGS_normMUA.mat
+
+monkey F: https://gin.g-node.org/paolo_papale/TVSD/raw/master/monkeyF/THINGS_normMUA.mat
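For scripted downloads, the same raw URLs can be fetched with any HTTP client; a minimal Python sketch (the helper name and the output file name are our own, only the base URL and file paths come from the links above):

```python
from urllib.request import urlretrieve  # stdlib; no gin client needed

BASE = "https://gin.g-node.org/paolo_papale/TVSD/raw/master"

def norm_mua_url(monkey):
    """Direct URL of the normalized MUA file for monkey 'N' or 'F'."""
    return f"{BASE}/monkey{monkey}/THINGS_normMUA.mat"

# e.g. urlretrieve(norm_mua_url("N"), "THINGS_normMUA_monkeyN.mat")
print(norm_mua_url("F"))
```
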

## Citation policy
