# Hladnik & Grewe: Receptive field sizes and neuronal encoding bandwidth are constrained by axonal conduction delays.

This repository contains the raw data of neuronal activity recorded in P-type electroreceptors of the electric fish *Apteronotus leptorhynchus*, the derived data (i.e. the analysis results), and the Python scripts that run the analyses, the simulations, and the plotting.

## ``raw_data`` folder

This folder contains the raw datasets. Data and metadata are stored in the open [NIX](https://github.com/g-node/nix) container format. To read these data files you need the [``nixio``](https://github.com/g-node/nixpy) Python library; for a more convenient approach use the [``rlxnix``](https://github.com/relacs/relacsed_nix) Python package. The files usually contain the raw membrane potential recording, the detected action potentials, and the animal's electric organ discharge (EOD), together with the respective metadata. A minimal reading sketch is given at the end of this README.

*Note:* Some datasets are older and have been used in previous analyses; their structure deviates slightly from that of the newer files.

## ``derived_data`` folder

This folder contains the analysis results. The contained files will be overwritten when re-running the analysis scripts.

* csv files are written with ``pandas`` using the semicolon as delimiter. To read them, use something like ``df = pd.read_csv(filename, sep=";", index_col=0)`` (see also the sketch after the Scripts section below).
* npz files are compressed ``numpy`` files.

## ``stimuli`` folder

Contains a single file, the white-noise stimulus waveform.

## ``file_lists`` folder

Two simple text files that are used by the analysis scripts:

1. The datasets in which the receptive field was measured and from which the baseline response properties are extracted.
2. The datasets in which the white-noise responses were measured.

## ``figures`` folder

Contains some png files that are embedded into the final figures. This folder is also the default output folder for the created figures.

## ``code`` package

Python files performing the analyses, simulations, and plotting. There are three sub-packages:

1. ``analyses``: The data analysis scripts working on the raw data
2. ``plots``: Plotting scripts
3. ``simulations``: LIF model simulations

## Scripts

To run the analyses or the simulations, or to plot the figures, use one of the three scripts:

1. ``run_analyses.py``: command line tool for running all data analyses. (**Note: the analysis of the heterogeneous populations will take a while!**)
2. ``run_simulations.py``: command line tool to run the LIF simulations. (**Note: this will take a while!**)
3. ``plot_figures.py``: command line script that plots the figures of the publication.

The first two take only a single optional argument, the number of parallel processes that should be spawned for the respective task. E.g.

```
python3 run_simulations.py --help  # to show the help text
python3 run_simulations.py -j 12   # run the simulations using 12 parallel processes
```

The plot tool expects as its first argument the sub-command (i.e. the figure) that should be run. Each of these sub-commands defines a few more arguments; using the default arguments should work.

```
python3 plot_figures.py --help  # to list all available sub-commands
```

For example:

```
python3 plot_figures.py figure1 -h  # to show the options for figure 1
python3 plot_figures.py figure1 -n  # to plot figure 1 and show it
```

If the ``-n`` argument is given, the figure will be shown but not saved. If the option is omitted, the figure will be saved to the ``figures`` folder but not shown. The default arguments should lead to the same figures as in the paper.
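After running the analyses, the results end up in the ``derived_data`` folder. A minimal sketch for reading them back in Python, assuming placeholder file names (the actual names depend on the analysis that was run):

```python
import numpy as np
import pandas as pd

# csv results are written with a semicolon as delimiter and an index column
df = pd.read_csv("derived_data/example_results.csv", sep=";", index_col=0)  # placeholder file name
print(df.head())

# npz files are compressed numpy archives; arrays are accessed by their stored keys
with np.load("derived_data/example_results.npz") as archive:  # placeholder file name
    print(archive.files)            # list the array names stored in the archive
    # data = archive["some_array"]  # access an array once the key is known (hypothetical key)
```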
## Dependencies

The scripts depend on the following packages (installed versions):

* python (3.8.10)
* numpy (1.19.2)
* pandas (1.1.0)
* scipy (1.7.0)
* nixio (1.5.2)
* rlxnix (0.6.10, see the [installation options](https://github.com/relacs/relacsed_nix))
* joblib (0.16.0)
* sklearn (0.23.1)
* matplotlib (3.4.2)
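As mentioned above, the raw datasets can be read with the ``nixio`` library. A minimal sketch for opening a file and listing its contents, assuming a placeholder file name (the exact block and data-array layout depends on the dataset):

```python
import nixio

# open a raw dataset read-only; the file name is a placeholder
nix_file = nixio.File.open("raw_data/example_dataset.nix", nixio.FileMode.ReadOnly)

for block in nix_file.blocks:
    print(block.name, block.type)
    for da in block.data_arrays:
        # each data array stores a recorded signal together with its shape and unit
        print("\t", da.name, da.shape, da.unit)

nix_file.close()
```

For a higher-level view of the recordings, ``rlxnix`` provides a more convenient interface around the same files.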