Tutorials and exercises accompanying the lectures of ANDA-NI 2024.

ANDA-NI Tutorials 2024

Udo Ernst: Spectral Analysis

Instructions for the exercises are provided in the file ANDA2024_Training_Spectral.pdf.

Exercises for individual methods are contained in the test-* files. The second part of the exercise is contained in the notebook ANDA2024_Spectral_DataAnalysis.ipynb.
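As a rough orientation for the kind of estimate covered in these exercises, the sketch below computes a Welch power spectral density on a synthetic signal. It assumes numpy and scipy are available in your environment; the signal and its parameters are invented for illustration and do not come from the exercise material.

```python
# Illustrative Welch power-spectral-density estimate on a synthetic signal
# (not part of the exercise material).
import numpy as np
from scipy.signal import welch

fs = 1000.0                           # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)          # 10 s of synthetic data
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz rhythm + noise

# Welch's method: average periodograms over overlapping segments
freqs, psd = welch(signal, fs=fs, nperseg=1024)
print(f"Spectral peak at ~{freqs[np.argmax(psd)]:.1f} Hz")
```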

To get started, create a Python environment using the environment.yml file provided.

Andrea Brovelli: Neuronal Interactions

The folder tutorials/notebooks contains a series of 5 exercises based on the Frites software package, replicated from https://github.com/brainets/CookingFrites. Part 0 explains the usage of xarray, a Python package used to represent data in Frites. Tutorials 1-4 then cover the various stages of a Frites workflow based on an example sEEG dataset. Information on the stages of the workflow and on the details of the dataset is located in tutorials/slides.html. Each tutorial goes through a sequence of processing steps on the example dataset and ends with a practical exercise to explore the dataset further.
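Since Part 0 introduces xarray, the following minimal sketch shows how trial-based data can be represented as a labelled array and sliced by name. The dimension names, shapes, and values are invented for illustration and are not the tutorial's sEEG dataset.

```python
# Minimal xarray sketch: a labelled (trials x channels x times) array, the kind of
# container Frites works with (names and shapes are illustrative placeholders).
import numpy as np
import xarray as xr

n_trials, n_channels, n_times = 20, 4, 500
times = np.linspace(-0.5, 1.5, n_times)          # seconds relative to stimulus onset
data = np.random.randn(n_trials, n_channels, n_times)

da = xr.DataArray(
    data,
    dims=("trials", "channels", "times"),
    coords={"trials": np.arange(n_trials),
            "channels": [f"ch{i}" for i in range(n_channels)],
            "times": times},
)

# Label-based selection and reduction, e.g. the trial-averaged post-onset response
# of one channel.
post = da.sel(channels="ch0", times=slice(0, 1.0)).mean(dim="trials")
print(post.shape)
```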

Another set of tutorials can be found on the EBRAINS Collaboratory (requires free registration, allows online execution): https://lab.jsc.ebrains.eu/hub/login?next=%2Fhub%2Fapi%2Foauth2%2Fauthorize%3Fclient_id%3Djupyterhub-user-brovelli%26redirect_uri%3D%252Fuser%252Fbrovelli%252Foauth_callback%26response_type%3Dcode%26state%3DeyJ1dWlkIjogIjhjNWMxM2I2ZjgxOTRjOGU5ODhjMjYzYmEwNGE5NWZmIiwgIm5leHRfdXJsIjogIi91c2VyL2Jyb3ZlbGxpL2xhYi90cmVlL3NoYXJlZC9FQlJBSU5TJTIwQWNhZGVteSUyMFdvcmtzaG9wJTIwU2VyaWVzJTIwU2Vzc2lvbiUyMDElMjAlRTIlODAlOTMlMjBpbnRyYWNyYW5pYWwlMjBFRUcifQ

Byron Yu: Dimensionality Reduction

The notebook tutorials/Exercise_PCA.ipynb contains the primary exercise, centered around implementing a simple principal component analysis (PCA). In tutorials/Tutorial_GPFA.ipynb you will find a tutorial guiding you through the application of the GPFA implementation in Elephant to an example dataset. A video walkthrough of this tutorial is available (see link in the notebook).
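For orientation, the core steps of a simple PCA of the kind the exercise asks you to implement can be sketched as follows; the data and variable names are synthetic placeholders, not those used in the notebook.

```python
# Minimal PCA sketch via eigendecomposition of the covariance matrix
# (illustrative only; the notebook defines the actual exercise).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))        # 200 samples x 10 "neurons" (synthetic)

Xc = X - X.mean(axis=0)                   # 1. center the data
cov = np.cov(Xc, rowvar=False)            # 2. sample covariance (10 x 10)
eigvals, eigvecs = np.linalg.eigh(cov)    # 3. eigendecomposition (ascending order)
order = np.argsort(eigvals)[::-1]         # 4. sort components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 3
scores = Xc @ eigvecs[:, :k]              # 5. project onto the first k components
explained = eigvals[:k].sum() / eigvals.sum()
print(f"First {k} PCs explain {explained:.1%} of the variance")
```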

The notebook tutorials/Exercise_PCA_to_FA.ipynb contains a mathematically more advanced exercise covering both PCA and the transition to Factor Analysis (FA).
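If you want to sanity-check your own derivation against a reference implementation, scikit-learn (assumed to be installed; it is not necessarily part of the provided environment) offers both models. The sketch below only contrasts their noise assumptions and is not a substitute for the exercise.

```python
# Sketch: fitting PCA and Factor Analysis on the same synthetic data.
# PCA assumes isotropic noise; FA fits a separate noise variance per dimension.
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(1)
latent = rng.standard_normal((300, 2))            # 2 latent factors
loading = rng.standard_normal((2, 8))
noise_sd = np.array([0.1, 0.2, 0.5, 1.0, 0.1, 0.3, 0.7, 0.2])
X = latent @ loading + rng.standard_normal((300, 8)) * noise_sd  # heteroscedastic noise

pca = PCA(n_components=2).fit(X)
fa = FactorAnalysis(n_components=2).fit(X)

print("PCA explained variance ratio:", pca.explained_variance_ratio_)
print("FA estimated noise variances:", fa.noise_variance_.round(2))
```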

To get started, create a Python environment using the environment.yml file provided.

In addition, Byron Yu's graphical MATLAB-based tool DataHigh and accompanying tutorials are available at https://users.ece.cmu.edu/~byronyu/software/DataHigh/datahigh.html and https://github.com/BenjoCowley/DataHigh.

Martin Nawrot: Trial-by-trial Variability

The notebook tutorials/trial_by_trial_variability.ipynb contains an exercise centered around time-resolved Fano factors. The first lines of the exercise contain code that helps you load the datasets used.
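As a reminder, the Fano factor is the variance of the spike count across trials divided by its mean, FF = Var[N] / E[N]; a time-resolved version evaluates this in sliding windows. The sketch below uses synthetic Poisson-like spike times with made-up parameters; the notebook itself provides the code to load the real datasets.

```python
# Minimal sketch of a time-resolved Fano factor from trial-wise spike times
# (synthetic Poisson data; illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_trials, t_max, rate = 50, 2.0, 10.0                      # 2 s trials at ~10 Hz
spike_times = [np.sort(rng.uniform(0, t_max, rng.poisson(rate * t_max)))
               for _ in range(n_trials)]

window, step = 0.4, 0.05                                   # 400 ms window, 50 ms step
starts = np.arange(0, t_max - window + 1e-9, step)

fano = []
for t0 in starts:
    counts = np.array([np.sum((st >= t0) & (st < t0 + window)) for st in spike_times])
    fano.append(counts.var(ddof=1) / counts.mean())        # FF = var / mean across trials
fano = np.array(fano)

print(f"Mean Fano factor: {fano.mean():.2f} (close to 1 for Poisson spiking)")
```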

To get started, create a Python environment using the environment.yml file provided.

Sonja Grün: Higher-order Correlations

The folder tutorials contains a series of 4 exercise notebooks. Exercise 0 is more of a warm-up, while exercises 1-3 cover cross-correlations, Unitary Event Analysis, and higher-order correlations, respectively. The notebooks contain working exercise code and additional tasks that build on the provided code.
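For exercise 1, the central object is the cross-correlogram of two spike trains. The following minimal sketch computes one from synthetic binned trains with an injected delay; all names and parameters are illustrative and unrelated to the exercise data.

```python
# Minimal sketch of a cross-correlogram between two binned spike trains
# (synthetic, correlated trains; illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_bins, bin_size = 2000, 0.005                      # 10 s at 5 ms resolution
common = rng.random(n_bins) < 0.01                  # shared events inject correlation
train_a = ((rng.random(n_bins) < 0.05) | common).astype(float)
train_b = ((rng.random(n_bins) < 0.05) | np.roll(common, 2)).astype(float)  # b lags a by 2 bins

max_lag = 20
lags = np.arange(-max_lag, max_lag + 1)
# Count coincidences of a[t] with b[t + lag]; circular shift is used for simplicity.
cch = np.array([np.sum(train_a * np.roll(train_b, -lag)) for lag in lags])

peak_lag = lags[np.argmax(cch)]
print(f"Cross-correlogram peak at lag {peak_lag} bins ({peak_lag * bin_size * 1e3:.0f} ms)")
```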

To get started, create a Python environment using the environment.yml file provided.