Scheduled service maintenance on November 22


On Friday, November 22, 2024, between 06:00 and 18:00 CET, GIN services will undergo planned maintenance, and extended service interruptions should be expected. We will keep downtime to a minimum wherever possible, but we recommend avoiding critical tasks, large data uploads, and DOI requests during this window.

We apologize for any inconvenience.

Tell us about your digital needs and habits. (Responses) - Form Responses 1.tsv (10.0 KB)

Survey questions (columns in the TSV):
1. Timestamp
2. Indicate your name (+ ORCID iD)
3. Indicate your Twitter handle
4. How much space will you need to save all the data produced during the next year? Please give an estimate in TB.
5. How big is your biggest data file (one unique file)?
6. Are you working for the Human Brain Project?
7. Position
8. Your operating system
9. How do you document your lab work?
10. What is your data backup strategy?
11. Have you ever published data?
12. Have you ever published data; if yes, where?
13. Are you using these software/commands in your work here?
14. Which git-based tools are you already using?
15. Programming language knowledge [R]
16. Programming language knowledge [Python]
17. Programming language knowledge [MATLAB]
18. Programming language knowledge [other]
19. Name of the other programming language(s)
20. How often do you use command-line functions (terminal)?
21. Are some experiments running on an old OS (pre-Windows 7)? If yes, which one, and do you plan to upgrade soon?
22. Please feel welcome to add free-text comments about your data flow, software use/needs, or questions you might have.

Responses:
Response 1: Jiyun Shin (@jshin92), 5/8/2019 16:55:02
  Storage needed next year: 2 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: PhD student. OS: Windows.
  Lab documentation: one or several paper notebooks. Backup: Charité server + hard drive. Published data: no.
  Software: Slack. Git-based tools: none.
  Languages: R: none; Python: "I can read it, but not write it"; MATLAB: "I managed to get things done, somewhat"; other: none.
  Terminal use: only when there is no other solution.
Response 2: Christian Ebner (https://orcid.org/0000-0002-5421-7139), 5/13/2019 12:47:47
  Twitter: n/a. Storage needed next year: 0.2 TB. Biggest single file: 100 MB to 1 TB. HBP: yes. Position: PhD student. OS: Windows.
  Lab documentation: one or several paper notebooks. Backup: network drive space for copies of the raw data; the Charité network drive (Larkum lab space) for ephys data and OMERO for microscopy images. Published data: no.
  Software: Slack, LaTeX, Dropbox. Git-based tools: none.
  Languages: R: none; Python: "I can read it, but not write it"; MATLAB: "I can teach it"; other: "I can teach it" (HOC/NMODL for NEURON, some C++).
  Terminal use: only when there is no other solution. Pre-Windows 7 setups: no.
Response 3: Jaan Aru (@jaaanaru), 5/21/2019 13:04:40
  Storage needed next year: under 1 TB. Biggest single file: under 100 MB. HBP: no. Position: postdoc. OS: Windows.
  Lab documentation: one or several paper notebooks. Backup: external hard drive. Published data: yes (osf.io).
  Software: Slack, LaTeX. Git-based tools: none.
  Languages: R: "I can read it, but not write it"; Python: "I can read it, but not write it"; MATLAB: "I can teach it".
  Terminal use: only when there is no other solution. Pre-Windows 7 setups: no.
Response 4: Albert Gidon, 4/24/2019 16:27:31
  Twitter: n/a. Storage needed next year: 1-2 TB. Biggest single file: 100 MB to 1 TB. HBP: yes. Position: postdoc. OS: Windows.
  Lab documentation: another electronic lab notebook. Backup: network backup. Published data: no.
  Software: Slack, Markdown. Git-based tools: none.
  Languages: R: "I can teach it"; Python: "I can teach it"; MATLAB: "I can teach it"; others: C, C++, C#, Java, JavaScript, VB, Igor, NEURON, SQL, ...
  Terminal use: frequently. Pre-Windows 7 setups: no.
  Comment: "I started using Veusz a year ago (open source, with a Python front end) as a preliminary step towards publishing data. It holds both the data and publication-quality graphs and can retrieve data from multiple database platforms (in my case SQLite)."
Response 5: Keisuke Sehara, 4/24/2019 22:05:31
  Twitter: n/a. Storage needed next year: 1 TB. Biggest single file: 100 MB to 1 TB. HBP: yes. Position: postdoc. OS: Mac OS.
  Lab documentation: one or several paper notebooks. Backup: git, external HD. Published data: no.
  Software: Slack, git, LaTeX, Markdown. Git-based tools: GitHub, GitLab, GIN.
  Languages: R: "I managed to get things done, somewhat"; Python: "I can teach it"; MATLAB: "I managed to get things done, somewhat".
  Terminal use: frequently. Pre-Windows 7 setups: XP and Vista for experiments; no plan of upgrading.
  Comment: "I want everything to be done fast and automatically. I don't want to spend my time describing the data format (and being told that my way does not conform to some specification), or waiting for hours for a file to upload (only to learn the next morning that it failed). To me, describing things is of secondary importance: I want to avoid it, but I feel obliged to do it because I think it is important to some extent; otherwise I would consider it a waste of time, and there are other important issues I must put effort into. In this sense, building an internal server sounds like a very good first step. I would be very happy if pushing large data became easier and faster."
Response 6: Anna Nasr, 4/26/2019 11:01:12
  Twitter: n/a. Storage needed next year: 4 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: undergraduate. OS: Windows.
  Lab documentation: a paper notebook and digital documents. Backup: data is stored on three different hard drives, with a backup once per week. Published data: no.
  Software: Slack, Dropbox. Git-based tools: none.
  Languages: R: none; Python: "I can teach it"; MATLAB: "I managed to get things done, somewhat"; other: "I can teach it" (C++, C#, Java, JavaScript, MySQL, PHP).
  Terminal use: frequently. Pre-Windows 7 setups: no.
Response 7: Hatem Oraby, 7/3/2019 13:11:59
  Twitter: inactive account, n/a. Storage needed next year: 2.5 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: PhD student. OS: Windows.
  Lab documentation: partly a paper notebook, partly stored as part of the animals' config. Backup: OneDrive account + Google Drive backup + an external hard disk that is not actively backed up. Published data: yes (GitHub + the Airtrack website; "both, shamefully, are incomplete work, but it seems good enough for people to start sending us more questions when they are stuck").
  Software: Slack, git, Markdown. Git-based tools: GitHub.
  Languages: R: "I can read it, but not write it"; Python: "I can teach it"; MATLAB: "I managed to get things done, somewhat"; other: "I can teach it" (D, C, Arduino, bash scripting, and a few things here and there).
  Terminal use: frequently. Pre-Windows 7 setups: none.
  Comment: "My first and utmost priority is for the data to be backed up automatically once acquired; I don't want anyone to have to take extra steps to get things backed up, otherwise a mess starts to happen. A few restrictions are imposed on our workflow: we use MATLAB, which we can't ditch in the short run, and our license comes from HU, so to use MATLAB we have to be on the HU VPN. I also try to have remote desktop access for easier debugging; I used TeamViewer but recently started switching to Chrome Remote Desktop. All our training machines use the same OneDrive account, which is great for a couple of reasons: 1. animal data and run configuration are shared among all computers; 2. they share the code base for the experiment, which I update in the OneDrive web view so that it automatically propagates to all machines. One shortcoming of this system is that each Bpod (our behavior training system) has its own configuration (e.g., the calibration profile of the solenoids' reaction time). To get around that while still backing up the files, we use symbolic links: all Bpods use the same configuration folder, but on each computer the folder points to a different backed-up directory (see the symlink sketch after the table). With OneDrive on Windows 10, each computer downloads only the parts it needs, which saves space; unfortunately, a couple of our computers run Windows 7 and have to synchronize and download everything. I think we can update one of them to Windows 10, but the second is difficult to update for the time being. Our data analysis is split between Python and MATLAB. For MATLAB, we reuse a code base from a lab we collaborate with, the Adam Kepecs lab; anything new I write in Python and Matplotlib. I am not a very good user of NumPy, but I like to use pandas to analyze my data. What I would like to have: 1. automated backup for data (which we have for the most part); 2. a way around git on the training machines, since it creates many small files that slow the computers and make simple code uploads more tedious, so I don't upload the git folders, which in turn means there is no easy way for the people who train to switch back to a previous version of the code if I introduce a bug one day and am not available on site; 3. ideally, daily cron jobs that do some post-processing on the acquired data. I'm happy to go into more detail on any of these points."
Response 8: David Kaplan, 4/24/2019 15:52:03
  Twitter: n/a. Storage needed next year: approx. 1 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: postdoc. OS: Windows.
  Lab documentation: paper notebooks and OneNote. Backup: Dropbox, lab server, and external hard drive. Published data: yes ("during my time in the lab I contributed to Julie's Nat. Comm. paper").
  Software: Slack, Dropbox. Git-based tools: none.
  Languages: R: none; Python: "I managed to get things done, somewhat"; MATLAB: "I can teach it"; others: NEURON, very basic C++.
  Terminal use: only when there is no other solution. Pre-Windows 7 setups: no.
  Comment: "I would be curious about ideas you have for organising analysis scripts."
Response 9: Mostafa Nashaat (@MEnashaat), 5/8/2019 11:35:17
  Storage needed next year: 5 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: postdoc. OS: Mac OS.
  Lab documentation: electronic and paper notebooks. Backup: backup hard drive, Google Drive. Published data: yes (J Neurophysiol, eNeuro, Nature Neuroscience, J Neuroscience).
  Software: Slack. Git-based tools: none.
  Languages: two answers, "I managed to get things done, somewhat" and "I can read it, but not write it"; the export does not preserve which languages they refer to.
  Terminal use: only when there is no other solution. Pre-Windows 7 setups: no.
Response 10: Marcel Staab (Marci33789590), 5/18/2019 14:19:19
  Storage needed next year: 4 TB. Biggest single file: under 100 MB. HBP: yes. Position: undergraduate. OS: Windows.
  Lab documentation: one or several paper notebooks. Backup: external hard drives and Google Drive. Published data: no.
  Software: Slack, git, Dropbox. Git-based tools: GitHub.
  Languages: R: none; Python: "I managed to get things done, somewhat"; MATLAB: none; other: none.
  Terminal use: never ("what is that?"). Pre-Windows 7 setups: none.
  Comment: "I just started to learn Python for analysis."
Response 11: Moritz Drueke, 8/7/2019 23:13:32
  Twitter: n/a. Storage needed next year: less than 1 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: undergraduate. OS: Mac OS.
  Lab documentation: one or several paper notebooks. Backup: private drives. Published data: yes (small, bioRxiv).
  Software: Slack, LaTeX. Git-based tools: none.
  Languages: R: "I can read it, but not write it"; Python: "I managed to get things done, somewhat"; MATLAB: "I managed to get things done, somewhat"; other: "I managed to get things done, somewhat" (Java).
  Terminal use: only when there is no other solution. Pre-Windows 7 setups: no.
  Comment: "My setup's PCs are badly connected to the lab servers, which makes it hard to back up data properly."
Response 12: Naoya Takahashi (https://orcid.org/0000-0002-6008-4627, @na0ya_takahashi), 4/24/2019 16:25:35
  Storage needed next year: 2 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: postdoc. OS: Mac OS.
  Lab documentation: another electronic lab notebook. Backup: backed up to external HDDs using TrueImage (Windows) or Time Machine (Mac). Published data: no.
  Software: Slack, Dropbox. Git-based tools: none.
  Languages: R: none; Python: none; MATLAB: "I can teach it"; other: "I managed to get things done, somewhat" (Igor, Visual Basic, C++).
  Terminal use: only when there is no other solution. Old OS: "My ephys PC is running on Windows 7. I don't have a plan to upgrade it."
Response 13: Christina Bocklisch, 5/8/2019 12:51:40
  Twitter: n/a. Storage needed next year: 0.5 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: postdoc. OS: Windows.
  Lab documentation: one or several paper notebooks. Backup: Charité server. Published data: yes (Science, J Neurosci, Front Cell Neurosci, PLoS One).
  Software: Slack, Dropbox. Git-based tools: none.
  Languages: R: none; Python: none; MATLAB: none; other: none.
Response 14: Malinda Tantirigama (https://orcid.org/0000-0003-0791-9389, MalindaLST), 5/20/2019 10:30:34
  Storage needed next year: 2 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: postdoc. OS: Windows.
  Lab documentation: one or several paper notebooks. Backup: backup to external HDD. Published data: no.
  Software: Slack, Dropbox. Git-based tools: none.
  Languages: R: "I managed to get things done, somewhat"; Python: none; MATLAB: "I can teach it".
  Terminal use: never ("what is that?"). Pre-Windows 7 setups: no.
Response 15: Robert Sachdev (https://orcid.org/0000-0002-6627-0199, Robert.sachdev), 6/3/2019 19:12:49
  Storage needed next year: 30 TB. Biggest single file: 100 MB to 1 TB. HBP: yes. Position: postdoc. OS: Windows.
  Lab documentation: another electronic lab notebook. Backup: backup hard disks. Published data: no.
  Software: Slack, Dropbox. Git-based tools: GitHub, GitLab.
  Languages: R: none; Python: "I managed to get things done, somewhat"; MATLAB: "I managed to get things done, somewhat"; others: scripts for Spike2, Mathematica.
  Terminal use: only when there is no other solution. Pre-Windows 7 setups: no.
Response 16: Tim, 5/19/2019 15:31:20
  Twitter: n/a. Storage needed next year: 0.5 TB. Biggest single file: 100 MB to 1 TB. HBP: no. Position: postdoc. OS: Windows.
  Lab documentation: another electronic lab notebook. Backup: external hard drive. Published data: yes (J Phys.).
  Software: Slack. Git-based tools: none.
  Languages: one answer, "I can read it, but not write it"; the export does not preserve which language it refers to.
  Terminal use: only when there is no other solution. Old OS: Windows XP, no plans to upgrade.
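
Response 7 describes keeping one fixed Bpod configuration path on every rig while pointing it, via a symbolic link, at a different backed-up directory on each machine. A minimal sketch of that layout in Python follows; the paths, folder names, and the OneDrive location are hypothetical illustrations, not taken from the survey:

```python
import os
from pathlib import Path

# Hypothetical paths: the training code always reads its configuration from
# one fixed location, while the real, rig-specific folder lives inside a
# synced and backed-up directory (e.g. OneDrive).
LINK = Path(r"C:\Bpod\Config")                            # path the code expects
TARGET = Path(r"C:\Users\lab\OneDrive\BpodConfig-rig01")  # rig-specific, backed up

# Make sure the backed-up target exists before linking to it.
TARGET.mkdir(parents=True, exist_ok=True)

# Create the link once per machine. On Windows, creating symlinks requires
# administrator rights or Developer Mode; the equivalent shell command is
# `mklink /D C:\Bpod\Config C:\Users\lab\OneDrive\BpodConfig-rig01`.
if not LINK.exists():
    LINK.parent.mkdir(parents=True, exist_ok=True)
    os.symlink(TARGET, LINK, target_is_directory=True)
```

The experiment code never changes: it always opens C:\Bpod\Config, and each rig resolves that path to its own calibration profile while the sync client backs up the real folder.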
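For working with the raw file rather than the rendered view above, here is a minimal loading sketch with pandas. It assumes only that the file keeps the name shown above and the tab-separated layout with the question header row; note that the column strings must match the raw header verbatim (the question list above edits them lightly for readability):

```python
import pandas as pd

# Load the tab-separated Google Forms export (filename as shown above).
df = pd.read_csv(
    "Tell us about your digital needs and habits. (Responses) - Form Responses 1.tsv",
    sep="\t",
)

# Verbatim header string from the raw file for the storage-estimate question.
storage = ("How much space will you need to save all the data produced "
           "during the next year. Please give an estimation in TB.")

# Quick tallies of a few closed questions.
print(df["Your operating system"].value_counts())
print(df["position"].value_counts())
print(df[storage].head())
```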