
some text fixes

asobolev 3 years ago
commit cb659deba0
1 file changed with 19 additions and 41 deletions

+ 19 - 41
Experiment.ipynb

@@ -61,9 +61,9 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Load default session settings\n",
+    "## Load default experiment settings\n",
     "\n",
-    "For every particular experimental cofiguration one can copy the 'defaults.json' file as a specific experimental preset and load it here instead of 'defaults.json'."
+    "For every experimental cofiguration you can copy the 'defaults.json' file as a specific experimental preset and load it here instead of 'defaults.json'."
    ]
   },
   {
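
For illustration, a minimal sketch of the loading pattern described above (the preset file name 'my_preset.json' is hypothetical; `cfg` is the settings dictionary the notebook uses):

    import json

    # copy defaults.json, adjust the copy, and load it as your preset;
    # 'my_preset.json' is a made-up name for illustration
    with open('my_preset.json') as f:       # or 'defaults.json' for the defaults
        cfg = json.load(f)

    print(cfg['arena_radius'])              # settings are then available as a dict
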
@@ -94,7 +94,7 @@
       "    \"arena_y\": 300,\n",
       "    \"arena_radius\": 300,\n",
       "    \"distractor_island\": 0,\n",
-      "    \"experiment_date\": \"2020-06-11_23-10-58\"\n",
+      "    \"experiment_date\": \"2020-06-12_09-25-16\"\n",
       "}\n"
      ]
     }
@@ -111,7 +111,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Open a starting screen\n",
+    "## Configure experiment in a starting screen\n",
     "\n",
     "The next cell will open a window that can be used to configure the experiment by entering the desired number of trials per session (usually limited by the number of rewards the feeder provides without refil), the session and trial durations, the size (here: 134 px in diameter = 17.87 cm) and the XY-position of the starting platform, the size of the target area, and the duration the animal has to spend in the target to recieve a reward. In addition, you can give your experiment a purposeful ID (e.g. subject ID, experiment type and date), provide information about the contrast between arena and subject, and define the duration the animal has to spend in the starting area to initialize a new trial. \n",
     "\n",
@@ -128,32 +128,7 @@
    "metadata": {
     "scrolled": true
    },
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "EXPERIMENT ID: 003901_aSIT_2020-06-11_23-10-58\n",
-      "Trials per session: 50\n",
-      "Session duration [s]: 3600\n",
-      "Trial duration [s]: 60\n",
-      "Radius of the starting platform [pixels]: 67\n",
-      "X-position of the starting platform [pixels]: 195\n",
-      "Y-position of the starting platform [pixels]: 195\n",
-      "Radius of the target platform [pixels]: 80\n",
-      "Target duration [s]: 5\n",
-      "Subject: 003901\n",
-      "Experiment type: aSIT\n",
-      "Subject is darker than background [T = True; F = False]: T\n",
-      "Initialisation Duration [s]: 0.2\n",
-      "Arena X coordinate [pixels]: 400\n",
-      "Arena Y coordinate [pixels]: 300\n",
-      "Arena radius [pixels]: 300\n",
-      "Enable distractor island [0/1]: 1\n",
-      "Experiment date: 2020-06-11_23-10-58\n"
-     ]
-    }
-   ],
+   "outputs": [],
    "source": [
     "def show_entry_fields():\n",
     "    print(\"EXPERIMENT ID: %s\" % experiment_id)\n",
@@ -228,7 +203,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "# This session will be written to this folder\n",
+    "# This session's protocols will be saved to this folder\n",
     "save_to = os.path.join('sessions', experiment_id)\n",
     "             \n",
     "if not os.path.exists(save_to):\n",
@@ -247,7 +222,9 @@
     "\n",
     "The next cell Initializes a microcontroller for subsequent hardware control. This is, where you will probably have to get creative yourself, depending on what you would like to do. Here, we use an Arduino Nano. With the channel definitions below, we can later provide differently colored illumination during the experiment (for example to stimulate with colors rather than sound) and trigger two different feeders. \n",
     "\n",
-    "For the example setup, two automatic fish feeders with 27 feeding slots each were \"hacked\", so that they can be controlled *via* two additional Arduinos with motor shields. These additional Arduinos drive the feeder motors each time they get a trigger signal from the main Arduino. The two feeders allow the provision of 54 rewards per session. The two feeders were installed at different positions above the arena and are activated alternately, to lower the predictability of where in the arena the reward will drop. The starting feeder is chosen randomly for each new session."
+    "For the example setup, two automatic fish feeders with 27 feeding slots each were \"hacked\", so that they can be controlled *via* two additional Arduinos with motor shields. These additional Arduinos drive the feeder motors each time they get a trigger signal from the main Arduino. The two feeders allow the provision of 54 rewards per session. The two feeders were installed at different positions above the arena and are activated alternately, to lower the predictability of where in the arena the reward will drop. The starting feeder is chosen randomly for each new session.\n",
+    "\n",
+    "If you DON'T have a real arduino connected, you can just still run this experiment with the Fake Feeder. This option is set by default in the next cell. The Fake Feeder will just print the text message here in this notebook when feeding."
    ]
   },
   {
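
As a rough sketch of what the Fake Feeder fallback described above can look like (the class and method names here are illustrative, not necessarily those used in the cell below):

    # no-hardware fallback: prints a message instead of triggering the Arduino
    class FakeFeeder:
        def __init__(self, port=None):
            self.port = port                    # kept only for interface compatibility

        def feed(self, feeder_id):
            print("Fake feeder %d: reward delivered" % feeder_id)

    feeder = FakeFeeder()
    feeder.feed(1)                              # prints instead of driving a motor
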
@@ -266,6 +243,7 @@
     "# Feeder changes every trial, start feeder randomized\n",
     "feederID = random.choice([1, 2])\n",
     "\n",
+    "\n",
     "class RealFeeder(Arduino):\n",
     "    def __init__(self, *args, **kwargs):\n",
     "        super(RealFeeder, self).__init__(*args, **kwargs)\n",
@@ -346,7 +324,7 @@
    "source": [
     "## Protocol 2\n",
     "\n",
-    "The following cell generates a video object to which the later video feed will be saved. The colours that are defined will later be used for labeling. The labelled video file (\"ExperimentID_video.avi\") will be saved to the folder containing this notebook for documentation purposes. "
+    "The following cell generates a video object to which the later video feed will be saved. The colours that are defined will later be used for labeling. The labelled video file (\"ExperimentID_video.avi\") will be saved to the session folder for documentation purposes. "
    ]
   },
   {
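
A minimal sketch of such a video object with OpenCV; the codec, frame rate and frame size are assumptions, while `save_to` and `experiment_id` come from the cells above:

    import os
    import cv2

    video_path = os.path.join(save_to, experiment_id + '_video.avi')
    fourcc = cv2.VideoWriter_fourcc(*'XVID')             # codec is an assumption
    video_out = cv2.VideoWriter(video_path, fourcc, 30.0, (800, 600))

    # inside the experiment loop:   video_out.write(labelled_frame)
    # after the session:            video_out.release()
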
@@ -381,7 +359,7 @@
     "\n",
     "The tracking algorithm used in this notebook compares the frames of the video feed during the experiment with an image of the empty arena to later track the position of the largest object in the arena (which usually is your animal). If you are confident in the stability of your video quality, it should suffice to capture the picture once and to skip this cell in the subsequent experiments. However, since this step only takes a few seconds, we recommend to take a new picture of the arena for each new experiment. In the preview of the video feed that will pop-up if you run the next cell, the space outside the arena is masked, so that the camera preview can also be used to check if the camera/arena are still positioned correctly. \n",
     "\n",
-    "Before taking the picture, make sure that the conditions in your lab (especially the illumination) are the exact same as they will be during the experiments. Once you are happy with the preview of your background image, press \"c\" to capture the image. It will be saved as \"Background.png\" to the folder containing this notebook.\n",
+    "Before taking the picture, make sure that the conditions in your lab (especially the illumination) are the exact same as they will be during the experiments. Once you are happy with the preview of your background image, press \"c\" to capture the image. It will be saved as \"background.png\" to the session folder containing this notebook.\n",
     "\n",
     "This notebook will use the main camera of your system as an input device. If you have more than one camera installed (e.g. on a notebook with internal chat camera), make sure to deactivate all cameras other than the camera of your setup  prior to running the notebook. Also make sure that the video dimensions defined here match you arena dimensions defined above and the video dimensions of the video feeds that will be defined in the subsequent cells."
    ]
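
A simplified sketch of the capture-on-"c" idea described above (no arena mask, default camera resolution); the cell below implements the full version:

    import os
    import cv2

    cap = cv2.VideoCapture(0)                   # main camera of the system
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        cv2.imshow('Background preview', frame)
        if cv2.waitKey(1) & 0xFF == ord('c'):   # press "c" to capture
            cv2.imwrite(os.path.join(save_to, 'background.png'), frame)
            break
    cap.release()
    cv2.destroyAllWindows()
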
@@ -470,7 +448,7 @@
    "source": [
     "## Initialize the camera\n",
     "\n",
-    "This cell initializes the camera for the actual tracking and defines some counters and dummy variables needed during the experiment."
+    "This cell initializes the camera for the actual tracking and defines some counters and initial variables needed during the experiment."
    ]
   },
   {
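
Roughly, the camera initialisation and the kind of counters meant here look as follows; the frame size and variable names are assumptions, not the notebook's own:

    import cv2

    cap = cv2.VideoCapture(0)                       # camera used for tracking
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 800)          # frame size is an assumption
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 600)

    frame_counter = 0        # processed video frames
    trial_counter = 0        # started trials
    rewards_given = 0        # delivered rewards
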
@@ -508,18 +486,18 @@
    "source": [
     "## Protocol 3\n",
     "\n",
-    "The following cell generates an Excel-file to which the essential data (i.e. animal position, positions of the target areas, etc.) from each cycle (video frame) of the experiment will be saved. The Excel-file (\"ExperimentID_protocol.xlsx\") will be saved to the folder containing this notebook. "
+    "The following cell generates a CSV-file to which the essential data (i.e. animal position, positions of the target areas, etc.) from each cycle (video frame) of the experiment will be saved. The CSV-file (\"ExperimentID_protocol.csv\") will be saved to the session folder inside the folder containing this notebook. "
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 11,
+   "execution_count": 14,
    "metadata": {},
    "outputs": [],
    "source": [
     "def log_frame_data(args):\n",
     "    with open(os.path.join(save_to, experiment_id + '_protocol.csv'), 'a') as f:\n",
-    "        f.write(\",\".join([repr(x) for x in args]) + \"\\n\")\n",
+    "        f.write(\",\".join([str(x) for x in args]) + \"\\n\")\n",
     "\n",
     "headers = [\n",
     "    'FrameID',            # Frame ID\n",
@@ -536,7 +514,9 @@
     "    'TrialID',            # Trial ID\n",
     "    'Rewarded Trials [%]',# Percentage of trials rewarded\n",
     "    'Sound Played'        # sound played\n",
-    "]"
+    "]\n",
+    "\n",
+    "log_frame_data(headers)   # saves headers to the log file"
    ]
   },
   {
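
Since the header row is now written by log_frame_data(headers), the resulting protocol file can later be read back directly; a small sketch (pandas is an assumption here, any CSV reader works):

    import os
    import pandas as pd

    protocol_path = os.path.join(save_to, experiment_id + '_protocol.csv')
    protocol = pd.read_csv(protocol_path)       # first row holds the column headers
    print(protocol.head())
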
@@ -682,8 +662,6 @@
     "            subjectHullCentroidArenaStart = cv2.circle(subjectHullCentroidArenaStart,(distractorX,distractorY), cfg['target_radius'], (255,0,0), 2)\n",
     "    else:\n",
     "        subjectHullCentroidArenaStart = cv2.circle(subjectHullCentroidArena,(cfg['start_x'], cfg['start_x']), cfg['start_radius'], (255,0,255), 2)\n",
-    "\n",
-    "        \n",
     "        \n",
     "    # Adds a stopwatch for the experiment duration to the video\n",
     "    subjectHullCentroidArenaStartInText=cv2.putText(subjectHullCentroidArenaStartIn,\n",