*****************
Typical use cases
*****************

Recording multiple trials from multiple channels
================================================

In this example we suppose that we have recorded from an 8-channel probe, and
that we have recorded three trials/episodes. We therefore have a total of
8 x 3 = 24 signals, grouped into three :class:`AnalogSignal` objects, one per trial.

Our entire dataset is contained in a :class:`Block`, which in turn contains:

* 3 :class:`Segment` objects, each representing data from a single trial,
* 1 :class:`Group`.

.. image:: images/multi_segment_diagram.png
   :width: 75%
   :align: center
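
The examples below assume that such a block already exists, for example loaded
from a file with one of Neo's IO classes. As a minimal sketch of how it might be
assembled by hand, using random data and made-up names (``"trial 0"``, the
``channel_names`` annotation), one could write:

.. doctest::

    import numpy as np
    import quantities as pq
    from neo import Block, Segment, Group, AnalogSignal

    block = Block(name="8-channel probe recording")

    # one Group spanning all channels, annotated with (hypothetical) channel names
    group = Group(name="all channels")
    group.annotate(channel_names=["ch%d" % i for i in range(8)])
    block.groups.append(group)

    for i in range(3):
        # one Segment per trial, holding a single 8-channel AnalogSignal
        seg = Segment(name="trial %d" % i, index=i)
        sig = AnalogSignal(np.random.randn(1000, 8),
                           units="mV", sampling_rate=1 * pq.kHz)
        seg.analogsignals.append(sig)
        group.add(sig)   # the Group collects the same channels across trials
        block.segments.append(seg)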

:class:`Segment` and :class:`Group` objects provide two different
ways to access the data, corresponding respectively, in this scenario, to access
by **time** and by **space**.

.. note:: Segments do not always represent trials; they can be used for many
          purposes: segments could represent parallel recordings for different
          subjects, or different steps in a current clamp protocol.

**Temporal (by segment)**

In this case you want to go through your data in order, perhaps because you want
to correlate the neural response with the stimulus that was delivered in each segment.
In this example, we're averaging over the channels.

.. doctest::

    import numpy as np
    from matplotlib import pyplot as plt

    for seg in block.segments:
        print("Analyzing segment %d" % seg.index)

        avg = np.mean(seg.analogsignals[0], axis=1)

        plt.figure()
        plt.plot(avg)
        plt.title("Peak response in segment %d: %f" % (seg.index, avg.max()))

**Spatial (by channel)**

In this case you want to go through your data by channel location and average over trials.
Perhaps you want to see which physical location produces the strongest response, and every stimulus was the same:

.. doctest::

    # We assume that our block has only 1 Group
    group = block.groups[0]

    avg = np.mean(group.analogsignals, axis=0)

    plt.figure()
    for index, name in enumerate(group.annotations["channel_names"]):
        plt.plot(avg[:, index])
        plt.title("Average response on channel %d: %s" % (index, name))

**Mixed example**

Combining simultaneously the two approaches of descending the hierarchy
temporally and spatially can be tricky. Here's an example.
Let's say you saw something interesting on the 6th channel (index 5) on even-numbered trials
during the experiment and you want to follow up. What was the average response?

.. doctest::

    index = 5
    avg = np.mean([seg.analogsignals[0][:, index] for seg in block.segments[::2]], axis=0)
    plt.plot(avg)

Recording spikes from multiple tetrodes
=======================================

Here is a similar example in which we have recorded with two tetrodes and
extracted spikes from the extra-cellular signals. The spike times are contained
in :class:`SpikeTrain` objects. Our :class:`Block` contains:

* 3 :class:`Segment` objects (one per trial),
* 7 :class:`Group` objects (one per neuron), each of which contains:

  * 3 :class:`SpikeTrain` objects,
  * an annotation showing which tetrode the spiketrains were recorded from.

In total we have 3 x 7 = 21 :class:`SpikeTrain` objects in this :class:`Block`.

.. image:: images/multi_segment_diagram_spiketrain.png
   :width: 75%
   :align: center

.. note:: In this scenario we have discarded the original signals, perhaps to save
          space; we therefore use annotations to link the spiketrains to the tetrode
          they were recorded from. If we wished to include the original
          extracellular signals, we would add a reference to the three :class:`AnalogSignal`
          objects for the appropriate tetrode to the :class:`Group` for each neuron.
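
The examples below assume a block-level ``tetrode_ids`` annotation listing the
tetrodes, and a ``tetrode_id`` annotation on each :class:`Group`. As a rough
sketch of how such a block might be built (with random spike times and made-up
names such as ``"unit 0"``), one could write:

.. doctest::

    import numpy as np
    import quantities as pq
    from neo import Block, Segment, Group, SpikeTrain

    block = Block(name="tetrode recording")
    block.annotate(tetrode_ids=[1, 2])

    for trial in range(3):
        block.segments.append(Segment(name="trial %d" % trial, index=trial))

    for unit_id in range(7):
        unit = Group(name="unit %d" % unit_id)
        unit.annotate(tetrode_id=1 if unit_id < 4 else 2)  # arbitrary assignment
        for seg in block.segments:
            # fake spike times, sorted and bounded by t_start/t_stop
            times = np.sort(np.random.uniform(0, 10, size=20)) * pq.s
            st = SpikeTrain(times, t_start=0 * pq.s, t_stop=10 * pq.s)
            seg.spiketrains.append(st)
            unit.add(st)
        block.groups.append(unit)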

There are three ways to access the :class:`SpikeTrain` data:

* by trial (:class:`Segment`)
* by neuron (:class:`Group`)
* by tetrode

**By trial**

In this example, each :class:`Segment` represents data from one trial, and we
want a PSTH for each trial from all units combined:

.. doctest::

    plt.figure()
    for seg in block.segments:
        print(f"Analyzing segment {seg.index}")

        stlist = [st - st.t_start for st in seg.spiketrains]
        plt.subplot(len(block.segments), 1, seg.index + 1)
        count, bins = np.histogram(stlist)
        plt.bar(bins[:-1], count, width=bins[1] - bins[0])
        plt.title(f"PSTH in segment {seg.index}")

    plt.show()

**By neuron**

Now we can calculate the PSTH averaged over trials for each unit, using the
:attr:`block.groups` property:

.. doctest::

    plt.figure()
    for i, group in enumerate(block.groups):
        stlist = [st - st.t_start for st in group.spiketrains]
        plt.subplot(len(block.groups), 1, i + 1)
        count, bins = np.histogram(stlist)
        plt.bar(bins[:-1], count, width=bins[1] - bins[0])
        plt.title(f"PSTH of unit {group.name}")

    plt.show()

**By tetrode**

Here we calculate a PSTH averaged over trials by channel location,
blending all units:

.. doctest::

    plt.figure()
    for i, tetrode_id in enumerate(block.annotations["tetrode_ids"]):
        stlist = []
        for unit in block.filter(objects=Group, tetrode_id=tetrode_id):
            stlist.extend([st - st.t_start for st in unit.spiketrains])
        plt.subplot(2, 1, i + 1)
        count, bins = np.histogram(stlist)
        plt.bar(bins[:-1], count, width=bins[1] - bins[0])
        plt.title(f"PSTH blend of tetrode {tetrode_id}")

    plt.show()

Spike sorting
=============

Spike sorting is the process of detecting and classifying high-frequency
deflections ("spikes") on a group of physically nearby recording channels.

For example, let's say you have recordings from a tetrode
containing 4 separate channels. Here is an example showing (with fake data)
how you could iterate over the contained signals and extract spike times.
(Of course in reality you would use a more sophisticated algorithm.)

.. doctest::

    import numpy as np
    from quantities import Hz
    from neo import Segment, AnalogSignal, SpikeTrain

    # generate some fake data
    seg = Segment()
    seg.analogsignals.append(
        AnalogSignal([[0.1, 0.1, 0.1, 0.1],
                      [-2.0, -2.0, -2.0, -2.0],
                      [0.1, 0.1, 0.1, 0.1],
                      [-0.1, -0.1, -0.1, -0.1],
                      [-0.1, -0.1, -0.1, -0.1],
                      [-3.0, -3.0, -3.0, -3.0],
                      [0.1, 0.1, 0.1, 0.1],
                      [0.1, 0.1, 0.1, 0.1]],
                     sampling_rate=1000*Hz, units='V'))

    # extract spike trains from all channels
    st_list = []
    for signal in seg.analogsignals:
        # use a simple threshold detector: a spike is any sample where the
        # minimum across the four channels drops below -1.0 V
        spike_mask = np.where(np.min(signal.magnitude, axis=1) < -1.0)[0]

        # create a spike train from the threshold-crossing times
        spike_times = signal.times[spike_mask]
        st = SpikeTrain(spike_times, t_start=signal.t_start, t_stop=signal.t_stop)

        # remember the spike waveforms (one sample either side of each spike)
        wf_list = []
        for spike_idx in spike_mask:
            wf_list.append(signal[spike_idx-1:spike_idx+2, :])
        st.waveforms = np.array(wf_list)

        st_list.append(st)

At this point, we have a list of :class:`SpikeTrain` objects. We could simply create
a single :class:`Group` object, assign all the spiketrains to it, and then also assign the
:class:`AnalogSignal` on which we detected them:

.. doctest::

    from neo import Group

    unit = Group()
    unit.spiketrains = st_list
    unit.analogsignals.extend(seg.analogsignals)

Further processing could assign each of the detected spikes to an independent
source, a putative single neuron. (This processing is outside the scope of
Neo. There are many open-source toolboxes to do it, for instance our sister
project OpenElectrophy.)

In that case we would create a separate :class:`Group` for each cluster, assign its
spiketrains to it, and still store in each group a reference to the original
recording.
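
As a rough sketch, assuming a hypothetical ``sorted_spiketrains`` dict that maps
each cluster label to a :class:`SpikeTrain` (produced by whatever sorting tool
you use; the name is not part of Neo), this could look like:

.. doctest::

    from neo import Block, Group

    block = Block(name="sorted tetrode recording")
    block.segments.append(seg)                  # keep the raw recording

    for label, st in sorted_spiketrains.items():
        cluster = Group(name="cluster %s" % label)
        cluster.add(st)                         # the putative neuron's spikes
        cluster.add(*seg.analogsignals)         # reference to the original signals
        block.groups.append(cluster)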

.. EEG

.. Network simulations