
example2.m 3.1 KB

%% Example 2: Inspect the model fits obtained by analyzePRF

%% Download dataset (if necessary) and add analyzePRF to the MATLAB path
setup;

%% Load and analyze the data (this follows the example1.m script)

% Start a parallel pool if one is not already open [CK]
if ~exist('mypool','var')
  mypool = parpool();
end

% Load in the data
load('exampledataset.mat');

% Upsample the data to match the stimulus rate
data = tseriesinterp(data,2,1,2);

% Analyze the data
results = analyzePRF(stimulus,data,1,struct('seedmode',[0 1],'display','off'));

%% Perform some setup

% Define some variables
res = [100 100];                    % row x column resolution of the stimuli
resmx = 100;                        % maximum resolution (along any dimension)
hrf = results.options.hrf;          % HRF that was used in the model
degs = results.options.maxpolydeg;  % vector of maximum polynomial degrees used in the model

% Pre-compute cache for faster execution
[d,xx,yy] = makegaussian2d(resmx,2,2,2,2);

% Prepare the stimuli for use in the model
stimulusPP = {};
for p=1:length(stimulus)
  stimulusPP{p} = squish(stimulus{p},2)';  % flatten each image so that the dimensionality is frames x pixels
  stimulusPP{p} = [stimulusPP{p} p*ones(size(stimulusPP{p},1),1)];  % add a dummy column to indicate run breaks
end

% Define the model function. This function takes parameters and stimuli as input and
% returns a predicted time-series as output. Specifically, the variable <pp> is a vector
% of parameter values (1 x 5) and the variable <dd> is a matrix with the stimuli (frames x pixels).
% Although it looks complex, what the function does is pretty straightforward: construct a
% 2D Gaussian, crop it to <res>, compute the dot product between the stimuli and the
% Gaussian, raise the result to an exponent, and then convolve the result with the HRF,
% taking care not to bleed over run boundaries.
modelfun = @(pp,dd) conv2run(posrect(pp(4)) * (dd*[vflatten(placematrix(zeros(res),makegaussian2d(resmx,pp(1),pp(2),abs(pp(3)),abs(pp(3)),xx,yy,0,0) / (2*pi*abs(pp(3))^2))); 0]) .^ posrect(pp(5)),hrf,dd(:,prod(res)+1));
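% To make the one-liner above easier to follow, here is an equivalent,
% step-by-step version applied to the first run. The <pp_demo> parameter
% values and all <*_demo> variable names are purely illustrative (they are
% not produced by the toolbox); the helper functions are the same ones
% used inside <modelfun>.
pp_demo = [50 50 10 1 0.5];  % hypothetical [row center, column center, sigma, gain, exponent]
gau_demo = makegaussian2d(resmx,pp_demo(1),pp_demo(2),abs(pp_demo(3)),abs(pp_demo(3)),xx,yy,0,0) / (2*pi*abs(pp_demo(3))^2);
gau_demo = placematrix(zeros(res),gau_demo);                        % crop/embed the Gaussian into <res>
resp_demo = stimulusPP{1}(:,1:prod(res)) * vflatten(gau_demo);      % dot product of each frame with the Gaussian
resp_demo = posrect(pp_demo(4)) * resp_demo .^ posrect(pp_demo(5)); % gain and static nonlinearity
pred_demo = conv2run(resp_demo,hrf,stimulusPP{1}(:,prod(res)+1));   % convolve with HRF, respecting run breaks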
% Construct projection matrices that fit and remove the polynomials.
% Note that a separate projection matrix is constructed for each run.
polymatrix = {};
for p=1:length(degs)
  polymatrix{p} = projectionmatrix(constructpolynomialmatrix(size(data{p},2),0:degs(p)));
end
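% As an aside: conceptually, each projection matrix above is a residual-forming
% matrix. For a design matrix X whose columns are polynomial regressors, the
% matrix I - X*pinv(X) removes the best-fitting polynomials from any vector it
% multiplies. A tiny self-contained sketch using only base MATLAB (the <*_demo>
% names are illustrative; constructpolynomialmatrix may scale its basis
% differently, but the projection depends only on the span of X):
n_demo = 10;
X_demo = bsxfun(@power,linspace(-1,1,n_demo)',0:2);  % polynomials of degree 0, 1, 2
P_demo = eye(n_demo) - X_demo*pinv(X_demo);          % projector onto the orthogonal complement
% Applying P_demo to any quadratic trend yields (numerically) zero:
% max(abs(P_demo*X_demo*randn(3,1))) is on the order of machine precision.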
%% Inspect the data and the model fit

% Which voxel should we inspect? Let's inspect the second voxel.
vx = 2;

% For each run, collect the data and the model fit. We project out polynomials
% from both the data and the model fit. This deals with the problem of
% slow trends in the data.
datats = {};
modelts = {};
for p=1:length(data)
  datats{p} = polymatrix{p}*data{p}(vx,:)';
  modelts{p} = polymatrix{p}*modelfun(results.params(1,:,vx),stimulusPP{p});
end

% Visualize the results
figure; hold on;
set(gcf,'Units','points','Position',[100 100 1000 100]);
plot(cat(1,datats{:}),'r-');
plot(cat(1,modelts{:}),'b-');
straightline(300*(1:4)+.5,'v','g-');  % mark run boundaries with green vertical lines
xlabel('Time (s)');
ylabel('BOLD signal');
ax = axis;
axis([.5 1200+.5 ax(3:4)]);
title('Time-series data');