Input data size [72,1020000] = 72 channels, 1020000 frames
Finding 72 ICA components using extended ICA.
Kurtosis will be calculated initially every 1 blocks using 6000 data points.
Decomposing 196 frames per ICA weight ((72)^2 = 5184 weights, 1020000 frames)
Initial learning rate will be 0.001, block size 70.
Learning rate will be multiplied by 0.98 whenever angledelta >= 60 deg.
More than 32 channels: default stopping weight change 1E-7
Training will end when wchange < 1e-07 or after 512 steps.
Online bias adjustment will be used.
Removing mean of each channel ...
Final training data range: -3.58127 to 2.98392
Computing the sphering matrix...
Starting weights are the identity matrix ...
Sphering the data ...
Beginning ICA training ... first training step may be slow ...
Lowering learning rate to 0.0009 and starting again.
Lowering learning rate to 0.00081 and starting again.
Lowering learning rate to 0.000729 and starting again.
Lowering learning rate to 0.0006561 and starting again.
Lowering learning rate to 0.00059049 and starting again.
Lowering learning rate to 0.000531441 and starting again.
Lowering learning rate to 0.000478297 and starting again.
Lowering learning rate to 0.000430467 and starting again.
Lowering learning rate to 0.00038742 and starting again.
Lowering learning rate to 0.000348678 and starting again.
Lowering learning rate to 0.000313811 and starting again.
Lowering learning rate to 0.00028243 and starting again.
Lowering learning rate to 0.000254187 and starting again.
Lowering learning rate to 0.000228768 and starting again.
Lowering learning rate to 0.000205891 and starting again.
Lowering learning rate to 0.000185302 and starting again.
Lowering learning rate to 0.000166772 and starting again.
Lowering learning rate to 0.000150095 and starting again.
Lowering learning rate to 0.000135085 and starting again.
Lowering learning rate to 0.000121577 and starting again.
Lowering learning rate to 0.000109419 and starting again.
Lowering learning rate to 9.84771e-05 and starting again.
Lowering learning rate to 8.86294e-05 and starting again.
Lowering learning rate to 7.97664e-05 and starting again.
Lowering learning rate to 7.17898e-05 and starting again.
Lowering learning rate to 6.46108e-05 and starting again.
Lowering learning rate to 5.81497e-05 and starting again.
Lowering learning rate to 5.23348e-05 and starting again.
Lowering learning rate to 4.71013e-05 and starting again.
Lowering learning rate to 4.23912e-05 and starting again.
Lowering learning rate to 3.8152e-05 and starting again.
Lowering learning rate to 3.43368e-05 and starting again.
Lowering learning rate to 3.09032e-05 and starting again.
Lowering learning rate to 2.78128e-05 and starting again.
Lowering learning rate to 2.50316e-05 and starting again.
Lowering learning rate to 2.25284e-05 and starting again.
Lowering learning rate to 2.02756e-05 and starting again.
Lowering learning rate to 1.8248e-05 and starting again.
step 1 - lrate 0.000018, wchange 14103454346397174.00000000, angledelta 0.0 deg
Lowering learning rate to 1.31386e-05 and starting again.
step 1 - lrate 0.000013, wchange 431079603476.10021973, angledelta 0.0 deg
Lowering learning rate to 9.45977e-06 and starting again.
step 1 - lrate 0.000009, wchange 238728499.73190325, angledelta 0.0 deg
Lowering learning rate to 8.51379e-06 and starting again.
step 1 - lrate 0.000009, wchange 34694992.14831636, angledelta 0.0 deg
step 2 - lrate 0.000009, wchange 1202538861769166.50000000, angledelta 0.0 deg
Lowering learning rate to 6.12993e-06 and starting again.
step 1 - lrate 0.000006, wchange 267865.16117270, angledelta 0.0 deg
step 2 - lrate 0.000006, wchange 72015507379.29844666, angledelta 0.0 deg
step 3 - lrate 0.000005, wchange 1571440464294991.00000000, angledelta 1.5 deg
Lowering learning rate to 3.53084e-06 and starting again.
step 1 - lrate 0.000004, wchange 1295.63577751, angledelta 0.0 deg
step 2 - lrate 0.000004, wchange 1703609.71914725, angledelta 0.0 deg
step 3 - lrate 0.000004, wchange 2285997778.79218006, angledelta 8.2 deg
step 4 - lrate 0.000003, wchange 684233149650.51892090, angledelta 8.2 deg
step 5 - lrate 0.000002, wchange 62489853733454.97656250, angledelta 8.2 deg
step 6 - lrate 0.000002, wchange 2173993585102538.25000000, angledelta 8.2 deg
Lowering learning rate to 1.30161e-06 and starting again.
step 1 - lrate 0.000001, wchange 32.25662348, angledelta 0.0 deg
step 2 - lrate 0.000001, wchange 109.31476784, angledelta 0.0 deg
step 3 - lrate 0.000001, wchange 1554.83988176, angledelta 60.8 deg
step 4 - lrate 0.000001, wchange 20570.23383208, angledelta 0.1 deg
step 5 - lrate 0.000001, wchange 277501.83082415, angledelta 0.1 deg
step 6 - lrate 0.000001, wchange 3743626.49481232, angledelta 0.1 deg
step 7 - lrate 0.000001, wchange 50503232.29269380, angledelta 0.1 deg
step 8 - lrate 0.000001, wchange 681311683.70000589, angledelta 0.1 deg
step 9 - lrate 0.000001, wchange 9191205769.08075142, angledelta 0.1 deg
step 10 - lrate 0.000001, wchange 58214356623.78507233, angledelta 0.1 deg
step 11 - lrate 0.000001, wchange 234928667446.58322144, angledelta 0.1 deg
step 12 - lrate 0.000001, wchange 659292692186.12036133, angledelta 0.1 deg
step 13 - lrate 0.000001, wchange 1381145339471.80273438, angledelta 0.1 deg
step 14 - lrate 0.000000, wchange 2287230374670.84765625, angledelta 0.1 deg
step 15 - lrate 0.000000, wchange 3136012520331.43750000, angledelta 0.1 deg
step 16 - lrate 0.000000, wchange 3695124520795.38183594, angledelta 0.1 deg
step 17 - lrate 0.000000, wchange 3855571831406.29443359, angledelta 0.1 deg
step 18 - lrate 0.000000, wchange 3649453627319.52539062, angledelta 0.1 deg
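The header of the log spells out the learning-rate policy in play: the rate starts at 0.001, is multiplied by 0.98 whenever angledelta reaches 60 deg, training stops when wchange drops below 1e-07 or after 512 steps, and a numerical blow-up triggers a restart from the identity weights at a lower rate (the printed sequence 0.001 -> 0.0009 -> 0.00081 implies a restart factor of about 0.9). Below is a minimal Python sketch of that control loop only; it is not EEGLAB's runica.m, and the step_fn update, the MAX_WEIGHT blow-up threshold, and the exact restart factor are assumptions made for illustration.

import numpy as np

LRATE0       = 1e-3    # "Initial learning rate will be 0.001"
ANNEAL_FAC   = 0.98    # "multiplied by 0.98 whenever angledelta >= 60 deg"
ANNEAL_DEG   = 60.0
RESTART_FAC  = 0.9     # inferred from the restart sequence in the log
STOP_WCHANGE = 1e-7    # "Training will end when wchange < 1e-07"
MAX_STEPS    = 512     # "... or after 512 steps"
MAX_WEIGHT   = 1e8     # hypothetical blow-up threshold for this sketch

def train(data, step_fn):
    """step_fn(weights, data, lrate) -> new weights; a placeholder for the ICA update."""
    nchans = data.shape[0]
    lrate = LRATE0
    while True:                                    # restart loop
        weights = np.eye(nchans)                   # "Starting weights are the identity matrix"
        old_delta, old_change = None, None
        blown_up = False
        for step in range(1, MAX_STEPS + 1):
            new_weights = step_fn(weights, data, lrate)
            delta = (new_weights - weights).ravel()
            wchange = float(delta @ delta)         # squared norm of the weight update
            angledelta = 0.0
            if old_delta is not None:
                cosang = (delta @ old_delta) / np.sqrt(wchange * old_change)
                angledelta = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
                if angledelta >= ANNEAL_DEG:
                    lrate *= ANNEAL_FAC            # anneal on a sharp change of direction
            print(f"step {step} - lrate {lrate:.6f}, wchange {wchange:.8f}, "
                  f"angledelta {angledelta:.1f} deg")
            if not np.isfinite(wchange) or np.abs(new_weights).max() > MAX_WEIGHT:
                blown_up = True                    # weights exploded: restart at a lower rate
                break
            if wchange < STOP_WCHANGE:
                return new_weights                 # converged
            weights, old_delta, old_change = new_weights, delta, wchange
        if not blown_up:
            return weights                         # hit MAX_STEPS without blowing up
        lrate *= RESTART_FAC
        print(f"Lowering learning rate to {lrate:g} and starting again.")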
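As a quick sanity check on the step lines from the last restart, the snippet below computes the step-to-step growth of wchange using the values printed above. The ratio stays above 1 through step 17, so the weight change is growing away from the 1e-07 stopping threshold rather than shrinking toward it.

# wchange values copied from the final restart in the log above
wchange = [
    32.25662348, 109.31476784, 1554.83988176, 20570.23383208,
    277501.83082415, 3743626.49481232, 50503232.29269380,
    681311683.70000589, 9191205769.08075142, 58214356623.78507233,
    234928667446.58322144, 659292692186.12036133, 1381145339471.80273438,
    2287230374670.84765625, 3136012520331.43750000, 3695124520795.38183594,
    3855571831406.29443359, 3649453627319.52539062,
]
for step, (prev, curr) in enumerate(zip(wchange, wchange[1:]), start=2):
    print(f"step {step}: wchange grew by a factor of {curr / prev:.2f}")
# Every ratio through step 17 exceeds 1 (roughly 13x per step early on,
# tapering later), i.e. the run is diverging, not converging.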