
Optimize code for reconstructing timesamples

Open andrewheusser opened this issue 8 years ago • 3 comments

I ran some simulations, and it looks like the answer returned by eq. 10 in the paper does not depend on the number of time samples passed in, so we have the option to pass as many time samples as we'd like. This may be relevant for a TensorFlow implementation.

here's the code:

import numpy as np
import matplotlib.pyplot as plt  # sns.plt was removed from seaborn; use pyplot directly

# unknown locations (100) x known locations (10)
Kba = np.random.rand(100, 10)

# known locations: correlation block and observed data
Kaa = np.random.rand(10, 10)
Y = np.random.rand(10, 5000)

# reconstruct 500 time samples
ts1 = np.dot(np.dot(Kba, np.linalg.pinv(Kaa)), Y[:, :500]).T

# reconstruct 2500 time samples
ts2 = np.dot(np.dot(Kba, np.linalg.pinv(Kaa)), Y[:, :2500]).T

print("TS 1 (500 time samples) shape: " + str(ts1.shape))
print("TS 2 (2500 time samples) shape: " + str(ts2.shape))

# are the first 500 samples the same?

plt.plot(ts1[:500, :])
plt.title('500 time samples')
plt.show()
plt.plot(ts2[:500, :])
plt.title('2500 time samples')
plt.show()

plt.scatter(ts1[:500, :], ts2[:500, :])
plt.title('First 500 time samples')
plt.show()
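The column-wise independence can also be checked numerically rather than by eye: the reconstruction is a fixed linear map W = Kba @ pinv(Kaa) applied to each column of Y, so each time sample depends only on its own column, and the first 500 reconstructed samples should be identical whether 500 or 2500 columns are passed. A minimal sketch (same shapes and variable names as above, with a seed added for reproducibility):

```python
import numpy as np

np.random.seed(0)
Kba = np.random.rand(100, 10)   # unknown-to-known block
Kaa = np.random.rand(10, 10)    # known-to-known block
Y = np.random.rand(10, 5000)    # observed activity at known locations

# precompute the linear map once; it is applied column by column
W = np.dot(Kba, np.linalg.pinv(Kaa))

ts1 = np.dot(W, Y[:, :500]).T    # reconstruct 500 time samples
ts2 = np.dot(W, Y[:, :2500]).T   # reconstruct 2500 time samples

# the first 500 rows agree exactly, regardless of batch size
print(np.allclose(ts1, ts2[:500, :]))  # True
```

Precomputing W also avoids repeating the pseudoinverse for every batch, which is probably the cheapest optimization available here.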

[Figures: line plots of the reconstructed time series for the 500-sample and 2500-sample cases, and a scatter plot comparing the first 500 samples of each]

andrewheusser avatar May 18 '17 18:05 andrewheusser

Another implementation (I think):

[Screenshot of an alternative implementation of the reconstruction equation]

andrewheusser avatar May 18 '17 18:05 andrewheusser

Re: the other implementation -- it looks like that version does some regularization, which could help us: regularization typically helps avoid overfitting and improves generalizability. We may want to (eventually) implement a few different versions of the predict function and compare their performance.

jeremymanning avatar May 18 '17 18:05 jeremymanning

👍

andrewheusser avatar May 18 '17 18:05 andrewheusser