Small documentation enhancement request
Hi there, I really appreciate Hebel. It was a good first step for me to "take the plunge" into using the GPU.
I struggled a bit after going through the example (MNIST) script. In particular, it wasn't clear how to have the model predict new data (i.e., data you don't have targets for).
The first (small) stumble was what to do with the DataProvider. I just put in dummy zero targets (a quick sketch of what I mean is below). Perhaps targets could be an optional field somehow?
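Concretely, by dummy zero targets I mean something like this sketch (X_new and n_classes are placeholders for the new feature array and the size of the output layer, and the import path is just what it is in my install; as far as I can tell the targets are never actually used when predicting):
import numpy as np
from hebel.data_providers import MiniBatchDataProvider

# Placeholder all-zero targets, one column per class, just to satisfy the constructor
dummy_targets = np.zeros((X_new.shape[0], n_classes), dtype=np.float32)
new_data = MiniBatchDataProvider(X_new, dummy_targets, batch_size=500)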
A thornier issue was how to actually do the predictions. I couldn't for the life of me figure out how to feed the DataProvider data into feed_forward without getting this error:
File "/usr/local/lib/python2.7/dist-packages/hebel/models/neural_net.py", line 422, in feed_forward
prediction=prediction))
File "/usr/local/lib/python2.7/dist-packages/hebel/layers/input_dropout.py", line 96, in feed_forward
return (input_data * (1 - self.dropout_probability),)
TypeError: unsupported operand type(s) for *: 'MiniBatchDataProvider' and 'float'
This was my original attempt (in hindsight the error makes sense: feed_forward operates on the minibatch arrays themselves, and the input dropout layer tries to multiply its input by a float, which a MiniBatchDataProvider object doesn't support):
# After loading in the data . . .
Xv = Xv.astype(np.float32)
yv = pd.get_dummies(yv).values.astype(np.float32)
valid_data = MiniBatchDataProvider(Xv, yv, batch_size=5000)
I finally resorted to using a GPU array directly, which worked:
from pycuda import gpuarray

# Copy the float32 feature array straight to the GPU and feed it through the net
valid_data = gpuarray.to_gpu(Xv)
y_pred = model.feed_forward(valid_data, return_cache=False, prediction=True).get()
The .get() at the end of the last statement was also something I had to figure out by going through the code: feed_forward returns a pycuda GPUArray, and .get() copies the result back to the host as a NumPy array.
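As an aside, in case it helps anyone else reading this: y_pred comes back with one column per class, in whatever column order pd.get_dummies used when the targets were encoded, so recovering actual labels is just an argmax. A sketch (original_labels is meant to be the un-encoded label Series, i.e. yv before the get_dummies call above):
import numpy as np
import pandas as pd

# Column index of the highest-scoring class for every row of predictions
predicted_idx = np.argmax(y_pred, axis=1)

# Map column positions back to label names; get_dummies orders its columns by
# the sorted unique label values, so this matches the encoding used for training.
label_names = pd.get_dummies(original_labels).columns.values
predicted_labels = label_names[predicted_idx]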
Having an example in the documentation would be helpful.
The GPU-array approach above didn't work when I had a large array I wanted to pass to feed_forward (presumably it was just too big to push through in one go). This is what I eventually ended up doing:
#
# Predict
#
Xs = Xs.astype(np.float32)
ys = np.zeros((Xs.shape[0], 121), dtype=np.float32)  # 121 = number of classes to predict (dummy targets)
test_data = MiniBatchDataProvider(Xs, ys, batch_size=500)

y_pred = None
for mini_data, mini_targ in test_data:
    if y_pred is not None:
        y_mini = model.feed_forward(mini_data, return_cache=False, prediction=True).get()
        y_pred = np.vstack((y_pred, y_mini))
    else:
        y_pred = model.feed_forward(mini_data, return_cache=False, prediction=True).get()
If there's a simpler way, I'd love to know.
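In the meantime, a slightly tidier version of the same loop might be to collect the per-batch outputs in a list and stack them once at the end, instead of growing y_pred with vstack on every iteration. A sketch, using the same model and test_data as above:
import numpy as np

# Run each minibatch through the network and keep the host-side result
batch_outputs = []
for mini_data, _ in test_data:
    out = model.feed_forward(mini_data, return_cache=False, prediction=True)
    batch_outputs.append(out.get())  # .get() copies the GPUArray back to a NumPy array

# Stack everything with a single allocation at the end
y_pred = np.vstack(batch_outputs)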