Jack Lanchantin
@codertimo the BERT positional embedding method is to just learn an embedding for each position. So you can use nn.Embedding with a constant input sequence [0,1,2,...,L-1] where L is the...
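A minimal PyTorch sketch of that idea (the class name and the `max_len`/`d_model` values here are hypothetical, not taken from codertimo's repo):

```python
import torch
import torch.nn as nn

class LearnedPositionalEmbedding(nn.Module):
    """Learn one embedding vector per position, BERT-style."""

    def __init__(self, max_len, d_model):
        super().__init__()
        self.embed = nn.Embedding(max_len, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model); the constant input sequence [0, 1, ..., L-1]
        positions = torch.arange(x.size(1), device=x.device)
        return x + self.embed(positions)  # embedding broadcasts over the batch dim

emb = LearnedPositionalEmbedding(max_len=512, d_model=768)
tokens = torch.randn(2, 128, 768)
out = emb(tokens)  # same shape: (2, 128, 768)
```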
I don't have access to the raw data anymore as my home directory was removed from the university servers. I believe preprocess.py expects a train_input.txt file where each line is...
The unnormalized outputs will be in the `output` variable [here](https://github.com/QData/DeepChrome/blob/master/deepChrome-TorchCode/4_train.lua#L125) You can append a [nn.SoftMax](https://github.com/torch/nn/blob/master/SoftMax.lua) module to the model in order to get normalized probabilities. btw - have you tried...
You can do `normalized_output = nn.SoftMax()(output)`. `output` is a [torch tensor](https://torch7.readthedocs.io/en/rtd/tensor/index.html). `normalized_output[:,0]` = p(x=true), `normalized_output[:,1]` = p(x=false). You can write each of these to a csv file using standard lua...
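The comment above refers to Lua Torch; here is a minimal PyTorch sketch of the same normalize-and-export step, assuming a (batch, 2) output tensor and a hypothetical `probabilities.csv` output filename:

```python
import csv

import torch
import torch.nn.functional as F

# hypothetical unnormalized model outputs, shape (batch, 2)
output = torch.randn(4, 2)

# softmax over the class dimension gives normalized probabilities
normalized_output = F.softmax(output, dim=1)

# following the convention above: column 0 = p(x=true), column 1 = p(x=false)
with open("probabilities.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["p_true", "p_false"])
    for p_true, p_false in normalized_output.tolist():
        writer.writerow([p_true, p_false])
```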
I don't remember what dimension normalized_output would be. Can you try removing the `:,`?
Oh, you shouldn't use `= p(x=true)`; I was explaining what those will give you, i.e. the probability that the input has expression=true.
Are you seeing the same results on the datasets from our experiments? If not, there's likely something wrong with your data.
Can you access the data at [https://www.cs.virginia.edu/yanjun/jack/vision/voc.tar.gz](https://www.cs.virginia.edu/yanjun/jack/vision/voc.tar.gz)? Let me know if you can't figure it out using that as a guide.
Can you share the command line you are using? Are you using GPUs?
And were you able to run the commands in the README?