social-lstm
loss computing in sample_validation_data
Hi, in

ret_x_seq, loss = sample_validation_data(x_seq, PedsList_seq, grid_seq, args, net, lookup_seq, numPedsList_seq, dataloader)

the loss is computed as

loss = Gaussian2DLikelihood(out_[0].view(1, out_.size()[1], out_.size()[2]), x_seq[tstep].view(1, numx_seq, 2), [Pedlist[tstep]], look_up)

Why is x_seq[tstep] used to compute the loss rather than x_seq[tstep+1]?
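For context, Gaussian2DLikelihood returns the negative log-likelihood of the target positions under the bivariate Gaussian whose parameters (mean, standard deviations, correlation) the network outputs per pedestrian. Below is a minimal sketch of that computation, just to make the indexing question concrete; the parameter layout and tensor shapes are assumptions, not the repo's exact code.

import math
import torch

def gaussian_2d_nll(params, targets):
    # params: (num_peds, 5) -> mux, muy, sx, sy, corr for each pedestrian
    # targets: (num_peds, 2) -> ground-truth (x, y) positions
    mux, muy, sx, sy, corr = params.unbind(dim=1)
    sx, sy = torch.exp(sx), torch.exp(sy)  # enforce positive std devs
    corr = torch.tanh(corr)                # enforce correlation in (-1, 1)

    dx, dy = targets[:, 0] - mux, targets[:, 1] - muy
    omr2 = 1 - corr ** 2
    z = (dx / sx) ** 2 + (dy / sy) ** 2 - 2 * corr * dx * dy / (sx * sy)
    log_prob = -z / (2 * omr2) - torch.log(2 * math.pi * sx * sy * torch.sqrt(omr2))
    return -log_prob.mean()

# The question: given the network output for frame tstep, should the
# targets come from x_seq[tstep] (the current frame) or x_seq[tstep + 1]
# (the next frame)? Genuine next-step prediction would use tstep + 1.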
As far as I remember, it is because the sequences are vectorized (the first frame is subtracted from all frames, so every sequence starts at position 0); however, I don't remember clearly. I would need to print the variables and the length of each sequence, but I am unable to debug right now. Can you do it?
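For illustration, here is a minimal sketch of the vectorization described above, i.e. subtracting the first frame from every frame so each trajectory starts at the origin. The helper name and tensor shapes are assumptions, not necessarily the repo's implementation.

import torch

def vectorize_seq(x_seq):
    # x_seq: (seq_len, num_peds, 2) absolute (x, y) positions
    first_frame = x_seq[0].clone()            # positions at t = 0
    return x_seq - first_frame, first_frame   # relative positions + offset to undo later

# Example: a 3-step, single-pedestrian sequence
seq = torch.tensor([[[2.0, 3.0]], [[2.5, 3.5]], [[3.0, 4.0]]])
rel, offset = vectorize_seq(seq)
print(rel[0])  # tensor([[0., 0.]]) -- the first position is now the origin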
OK, I will spend some time learning your code. By the way, may I ask which result/rank was produced by this code in the World Plane Human-Human Dataset challenge (http://trajnet.stanford.edu/result.php?cid=1), social lstm_v2 or v3? Thanks!
Both of them.
I wonder whether the code runs under Windows or Ubuntu?
I have experimented on Ubuntu 16.04, never tried on Windows.
I wonder why you clone x_seq into ret_x_seq in the sample function in test.py. That means the error between the predicted point and the original point is zero. How did you get the simulation results?
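For context, a clone like this is usually the first step of a clone-then-overwrite pattern: the ground truth is copied to carry the observed frames, and the predicted frames are then written over the tail of the copy. The sketch below shows that pattern with illustrative names (net_step, obs_length); it is not the repo's exact sample function. If the predicted frames were never written back, the returned sequence would equal the ground truth and the measured error would indeed be trivially zero.

import torch

def sample(x_seq, net_step, obs_length):
    # x_seq: (seq_len, num_peds, 2) ground-truth positions
    ret_x_seq = x_seq.clone()          # start from the observed trajectory
    for t in range(obs_length - 1, x_seq.size(0) - 1):
        pred = net_step(ret_x_seq[t])  # hypothetical one-step predictor
        ret_x_seq[t + 1] = pred        # overwrite the future with predictions
    return ret_x_seq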
@william-yan please check the issue https://github.com/quancore/social-lstm/issues/6
During training, I think the model is actually being trained to predict the current location rather than the next location given the current one.
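To make the off-by-one concrete, here is a minimal sketch (illustrative names only, not the repo's code) of the two target choices. With the current-frame target the model can minimize the loss by learning the identity mapping; shifting the target by one frame forces it to be a true next-step predictor.

import torch

seq_len, num_peds = 8, 3
positions = torch.randn(seq_len, num_peds, 2)  # dummy (x, y) trajectories

for tstep in range(seq_len - 1):
    net_input = positions[tstep]
    target_current = positions[tstep]      # the target the issue says is used
    target_next = positions[tstep + 1]     # the target a next-step predictor needs
    # the loss would then be computed against target_next instead of target_current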