
floating point error in encoding

Open satarupaguha11 opened this issue 8 years ago • 7 comments

I have successfully trained skip-thoughts from scratch on my own data. Now, given new data X, I want to get its vector representation, so I do the following:

vectors = skipthoughts.encode(model, X)

But this gives me the following error:

floating point exception (core dumped)

I have also set floatX to float32 in THEANO_FLAGS. Any help would be appreciated.

satarupaguha11 avatar May 09 '16 09:05 satarupaguha11

Not an expert, but most likely you're running out of memory.

lovekesh-thakur avatar May 26 '16 05:05 lovekesh-thakur

Having the same issue on a GTX Titan X with 32 GB of RAM and a rather small sentence corpus (around 37k sentences). Any idea what the problem could be? I'm not getting any information from the error.

The flags I am using:

THEANO_FLAGS=floatX=float32,device=gpu,nvcc.flags=-D_FORCE_INLINES

I can't run more than 1000 sentences at a time. Any idea what is happening here?
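For now I'm working around it by encoding in smaller chunks and stacking the results, roughly like this (a sketch; encode_in_chunks and the chunk size of 500 are my own, not part of skipthoughts):

    import numpy as np
    import skipthoughts

    def encode_in_chunks(model, sentences, chunk=500):
        # Encode the sentences a chunk at a time so a single huge
        # batch never has to fit in memory at once.
        parts = []
        for i in range(0, len(sentences), chunk):
            parts.append(skipthoughts.encode(model, sentences[i:i + chunk]))
        return np.vstack(parts)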

polyrhythmatic avatar Jul 20 '16 22:07 polyrhythmatic

X should be a list, not a string. You may have forgotten the '[ ]'.
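For example (assuming model is the loaded model):

    # Wrong: a bare string is treated as a sequence of single characters
    vectors = skipthoughts.encode(model, "my sentence")

    # Right: pass a list of sentences
    vectors = skipthoughts.encode(model, ["my sentence"])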

clovercoser avatar Aug 17 '16 06:08 clovercoser

We got this behavior sometimes from inadvertently sending in empty strings (after overzealous preprocessing). It was always hard to pinpoint, so we're not sure we fixed it, but that might be something to check.

pcallier avatar Oct 28 '16 23:10 pcallier

I got the same error. It turns out one of the sentences in the list was just whitespace, i.e. " ". After the internal preprocessing in the skipthoughts.py script, it ended up as an empty string, which might have been the reason for the error. Hope this helps!
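A quick way to guard against that is to drop empty or whitespace-only entries before encoding, e.g.:

    # Filter out sentences that are empty or only whitespace
    X = [s for s in X if s.strip()]
    vectors = skipthoughts.encode(model, X)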

Spider101 avatar Apr 13 '17 23:04 Spider101

Traceback (most recent call last):
  File "train.py", line 8, in <module>
    vectors = encoder.encode(sent_tok)
  File "/local_home/rpr/code/skipthoughts.py", line 102, in encode
    return encode(self._model, X, use_norm, verbose, batch_size, use_eos)
  File "/local_home/rpr/code/skipthoughts.py", line 132, in encode
    for minibatch in range(numbatches):

Please help me.

pavanpankaj avatar May 03 '19 12:05 pavanpankaj

3 128
Traceback (most recent call last):
  File "train.py", line 8, in <module>
    vectors = encoder.encode(sent_tok)
  File "/local_home/rpr/code/skipthoughts.py", line 102, in encode
    return encode(self._model, X, use_norm, verbose, batch_size, use_eos)
  File "/local_home/rpr/code/skipthoughts.py", line 132, in encode
    for minibatch in range(numbatches):

I think only 3 words from my data were found in the dictionary. Encoding runs batch by batch with a batch size of 128, so with only 3 items in a batch, numbatches comes out as 3/128, a float, and range() of a float is the error it is showing.
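If that's right, the failure can be reproduced roughly like this (a sketch, assuming the script is run under Python 3, where / is true division; this is not the exact code from skipthoughts.py):

    batch_size = 128
    n = 3                                # only 3 items in this bucket
    numbatches = n / batch_size + 1      # 1.0234375, a float under Python 3
    for minibatch in range(numbatches):  # TypeError: 'float' object cannot be interpreted as an integer
        pass

    # Integer division keeps numbatches an int and avoids the crash:
    numbatches = n // batch_size + 1     # 1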

pavanpankaj avatar May 03 '19 12:05 pavanpankaj