kaldi-gstreamer-server
AttributeError: 'NoneType' object has no attribute 'binary_message'
Exception in thread Thread-28:
Traceback (most recent call last):
File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
File "/e/ashwinraju/kaldi_asr/asr/py-kaldi/tests/client.py", line 58, in send_data_to_ws
self.send_data(block)
File "/e/ashwinraju/kaldi_asr/asr/py-kaldi/tests/client.py", line 22, in rate_limited_function
ret = func(*args,**kargs)
File "/e/ashwinraju/kaldi_asr/asr/py-kaldi/tests/client.py", line 43, in send_data
self.send(data, binary=True)
File "/usr/local/lib/python2.7/dist-packages/ws4py/websocket.py", line 257, in send
message_sender = self.stream.binary_message if binary else self.stream.text_message
AttributeError: 'NoneType' object has no attribute 'binary_message'
I am calculating predictions for a list of test samples using multiprocessing. I am able to get the predicted output, but I still get this AttributeError. Any idea how to avoid this error?
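The traceback shows send() being called after ws4py has torn the connection down: once the socket terminates (for example because the server closed it, or because another process or thread keeps pushing audio after shutdown), ws4py clears self.stream, and the next send() fails with exactly this AttributeError. Below is a minimal sketch of one way to guard against that, built on the same ws4py threaded client that tests/client.py uses; the GuardedClient class, endpoint URL, chunk size, and audio path are illustrative assumptions, not part of the project.

from ws4py.client.threadedclient import WebSocketClient

class GuardedClient(WebSocketClient):
    def safe_send(self, data):
        # ws4py sets self.stream to None once the connection terminates;
        # calling send() after that raises the AttributeError shown above,
        # so check the connection state first and let the caller stop cleanly.
        if self.terminated or self.stream is None:
            return False
        self.send(data, binary=True)
        return True

if __name__ == "__main__":
    ws = GuardedClient("ws://localhost:8888/client/ws/speech")
    ws.connect()
    with open("test/data/english_test.wav", "rb") as f:
        while True:
            block = f.read(8000)
            if not block or not ws.safe_send(block):
                break
    if not ws.terminated:
        ws.send("EOS")  # end-of-stream marker expected by the server
    ws.close()

Also note that a single ws4py client object should not be shared across processes: each multiprocessing worker needs to open its own connection, otherwise one process can terminate the socket while another is still sending.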
Please clarify your issue. It's difficult to understand where and when this problem occurs.
I have tried this and got the same type of problem in the decoder. I am following "Using the 'onlinegmmdecodefaster' based worker":
python kaldigstserver/worker.py -u ws://localhost:8888/worker/ws/speech -c sample_english_nnet2.yaml
DEBUG 2018-01-28 12:35:12,871 Starting up worker
2018-01-28 12:35:12 - INFO: decoder2: Creating decoder using conf: {'post-processor': "perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} s/(.*)/\1./;'", 'logging': {'version': 1, 'root': {'level': 'DEBUG', 'handlers': ['console']}, 'formatters': {'simpleFormater': {'datefmt': '%Y-%m-%d %H:%M:%S', 'format': '%(asctime)s - %(levelname)7s: %(name)10s: %(message)s'}}, 'disable_existing_loggers': False, 'handlers': {'console': {'formatter': 'simpleFormater', 'class': 'logging.StreamHandler', 'level': 'DEBUG'}}}, 'use-nnet2': True, 'full-post-processor': './sample_full_post_processor.py', 'decoder': {'ivector-extraction-config': 'test/models/english/tedlium_nnet_ms_sp_online/conf/ivector_extractor.conf', 'num-nbest': 10, 'lattice-beam': 6.0, 'acoustic-scale': 0.083, 'do-endpointing': True, 'beam': 10.0, 'max-active': 10000, 'fst': 'test/models/english/tedlium_nnet_ms_sp_online/HCLG.fst', 'mfcc-config': 'test/models/english/tedlium_nnet_ms_sp_online/conf/mfcc.conf', 'use-threaded-decoder': True, 'traceback-period-in-secs': 0.25, 'model': 'test/models/english/tedlium_nnet_ms_sp_online/final.mdl', 'word-syms': 'test/models/english/tedlium_nnet_ms_sp_online/words.txt', 'endpoint-silence-phones': '1:2:3:4:5:6:7:8:9:10', 'chunk-length-in-secs': 0.25}, 'silence-timeout': 10, 'out-dir': 'tmp', 'use-vad': False}
Traceback (most recent call last):
File "kaldigstserver/worker.py", line 366, in
Has anyone solved this problem?
@Alif112 Have you solved this problem?
@wujsy It was a long time ago and I don't remember much, but I can assure you that using Docker will help you a lot. The Docker image was bug-free.