rubby33

Results: 12 comments by rubby33

@shenshen-hungry Thank you, but in certain scenarios, when the sentence or utterance is very short, character vectors are still somewhat necessary. Thanks for your work.

@miangangzhen Did you solve it in the end? It would be even better if the redis stream + flask_multigpu_example example were explained in more detail in the README. The example is fairly complex, and many people have trouble understanding it. @Meteorix Thanks.

I ran into this problem too. The detailed error log is as follows:

[2020-06-16 15:35:14,884] ERROR in app: Exception on /sentence_type2 [GET]
Traceback (most recent call last):
  File "/data/jiangwei/anaconda3/envs/py3.7/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
    response = self.full_dispatch_request()
  File "/data/jiangwei/anaconda3/envs/py3.7/lib/python3.7/site-packages/flask/app.py", line 1951, in full_dispatch_request
...

> It is clearly a task timeout; setting WORKER_TIMEOUT to a larger value should fix it

Thanks for the reply, but it doesn't seem to be that simple. I load-tested with wrk, with the timeout set to 2s. As soon as a single timeout occurs, all requests to the service-streamer service stop responding (Requests/sec: 0.45). Using the naive native Flask approach, everything works fine.

(py3.7) jiangwei@mk-Z10PE-D16-WS:~$ wrk -t8 -c100 -d20s --latency http://localhost:5005/sentence_type2?sen=%22%E6%88%91%E4%B8%8D%E8%AE%A4%E5%8F%AF%E8%BF%99%E4%B8%AA%E5%9B%BD%E5%AE%B6%22
Running 20s test @ http://localhost:5005/sentence_type2?sen=%22%E6%88%91%E4%B8%8D%E8%AE%A4%E5%8F%AF%E8%BF%99%E4%B8%AA%E5%9B%BD%E5%AE%B6%22
  8 threads and 100 connections
  Thread Stats   Avg...
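As an aside, the URL-encoded sen parameter in the wrk command above can be decoded with Python's standard library to see the actual test sentence:

```python
from urllib.parse import unquote

# The sen= query parameter from the wrk command above.
encoded = "%22%E6%88%91%E4%B8%8D%E8%AE%A4%E5%8F%AF%E8%BF%99%E4%B8%AA%E5%9B%BD%E5%AE%B6%22"

# Decodes to the quoted Chinese sentence used for load testing
# ("I do not approve of this country").
print(unquote(encoded))  # "我不认可这个国家"
```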

In fact, my question is: how to handle a large context with Machine Comprehension (bi-att-flow)? I think this is a very interesting problem. Thanks.

Thanks. @webeng The code works well with smaller chunks of text, like the ones used for training. @seominjoon Yes, the context has 34,680 words. Thanks for the good idea. By the...
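One way to act on the chunking idea above is a sliding-window split of the long context (a minimal sketch; chunk_words and the window/overlap sizes are illustrative, not part of the bi-att-flow code):

```python
def chunk_words(words, window=400, overlap=50):
    """Split a long token list into overlapping windows.

    The overlap reduces the chance that an answer span is cut in
    half at a window boundary.
    """
    step = window - overlap
    return [words[i:i + window]
            for i in range(0, max(len(words) - overlap, 1), step)]

# A 34,680-word context becomes windows of at most 400 words each,
# which are close to the training-time context lengths.
context = ["tok"] * 34680
windows = chunk_words(context)
print(len(windows), len(windows[0]))
```

Each window can then be fed to the model separately, keeping the highest-scoring answer across windows.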

An example. Text: the league emphasized the "golden anniversary" with various gold... After process_tokens() the result is: 'the' 'league' 'emphasized' 'the' '' '"' '' 'golden' 'anniversary' '' '"' '' 'with'...
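A small post-processing step can drop the empty-string tokens that process_tokens() leaves around the quote characters (a sketch; clean_tokens is a hypothetical helper, not part of the repository):

```python
def clean_tokens(tokens):
    """Drop the empty-string tokens left behind when quotes are split off."""
    return [t for t in tokens if t]

# The tokenized output shown above, including the spurious '' entries.
raw = ['the', 'league', 'emphasized', 'the', '', '"', '', 'golden',
       'anniversary', '', '"', '', 'with']
print(clean_tokens(raw))
# ['the', 'league', 'emphasized', 'the', '"', 'golden', 'anniversary', '"', 'with']
```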

Because graph_handler depends on the model from get_multi_gpu_models(), while the model from get_multi_gpu_models() depends on the input questions and contexts.

By "mrc model" I mean the model restored via "saver.restore(sess, save_path)". My question is: I have many JSON files whose format is the same as the standard...

That's what I need. Thanks a lot. Let me have a try. 👍