owen Lu — 3 comments

Hopefully BERT serving can be made compatible with TF 2.0. For now, my workaround is to serve the BERT model on a separate server running TF 1.14.
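For anyone wanting the same stopgap, a minimal sketch of that kind of setup using the official TF 1.14 serving image; the model path and name here are placeholders, not from the original comment:

    # Run a dedicated TF 1.14 serving container just for the BERT model
    docker run -p 8501:8501 \
      --mount type=bind,source=/path/to/bert_model,target=/models/bert \
      -e MODEL_NAME=bert \
      tensorflow/serving:1.14.0

The rest of the stack can stay on TF 2.0 and call this container over REST or gRPC.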

Is this issue solved? I'm having the same problem when serving an OpenNMT TensorFlow model. I have configured --rest_api_num_threads=1000 and --grpc_channel_arguments=grpc.max_concurrent_streams=1000, but somehow they don't help; the tensorflow server...
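For reference, this is roughly how those two flags are passed to tensorflow_model_server; the model name, path, and ports are placeholder assumptions, not values from the comment:

    # Launch TensorFlow Serving with raised REST and gRPC concurrency limits
    tensorflow_model_server \
      --model_name=onmt \
      --model_base_path=/models/onmt \
      --port=8500 \
      --rest_api_port=8501 \
      --rest_api_num_threads=1000 \
      --grpc_channel_arguments=grpc.max_concurrent_streams=1000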

Hmmmm, two years have passed and it looks like this is still an issue.