Qinlong Wang

18 comments by Qinlong Wang

> We should clean up the temp at the end of each step, so we need at least three steps for each job:
>
> 1. create a temp table...

There is a [TorchServe introduction](https://aws.amazon.com/about-aws/whats-new/2020/04/introducing-torchserve/?nc1=h_ls) from AWS: "TorchServe delivers lightweight serving with low latency, so you can deploy your models for high performance inference." But I don't find any...

> **The key challenge is: how to save the preprocess logic into the serialized model for serving (TorchScript or ONNX).**
>
> Proposal Options:
>
> * Develop some custom...
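A minimal sketch of one way to approach the challenge in the quote, assuming a simple normalization preprocess: wrap the preprocessing and the trained model in a single `torch.nn.Module` and export it with TorchScript, so serving only needs one serialized artifact. The class name, constants, and stand-in model are illustrative, not the project's actual code.

```python
import torch


class ModelWithPreprocess(torch.nn.Module):
    """Bundles a preprocessing step with a trained model in one module."""

    def __init__(self, model: torch.nn.Module, mean: float, std: float):
        super().__init__()
        self.model = model
        # Illustrative preprocessing parameters; a real project would record
        # whatever transforms were applied during training.
        self.mean = mean
        self.std = std

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = (x - self.mean) / self.std  # preprocessing baked into the graph
        return self.model(x)


trained = torch.nn.Linear(4, 2)  # stand-in for the real trained model
scripted = torch.jit.script(ModelWithPreprocess(trained, mean=0.5, std=0.25))
scripted.save("model_with_preprocess.pt")  # single artifact for serving
```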

We can check the pre-compiled PS binary files into the repo. Then, when building the `.whl`, we can use those binary files directly.
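A hedged sketch of how the checked-in binaries could be picked up during the `.whl` build, assuming they live under a hypothetical `elasticdl/ps/bin/` directory; the package name and paths are assumptions for illustration, not the actual repo layout.

```python
# setup.py sketch: ship the pre-compiled PS binaries inside the wheel so the
# build step reuses them instead of compiling the PS from source.
from setuptools import find_packages, setup

setup(
    name="elasticdl",
    version="0.1.0",
    packages=find_packages(),
    # Assumed location of the checked-in binaries.
    package_data={"elasticdl.ps": ["bin/*"]},
    include_package_data=True,
)
```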

> You can delete unnecessary files after cloning TensorFlow in the same RUN command. It will both keep the image small and avoid the wget network issue.

Good suggestion!

Can we use TensorFlow Dataset APIs to read data and feed the data into PyTorch models? For example:

```python
for features, labels in dataset:
    features = features.numpy()
    labels = labels.numpy()
    ...
```
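A hedged, self-contained sketch of the idea in the question: iterate a `tf.data.Dataset`, convert each batch to NumPy, and hand it to a PyTorch model with `torch.from_numpy`. The toy data and model are assumptions for illustration.

```python
import numpy as np
import tensorflow as tf
import torch

# Toy tf.data pipeline standing in for a real data source.
features = np.random.rand(100, 4).astype(np.float32)
labels = np.random.randint(0, 2, size=(100,)).astype(np.int64)
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(16)

model = torch.nn.Linear(4, 2)  # stand-in PyTorch model
loss_fn = torch.nn.CrossEntropyLoss()

for batch_features, batch_labels in dataset:
    # Bridge TensorFlow eager tensors to PyTorch through NumPy.
    x = torch.from_numpy(batch_features.numpy())
    y = torch.from_numpy(batch_labels.numpy())
    loss = loss_fn(model(x), y)
```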

> @workingloong Any update on this issue? If not, I can try to fix the problem.

No update yet.

> After further investigation into the code base, it seems that the length of `err_msg` is used to indicate the status of a mini-batch. If any worker reports a non-empty...
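A small sketch of the convention the quote describes, using a hypothetical helper name: an empty `err_msg` marks the mini-batch as successful, and any non-empty message marks it as failed.

```python
def minibatch_succeeded(err_msg: str) -> bool:
    # Hypothetical helper: the length of err_msg signals the mini-batch
    # status. An empty string means success; a non-empty message reported
    # by any worker means the mini-batch failed.
    return len(err_msg) == 0
```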

Why don't we remove the Python PS module?