christ1ne
WG: PT is no longer using ONNX as the internal graph format. We need an updated PT model in this case.
WG: we may add a PT model next week. Please see the mailing list for updates next week.
@guschmue shall I just close this one?
thanks @jqueguiner for your contribution
@jqueguiner did your company sign the CLA? If so, please fill https://forms.gle/Ew1KkBVpyeJDuRw67
I ended up just training my own GNMT model. Afterwards, you can run the inference code with your own model without issue. I suspect the vocab file generation code for...
> I had the same problem with the provided model: http://download.tensorflow.org/models/nmt/10122017/ende_gnmt_model_8_layer.zip
> I used the script: https://github.com/tensorflow/nmt/blob/master/nmt/scripts/wmt16_en_de.sh
> The resulting vocabulary had 36549 elements while the pre-trained model has 36548!...
@gaebor the vocab generation is here: download_dataset.sh at https://github.com/mlperf/training/tree/master/rnn_translator
The vocab generation scripts are different at MLPerf and at the TF nmt repo.
> The [README](https://github.com/mlperf/inference_results_v0.5/blob/master/closed/DellEMC/code/ssd-small/openvino-linux/README.md) for SSD-Small using OpenVINO in the DellEMC submission says:
>
> > Run the corresponding scripts under `scripts/` folder
>
> [This folder](https://github.com/mlperf/inference_results_v0.5/tree/master/closed/DellEMC/code/ssd-small/openvino-linux/scripts), however, contains two files:...