Cheni Chadowitz

Results: 40 comments by Cheni Chadowitz

Prior to my new test, I had `service_predict_native_bw` (and others) failing when I ran `ut_torch` - is that expected if I'm not running with GPU support? `[ RUN ] torchapi.service_predict_native_bw...`

I've tried running the tests with the changes in [4d010a6](https://github.com/jolibrain/deepdetect/pull/1448/commits/4d010a6b78f3f54073c7d8c483f3f84ca7e9f754) and can't seem to get the param to actually affect the `predict()` call correctly. It's as if `_multi_label` isn't actually saved/stored...

LGTM! Still getting a failure on `torchapi.service_predict_native_bw` due to running the tests without a GPU, I assume that's expected?

One additional thing - I have a second torch model that includes a softmax as part of its own `forward()` method:

```
InceptionResnetV2_Logits_Predictions = F.softmax(InceptionResnetV2_Logits_Logits_MatMul)
return InceptionResnetV2_Logits_Predictions
```

Obviously if...
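The comment above concerns a model whose `forward()` already ends in a softmax, so it emits probabilities rather than raw logits. A minimal plain-Python sketch (not deepdetect code; the function and values are purely illustrative) of why that matters for a serving layer that might apply softmax a second time:

```python
import math

def softmax(logits):
    """Numerically stable softmax: shift by the max before exponentiating."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# A forward() that ends in softmax returns probabilities that sum to 1.
logits = [2.0, 1.0, 0.1]
probs = softmax(logits)

# Applying softmax again (as a serving layer expecting raw logits might)
# flattens the distribution and distorts the reported confidences.
double = softmax(probs)
```

This is why a serving path that assumes raw logits as model output needs to know whether the exported model already normalizes its predictions.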

> Our current policy is to export all the models for a given task with the same input / output format, so that there is one unique path in dede....

Cool thanks! I may not be able to get to it today but I'll test it out and let you know.

> Hi @cchadowitz-pf, it would be better if you put the dlib update in another PR, so that we can merge this one.

Sounds good - I've reverted the dlib...

> @cchadowitz-pf Hello, let's squash the commits and rebase!

@beniz I'm not sure if I did the squash+rebase right... I suspect not, based on what GitHub is showing under 'files...

Thanks @Bycob! That looks great to me. I don't have any further changes for this dlib PR. I can't see what the GPU build error is, but if it...

@Bycob not sure if you changed something, but it seems like all the checks have now passed?