coremltools
Add clamp_min to PyTorch ops
This op is required to convert torch.nn.functional.normalize()
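For context, `clamp_min` appears in the traced graph of `torch.nn.functional.normalize()` because normalize divides its input by its norm clamped to a minimum `eps`. A minimal sketch of a model whose conversion hits this op (module and variable names are illustrative, not from this PR):

```python
import torch
import torch.nn.functional as F

class Normalizer(torch.nn.Module):
    def forward(self, x):
        # normalize divides x by its p-norm clamped to a minimum eps,
        # which shows up as a clamp_min node in the traced graph
        return F.normalize(x, p=2.0, dim=1)

example = torch.rand(2, 4)
traced = torch.jit.trace(Normalizer(), example)

# Without this PR, converting the traced model fails with an
# unsupported-op error for clamp_min:
# import coremltools as ct
# mlmodel = ct.convert(traced, inputs=[ct.TensorType(shape=example.shape)])
```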
Thanks @pokidyshev for the PR. The op definition looks good.
In order to merge this PR, please add a unit test for clamp_min
to test_torch_ops.py.
@TobyRoseman not sure how to do that... Can you point out a good example? I'm currently looking at https://github.com/pokidyshev/coremltools/blob/7a4bf8bb6e0142565ef1c5ea6c56d75b959911c2/coremltools/converters/mil/frontend/torch/test/test_torch_ops.py#L2656
Take a look at the unit test for torch.argsort.
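For orientation, a minimal sketch of what a `clamp_min` test in that style might look like. The `TorchBaseTest`, `ModuleWrapper`, and `backends` names are assumed from the existing helpers in test_torch_ops.py, so the final test may differ:

```python
import itertools
import pytest
import torch

# Assumes TorchBaseTest, ModuleWrapper, and backends are already imported
# at the top of test_torch_ops.py, as the other op tests use them.

class TestClampMin(TorchBaseTest):
    @pytest.mark.parametrize(
        "backend, min_value",
        itertools.product(backends, [-1.0, 0.0, 2.0]),
    )
    def test_clamp_min(self, backend, min_value):
        # Wrap torch.clamp_min and compare the converted Core ML output
        # against PyTorch for a fixed input shape.
        model = ModuleWrapper(
            function=torch.clamp_min, kwargs={"min": min_value}
        )
        self.run_compare_torch((2, 3, 4), model, backend=backend)
```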
Great op! Thanks @pokidyshev
Closing this in favor of #1584.