FBGEMM
Make split_embedding_weights_with_scale_bias(2) scriptable
Summary:
Some infra, such as xlweights, uses split_embedding_weights_with_scale_bias on torch-scripted modules.
mode==2 fails under TorchScript because of the `.view(torch.float16)` call:
```
RuntimeError: The following operation failed in the TorchScript interpreter.
Traceback of TorchScript (most recent call last):
  File "/mnt/xarfuse/uid-128598/ebee08ee-seed-nspid4026531836_cgpid9192421-ns-4026531840/fbgemm_gpu/split_table_batched_embeddings_ops_inference.py", line 1202, in split_embedding_weights_with_scale_bias
    (
        weights_shifts[:, self.scale_bias_size_in_bytes :],
        weights_shifts[:, : self.scale_bias_size_in_bytes // 2].view(torch.float16),
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
        weights_shifts[
            :,
```
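For context, the failing line reinterprets the first half of each row's scale/bias header as float16 values. Below is a rough numpy sketch of that byte reinterpretation; the header size and row layout here are illustrative assumptions, not FBGEMM's actual packed format.

```python
import numpy as np

# Assumed illustrative layout: each row begins with a scale/bias header
# (2 bytes fp16 scale + 2 bytes fp16 bias), followed by quantized weight bytes.
scale_bias_size_in_bytes = 4

# Build one packed row: [scale bytes][bias bytes][weight bytes]
row = np.concatenate([
    np.frombuffer(np.float16(0.5).tobytes(), dtype=np.uint8),   # scale = 0.5
    np.frombuffer(np.float16(-1.0).tobytes(), dtype=np.uint8),  # bias = -1.0
    np.array([1, 2, 3, 4], dtype=np.uint8),                     # quantized weights
])

# Weight bytes: everything past the scale/bias header.
w = row[scale_bias_size_in_bytes:]

# Scale: first half of the header reinterpreted as float16 — the numpy
# analogue of the torch `.view(torch.float16)` in the traceback above.
s = row[: scale_bias_size_in_bytes // 2].view(np.float16)

# Bias: second half of the header, reinterpreted the same way.
b = row[scale_bias_size_in_bytes // 2 : scale_bias_size_in_bytes].view(np.float16)
```

The slices here are contiguous 1-D byte ranges, so the dtype view is a pure reinterpretation with no copy, which is also what makes the torch version attractive inside an inference path.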
Differential Revision: D48879020
Privacy Context Container: L1138451
Deploy Preview for pytorch-fbgemm-docs canceled.
| Name | Link |
|---|---|
| Latest commit | ea8374df86a9112b06bc03686105fa8bfb6a2d30 |
| Latest deploy log | https://app.netlify.com/sites/pytorch-fbgemm-docs/deploys/64f0f2f848bda400084f0cdb |
This pull request was exported from Phabricator. Differential Revision: D48879020