
Add index put

HelloShirleyHello opened this pull request 3 years ago • 3 comments

Description

A user wants to convert the following Torch script to a TensorRT engine:

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

    def forward(self, x):
        y = torch.zeros(1, 1, 224)
        x[:, 1, 1, :] = y
        return x

TensorRT cannot support in-place memory updates right now, so I recommended using the "index_put" op instead, which stores the result in new memory.
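To illustrate the recommendation, here is a minimal sketch of rewriting the in-place assignment with the out-of-place "index_put" op. The shapes and index tensors below are chosen for the example and are not taken from the original PR:

```python
import torch

x = torch.arange(2 * 3 * 3 * 4, dtype=torch.float32).reshape(2, 3, 3, 4)
y = torch.zeros(1, 1, 4)

# In-place form, as in the original forward() (not convertible to TRT):
x_ref = x.clone()
x_ref[:, 1, 1, :] = y

# Out-of-place form: index_put returns a new tensor instead of mutating x.
# The leading ':' becomes an explicit arange index; dims 1 and 2 are fixed at 1.
batch = torch.arange(x.shape[0])
ones = torch.ones_like(batch)
x_new = x.index_put((batch, ones, ones), y.reshape(1, 4), accumulate=False)

assert torch.equal(x_new, x_ref)
```

With accumulate=False the values overwrite the addressed slice; with accumulate=True they would be added to it, matching the two cases handled in this PR.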

This pull request adds support for index_put in TensorRT. I use the ScatterND op in TensorRT to implement PyTorch's index_put op. IScatterLayer has three input tensors: Data (D), Indices (I), and Updates (U).

My code includes:

  1. converting the index_put indices to the int32 type;
  2. broadcasting all indices to the same shape and unsqueezing the last dimension;
  3. concatenating all indices along the last dimension to obtain "I" for the scatter layer;
  4. computing the expected "U" tensor shape from "D" and "I";
  5. broadcasting the input update tensor to "U";
  6. adding a scatter layer when accumulate is false in index_put;
  7. adding a gather layer and a scatter layer when accumulate is true in index_put.
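The steps above can be sketched with a NumPy emulation of ScatterND semantics (the index-building and update-broadcasting steps, for the accumulate=false case). The scatter_nd helper and all shapes here are illustrative, not the converter's actual code:

```python
import numpy as np

def scatter_nd(data, indices, updates):
    """Emulate ScatterND: out[tuple(indices[j])] = updates[j] for each row j.
    The last dimension of `indices` addresses a slice of `data`."""
    out = data.copy()
    idx = indices.reshape(-1, indices.shape[-1])
    upd = updates.reshape(idx.shape[0], *data.shape[indices.shape[-1]:])
    for j, row in enumerate(idx):
        out[tuple(row)] = upd[j]
    return out

# index_put equivalent of x[:, 1, 1, :] = y for x of shape (2, 3, 3, 4):
x = np.arange(2 * 3 * 3 * 4, dtype=np.float32).reshape(2, 3, 3, 4)
y = np.zeros((4,), dtype=np.float32)

# Steps 1-3: build int32 indices, broadcast them to a common shape, and
# stack them along a new last dimension to form "I".
batch = np.arange(2, dtype=np.int32)
ones = np.ones(2, dtype=np.int32)
I = np.stack([batch, ones, ones], axis=-1)   # shape (2, 3)

# Steps 4-5: broadcast the update tensor to the expected "U" shape.
U = np.broadcast_to(y, (2, 4))

# Step 6: scatter (accumulate=false overwrites the addressed slices).
out = scatter_nd(x, I, U)
assert (out[:, 1, 1, :] == 0).all()
```

For accumulate=true (step 7), the converter would first gather the currently stored slices at "I", add the updates to them, and scatter the sums back.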

Fixes # (issue)

Type of change

  • New feature (non-breaking change which adds functionality)

Checklist:

  • [x] My code follows the style guidelines of this project (You can use the linters)
  • [x] I have performed a self-review of my own code
  • [x] I have commented my code, particularly in hard-to-understand areas and hacks
  • [ ] I have made corresponding changes to the documentation
  • [x] I have added tests to verify my fix or my feature
  • [x] New and existing unit tests pass locally with my changes
  • [x] I have added the relevant labels to my PR so that relevant reviewers are notified

HelloShirleyHello avatar Sep 08 '22 10:09 HelloShirleyHello

Hi @HelloShirleyHello!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

facebook-github-bot avatar Sep 08 '22 10:09 facebook-github-bot

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

facebook-github-bot avatar Sep 08 '22 12:09 facebook-github-bot

@HelloShirleyHello Can you set up the pre-commit system (pre-commit install && pre-commit run --all-files) to apply the lint for the repo?

narendasan avatar Sep 08 '22 20:09 narendasan

@HelloShirleyHello Can you address the review comments and fix the tests ?

peri044 avatar Feb 13 '23 22:02 peri044

This PR has not seen activity for 90 days. Remove the stale label or comment, or this will be closed in 10 days.

github-actions[bot] avatar May 15 '23 00:05 github-actions[bot]