juney-nvidia
@anand-nv Hi, TRT-LLM has already moved its development to GitHub. Can you rebase your MR on the latest main branch to prepare a fresh MR? Thanks, June
@BugsBuggy Hi, can you try with the latest TRT-LLM to see whether the issue still exists? There are recent fixes to MoE-related kernels. Thanks, June
@QiJune @ming-wei pls help review this MR.
> * blossom-ci @aikitoria your code failed to pass the pre-commit check. Currently, the pre-commit check failure will not be copied back to the public repo to be viewable, and we are...
> Thank you for the contribution!
>
> I've left a few comments, but the PR looks overall good.
>
> @juney-nvidia It'd be good if we can find someone...
> @BasicCoder Thank you for your interest in TRT-LLM. We have fixed it in our internal version, and it will be updated on GitHub soon. @hello-11 Can you help confirm...
Thanks @aikitoria for contributing this. We will also make a first pass over this MR to provide early feedback. Thanks, June
> Can we suggest open sourcing all of the kernels as a topic to discuss? :)
>
> It will be easier to adapt the library to new models with...