In-flight NeMo model export support
# What does this PR do?

Adds in-flight NeMo-to-TRTLLM v10 model conversion and engine refitting using device weights.
**Collection**: [Note which collection this PR will affect]
# Changelog

- Add specific line-by-line info of high-level changes in this PR.
# Usage
- You can potentially add a usage example below
# Add a code snippet demonstrating how to use this
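Since the PR's core change is refitting a prebuilt engine with weights that already live on the device (rather than rebuilding it), here is a minimal illustrative sketch of that flow. All names below (`convert_name`, `MockEngine`, `refit_engine`) are hypothetical stand-ins, not the actual NeMo or TRTLLM API:

```python
# Illustrative mock of in-flight engine refitting: a prebuilt engine's
# weights are updated in place from a NeMo state dict, without a rebuild.
# Names and the weight-name mapping are hypothetical.

def convert_name(nemo_name: str) -> str:
    """Map a NeMo parameter name to the engine's naming scheme (assumed mapping)."""
    return nemo_name.replace("model.language_model.", "transformer.")

class MockEngine:
    """Stand-in for a built engine whose weights can be refitted in place."""
    def __init__(self, weights):
        self.weights = dict(weights)

    def refit(self, updated):
        for name, tensor in updated.items():
            if name not in self.weights:
                raise KeyError(f"engine has no weight named {name}")
            self.weights[name] = tensor  # swap the weight; engine structure is untouched

def refit_engine(engine, nemo_state_dict):
    """Refit an existing engine from (device-resident) NeMo weights."""
    engine.refit({convert_name(k): v for k, v in nemo_state_dict.items()})

engine = MockEngine({"transformer.layers.0.attn.qkv": [0.0]})
refit_engine(engine, {"model.language_model.layers.0.attn.qkv": [1.0]})
print(engine.weights["transformer.layers.0.attn.qkv"])  # [1.0]
```

The point of the refit path is that updated training weights can be pushed into an already-built engine, avoiding a full export-and-rebuild cycle.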
# GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR. To re-run CI, remove and re-add the label. To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".
# Before your PR is "Ready for review"

**Pre checks**:
- [ ] Make sure you read and followed the Contributor guidelines
- [ ] Did you write any new necessary tests?
- [ ] Did you add or update any necessary documentation?
- [ ] Does the PR affect components that are optional to install? (e.g. Numba, Pynini, Apex)
- [ ] Reviewer: Does the PR have correct import guards for all optional libraries?
**PR Type**:
- [ ] New Feature
- [ ] Bugfix
- [ ] Documentation
If you haven't finished some of the above items, you can still open a "Draft" PR.
# Who can review?

Anyone in the community is free to review the PR once the checks have passed. The Contributor guidelines list specific people who can review PRs in various areas.
# Additional Information
- Related to # (issue)
An OpenAI-API-compatible mode is also needed; Ollama is too weak.
In fact, in the RAGFlow system, User Setting > Model Providers > Xinference supports not only models deployed by Xinference but also any OpenAI-API-compatible model. So you can use any OpenAI-API-compatible model via Model Providers > Xinference.
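For readers unfamiliar with the term, "OpenAI-API-compatible" means any server exposing the `/v1/chat/completions` contract can be plugged in. A minimal sketch of the request shape (the base URL and model name below are placeholders, not values from this thread):

```python
import json
import urllib.request

# Any OpenAI-API-compatible server (Xinference, vLLM, etc.) accepts this
# /v1/chat/completions request shape. BASE_URL and the model name are placeholders.
BASE_URL = "http://localhost:9997/v1"  # hypothetical local endpoint

payload = {
    "model": "my-local-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer not-needed",  # many local servers ignore the key
    },
)
# urllib.request.urlopen(req) would send it; omitted here since no server is running.
print(payload["model"])  # my-local-model
```

Because the wire format is identical across such servers, only the base URL and model name need to change when switching providers.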
This should be added in the README or the documentation somewhere, that's a big deal!
@edisonzf2020 Thanks for your suggestion, and apologies for the late reply! ⏳🙏
Our product now supports OpenAI-compatible API access for other LLMs, and this is also documented in our technical guides. 📘🔌 This should help resolve your issue.
Please feel free to close this feature request. If it remains open, we’ll include it in our upcoming round of issue cleanups. 🧹
Thanks again for your helpful feedback — we truly appreciate your contributions! 💡🚀