BentoML
feature(frameworks): bentoml.onnx runner accept kwargs
What does this PR address?
Sometimes it's more natural to call an ONNX model with keyword arguments. For example, BERT's tokenizer outputs a dictionary, and the original model is called with model(**input). Our runner should simulate this behavior.
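For illustration, here is a minimal sketch (not taken from this PR's diff) of how a kwargs-style call could look once the runner forwards keyword arguments to the underlying ONNX session. The model tag `bert_onnx:latest` and the `bert-base-uncased` checkpoint are placeholders.

```python
# Hypothetical usage sketch; the model tag and checkpoint name are placeholders.
import bentoml
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Load a previously saved ONNX model and create a runner for it.
runner = bentoml.onnx.get("bert_onnx:latest").to_runner()
runner.init_local()  # initialize locally, e.g. for testing outside a Service

# The tokenizer returns a dict (input_ids, attention_mask, ...), so it is
# natural to unpack it as keyword arguments, mirroring model(**input).
inputs = tokenizer("Hello, BentoML!", return_tensors="np")
outputs = runner.run.run(**inputs)
```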
Fixes #(issue)
Before submitting:
- [ ] Does the Pull Request follow the Conventional Commits specification for naming? Here is GitHub's guide on how to create a pull request.
- [ ] Does the code follow BentoML's code style, and have both `make format` and `make lint` scripts passed (instructions)?
- [ ] Did you read through the contribution guidelines and follow the development guidelines?
- [ ] Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
- [ ] Did you write tests to cover your changes?
Codecov Report
Merging #3561 (607ec46) into main (afe9660) will not change coverage. The diff coverage is 0.00%.
@@           Coverage Diff           @@
##             main    #3561   +/-   ##
=======================================
  Coverage    0.00%    0.00%
=======================================
  Files         154      154
  Lines       12620    12632   +12
=======================================
- Misses      12620    12632   +12
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/bentoml/_internal/frameworks/onnx.py | 0.00% <0.00%> (ø) | |
cc @larme, I ran the test locally and it passed, but it still didn't run on CI. Can you fix that as well?
@aarnphm The tests now run on CI. The problem is that the new BERT tests are missing the PyTorch dependency (I thought installing transformers would install torch by default; I was wrong).
@larme @aarnphm what's the status of this one?
This has been on the list for a while now @larme, we should circle back and get this one merged.