Chaoyu

65 comments of Chaoyu

Hi @xytsinghua - you can use the `include` field in `bentofile.yaml` to package your model file into the Bento. See https://docs.bentoml.com/en/latest/guides/build-options.html#include
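
For illustration, a minimal `bentofile.yaml` along these lines might look like the sketch below; the service entry point and model path are placeholders, not your actual project layout:

```yaml
service: "service:svc"        # hypothetical service entry point
include:
  - "*.py"                    # service source files
  - "models/my_model.pkl"     # hypothetical model file to package into the Bento
```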

@pzz2011 https://github.com/tensorflow/tensorflow/tree/master/tensorflow/stream_executor

Tried the following workaround and it seems to work correctly, although I haven't tested it extensively: https://github.com/protocolbuffers/protobuf/compare/master...parano:js-fromobject-hack?expand=1

First, add the following line to `js_generator.cc`:

```
diff --git a/src/google/protobuf/compiler/js/js_generator.cc b/src/google/protobuf/compiler/js/js_generator.cc
index...
```

You can use BentoML for those models; it allows users to build custom, multi-modal APIs. It's possible that OpenLLM will support those models in the future, depending on community demand.

Hi @hugocool - could you share your `bentofile.yaml`? Did you include the `./config/default.yaml` file in the `include` section?
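
If it's missing, adding it to `include` should look roughly like this (the service entry point here is a placeholder):

```yaml
service: "service:svc"          # hypothetical service entry point
include:
  - "*.py"
  - "./config/default.yaml"     # ensure the config file is packaged into the Bento
```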