
It's not clear how to start an inference server via API

Status: Open · aliasaria opened this issue on Feb 26, 2024 · 0 comments

We have a /worker/start endpoint in the API, but it doesn't let you specify the inference engine or any inference parameters.
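For reference, here is a minimal sketch of what a richer /worker/start call could look like. The `inference_engine` and `inference_params` fields (and the local API address) are assumptions for illustration, not part of the current endpoint:

```python
import requests

# Assumed local address of the Transformer Lab API server.
API_BASE = "http://localhost:8000"

payload = {
    # Example model identifier; see the naming question below.
    "model_filename": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    # Hypothetical fields: not supported by the current endpoint.
    "inference_engine": "vllm",
    "inference_params": {
        "temperature": 0.7,
        "max_tokens": 512,
    },
}

# Sketch of starting an inference worker with explicit engine/params.
response = requests.post(f"{API_BASE}/worker/start", json=payload)
print(response.status_code, response.json())
```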

It's also not clear what "model_filename" refers to in the app, since the API refers to a model's unique id in several different ways (uniqueId, filename, huggingface_id).
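To illustrate the ambiguity, the same model can end up referenced under different keys in different parts of the API (the values below are purely illustrative):

```python
# Illustrative only: three keys that may all point at the same model.
model_refs = {
    "uniqueId": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    "filename": "TinyLlama-1.1B-Chat-v1.0",
    "huggingface_id": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
}
```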
