
Using with a custom model not on HuggingFace

Open · coogle opened this issue 7 months ago · 11 comments

Your question

This is an awesome project, kudos to you. I hope you can help me use it!

I've created a custom model for multi-label text classification that I've exported as an .onnx file (it was originally built in sklearn). I've tried to load this file several different ways, but so far without success.

Can you provide some guidance on how one might use a custom model? I can't find much documentation on how to put together the config.json and other files needed here, and most of the documentation right now seems geared toward models published on HuggingFace rather than proprietary custom models.

The end goal: I have a bunch of training data that maps natural-language text to zero or more custom (known) tags, and I'd like to use this model in a PHP request to generate tags from input text.
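For reference, my working assumption (borrowed from Hugging Face conventions, so the exact fields may be off) is that the library expects a config.json next to the .onnx file whose id2label/label2id fields map the model's output indices to label names, roughly:

```json
{
  "id2label": { "0": "tag_a", "1": "tag_b" },
  "label2id": { "tag_a": 0, "tag_b": 1 }
}
```

The tag names here are placeholders for my own custom tags.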

Context (optional)


from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# clr (the multi-label classifier) and cv_clean (the fitted text
# vectorizer), along with X_train_clean and y_train, are defined earlier.
model = clr.fit(
    cv_clean.transform(X_train_clean),
    y_train
)

# One float input column per vectorizer feature; None allows a dynamic batch size.
initial_type = [('float_input', FloatTensorType([None, cv_clean.transform(X_train_clean).shape[1]]))]

# zipmap=False keeps the probability output as a plain tensor rather than a list of maps.
onnx_model = convert_sklearn(model, initial_types=initial_type, target_opset=20, options={type(clr.estimator): {'zipmap': False}})

onnx_model_path = "model_quantized.onnx"
with open(onnx_model_path, "wb") as f:
    f.write(onnx_model.SerializeToString())

Reference (optional)

No response

coogle · Jul 13 '24 00:07