transformers.js
Why do certain models not load?
Question
I was keen to try:
https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0
I tried:
import {
  AutoModelForCausalLM,
  AutoTokenizer,
} from '@xenova/transformers';

const autoTokenizer = await AutoTokenizer.from_pretrained(
  'Upstage/SOLAR-10.7B-Instruct-v1.0',
);
const model = await AutoModelForCausalLM.from_pretrained(
  'Upstage/SOLAR-10.7B-Instruct-v1.0',
);
But it fails with an error:
Error: Could not locate file: "https://huggingface.co/Upstage/SOLAR-10.7B-Instruct-v1.0/resolve/main/onnx/decoder_model_merged_quantized.onnx".
Is this an error on my side, is the model incompatible, or something else?
You should convert the model to ONNX.
https://github.com/xenova/transformers.js?tab=readme-ov-file#convert-your-models-to-onnx
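For reference, the README's conversion flow (run from a clone of the transformers.js repo) looks roughly like this; the model id below is the one from the question, though as noted in the next reply, the result would still be too large for the library:

# Clone the repo and install the conversion script's dependencies
git clone https://github.com/xenova/transformers.js.git
cd transformers.js
pip install -r scripts/requirements.txt

# Convert the model to ONNX, with quantization
python -m scripts.convert --quantize --model_id Upstage/SOLAR-10.7B-Instruct-v1.0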
Hi there 👋 As @hans00 pointed out, transformers.js requires models to be converted to ONNX before they can be used. Another complication, however, is the choice of model: Upstage/SOLAR-10.7B-Instruct-v1.0 is a 10.7B-parameter model, which is too large to run with the library (at least for now).
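If the goal is just to run a causal language model with transformers.js, a smaller, already-converted checkpoint works out of the box. A minimal sketch, assuming Xenova/distilgpt2 (one of the pre-converted models on the Hub), using the high-level pipeline API:

import { pipeline } from '@xenova/transformers';

// Build a text-generation pipeline; this downloads the ONNX weights
// (quantized by default) on first use, then caches them.
const generator = await pipeline('text-generation', 'Xenova/distilgpt2');

// Generate a short continuation of a prompt.
const output = await generator('I was keen to try', { max_new_tokens: 20 });
console.log(output);
// e.g. [{ generated_text: 'I was keen to try ...' }]

The pipeline wraps the same AutoTokenizer/AutoModelForCausalLM pair used in the question, so the loading code there would also work once it points at a model that ships ONNX weights of a runnable size.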