
Why do certain models not load?

Open adaboese opened this issue 2 years ago • 2 comments

Question

I was keen to try:

https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0

I tried:

import {
  AutoModelForCausalLM,
  AutoTokenizer,
} from '@xenova/transformers';

const autoTokenizer = await AutoTokenizer.from_pretrained(
  'Upstage/SOLAR-10.7B-Instruct-v1.0',
);

const model = await AutoModelForCausalLM.from_pretrained(
  'Upstage/SOLAR-10.7B-Instruct-v1.0',
);

But it fails with an error:

Error: Could not locate file: "https://huggingface.co/Upstage/SOLAR-10.7B-Instruct-v1.0/resolve/main/onnx/decoder_model_merged_quantized.onnx".

Is this an error on my side, is the model incompatible, or is it something else?

adaboese avatar Dec 27 '23 01:12 adaboese

You should convert the model to ONNX.

https://github.com/xenova/transformers.js?tab=readme-ov-file#convert-your-models-to-onnx

hans00 avatar Dec 27 '23 09:12 hans00
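For reference, the README linked above describes a conversion script shipped with the transformers.js repository. A minimal sketch of the workflow (assuming the repository's `scripts/convert.py` entry point and its `requirements.txt`; flags may differ between versions, so check the README for the current invocation):

```shell
# Clone the transformers.js repository, which contains the conversion script
git clone https://github.com/xenova/transformers.js.git
cd transformers.js

# Install the script's Python dependencies (optimum, onnx, etc.)
pip install -r scripts/requirements.txt

# Convert a Hugging Face model to ONNX; --quantize produces the
# *_quantized.onnx weights that transformers.js looks for by default.
# The model id here is the one from the question, used as an example.
python -m scripts.convert --quantize --model_id Upstage/SOLAR-10.7B-Instruct-v1.0
```

The converted files land in a local `models/` directory, from which they can be loaded with `from_pretrained` via a local path (note that, as pointed out below, a model of this size is unlikely to actually run in the browser).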

Hi there 👋 As @hans00 pointed out, transformers.js requires the models to be converted to ONNX in order to work. Another complication, however, is the choice of model: Upstage/SOLAR-10.7B-Instruct-v1.0 is a 10.7B parameter model, which is too large to run with the library (at least for now).

xenova avatar Jan 02 '24 21:01 xenova
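As a working alternative to the 10.7B model, a smaller model that has already been converted to ONNX can be loaded directly. A minimal sketch using the library's `pipeline` helper (assuming the pre-converted `Xenova/distilgpt2` checkpoint on the Hugging Face Hub; requires network access on first run to download the weights):

```javascript
import { pipeline } from '@xenova/transformers';

// Load a small, pre-converted text-generation model.
// The ONNX weights are fetched from the Hub and cached locally.
const generator = await pipeline('text-generation', 'Xenova/distilgpt2');

// Generate a short continuation of the prompt.
const output = await generator('Why do certain models not load?', {
  max_new_tokens: 20,
});

console.log(output);
```

Models under the `Xenova` organization on the Hub generally ship the `onnx/` subfolder that `from_pretrained` expects, which is exactly what the original error message said was missing for `Upstage/SOLAR-10.7B-Instruct-v1.0`.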