
How to load the quantized sentence transformer model?

Open vimal-quilt opened this issue 1 year ago • 2 comments

Hi, thanks for your wonderful effort in keeping this crate alive and updated. I just tested the updated version of the sentence embedding pipeline, and it works great with the standard models. However, when I try to run a quantized model (converted to the .ot format using the utils script, of course), I get the following error. Could you help me figure out how to fix this?

Error: Tch tensor error: cannot find the tensor named encoder.layer.4.attention.self.query.weight

vimal-quilt avatar Jul 25 '22 15:07 vimal-quilt

Hello @vimal-quilt ,

The conversion utilities print the converted tensors and their path as a result of the conversion. Could you please double check if the tensor was exported with the correct path (or share the output of the conversion script)?
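As an illustration of that check, the sketch below diffs a list of exported tensor names against the names the loader expects. The tensor names here are only placeholders; in practice you would paste in the list printed by the conversion script and the name from the error message:

```rust
/// Returns the expected tensor names that are absent from the exported set.
fn missing_tensors(expected: &[&str], exported: &[&str]) -> Vec<String> {
    expected
        .iter()
        .filter(|name| !exported.contains(*name))
        .map(|name| name.to_string())
        .collect()
}

fn main() {
    // Names as printed by the conversion script (illustrative subset only).
    let exported = [
        "encoder.layer.4.attention.self.query.bias",
        "embeddings.word_embeddings.weight",
    ];
    // The tensor name the Rust loader reported as missing.
    let expected = ["encoder.layer.4.attention.self.query.weight"];

    let missing = missing_tensors(&expected, &exported);
    if missing.is_empty() {
        println!("all expected tensors were exported");
    } else {
        println!("missing from the .ot file: {missing:?}");
    }
}
```

If the `tch` crate is available, `tch::Tensor::load_multi` should also be able to read the `.ot` file directly and return the stored `(name, tensor)` pairs, which avoids relying on the script's printed output.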

guillaume-be avatar Jul 31 '22 15:07 guillaume-be

Hello @guillaume-be. Thank you for your response above. I understand it's been quite some time already, but I'm facing a very similar error when trying to use a custom, non-quantized model for the QA task. I saw your response to issue #289 and followed the steps. This is my code below:

extern crate anyhow;

use rust_bert::mobilebert::{MobileBertConfigResources, MobileBertModelResources, MobileBertVocabResources};
use rust_bert::pipelines::common::{ModelResource, ModelType};
use rust_bert::pipelines::question_answering::{QaInput, QuestionAnsweringConfig, QuestionAnsweringModel};
use rust_bert::resources::RemoteResource;


fn main() -> anyhow::Result<()> {
    //    Set-up Question Answering model
    let config_resource = Box::new(RemoteResource::from_pretrained(
        MobileBertConfigResources::MOBILEBERT_UNCASED,
    ));
    let vocab_resource = Box::new(RemoteResource::from_pretrained(
        MobileBertVocabResources::MOBILEBERT_UNCASED,
    ));
    let model_resource = ModelResource::Torch(Box::new(RemoteResource::from_pretrained(
        MobileBertModelResources::MOBILEBERT_UNCASED,
    )));

    let qa_config = QuestionAnsweringConfig {
        model_type: ModelType::MobileBert,
        model_resource,
        config_resource,
        vocab_resource,
        ..Default::default()
    };

    let qa_model = QuestionAnsweringModel::new(qa_config)?;

    //    Define input
    let question_1 = String::from("Where does Amy live ?");
    let context_1 = String::from("Amy lives in Amsterdam");
    let question_2 = String::from("Where does Eric live");
    let context_2 = String::from("While Amy lives in Amsterdam, Eric is in The Hague.");
    let qa_input_1 = QaInput {
        question: question_1,
        context: context_1,
    };
    let qa_input_2 = QaInput {
        question: question_2,
        context: context_2,
    };

    //    Get answer
    let answers = qa_model.predict(&[qa_input_1, qa_input_2], 1, 32);
    println!("{answers:?}");
    Ok(())
}

and this is my error message:

Error: Tch tensor error: cannot find the tensor named qa_outputs.weight in /home/....

Can you please advise what I should do in this case? I'm pretty new to Rust and also pretty lost at the moment :D

stanislav-chekmenev avatar Feb 23 '24 20:02 stanislav-chekmenev