ChitandaErumanga

3 comments of ChitandaErumanga

When I tried to use inputs_embeds, I used both input_ids and inputs_embeds while setting

```python
transformer_outputs = self.model(
    None,  # input_ids
    attention_mask=attention_mask,
    position_ids=position_ids,
    past_key_values=past_key_values,
    inputs_embeds=inputs_embeds,
    use_cache=use_cache,
    output_attentions=output_attentions,
    output_hidden_states=output_hidden_states,
    return_dict=return_dict,
)
```

in...
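For context, a minimal self-contained sketch of calling LlamaForSequenceClassification with inputs_embeds rather than input_ids. The checkpoint path is the one used later in this thread; the tokenizer setup and example text are illustrative assumptions, not taken from the original code.

```python
import torch
from transformers import AutoTokenizer, LlamaForSequenceClassification

# Path reused from this thread; adjust to your own checkpoint.
model_path = "../Meta-Llama-3.2-1B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = LlamaForSequenceClassification.from_pretrained(
    model_path, num_labels=1, torch_dtype=torch.bfloat16
)

enc = tokenizer("an example input", return_tensors="pt")
# Build the embeddings explicitly, then hand them to the model
# and leave input_ids as None (the two arguments are mutually exclusive).
inputs_embeds = model.get_input_embeddings()(enc["input_ids"])
outputs = model(
    input_ids=None,
    inputs_embeds=inputs_embeds,
    attention_mask=enc["attention_mask"],
)
print(outputs.logits.shape)  # (1, 1) with num_labels=1
```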

> Hey! I think you are missing:
>
> ```diff
> llama_model = LlamaForSequenceClassification.from_pretrained(
>     "../Meta-Llama-3.2-1B-Instruct",
>     num_labels=1,
>     torch_dtype=torch.bfloat16,
> +   pad_token_id=12000
> )
> ```
>
> with...
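Spelled out, the suggested fix loads the model with pad_token_id set up front. The sketch below shows how that then lets a padded batch run through the classification head; the pad id 12000 is simply the value from the diff above, and the tokenizer setup and example texts are assumptions for illustration.

```python
import torch
from transformers import AutoTokenizer, LlamaForSequenceClassification

model_path = "../Meta-Llama-3.2-1B-Instruct"   # path from the thread
tokenizer = AutoTokenizer.from_pretrained(model_path)
tokenizer.pad_token_id = 12000                 # keep tokenizer and model config in sync

llama_model = LlamaForSequenceClassification.from_pretrained(
    model_path,
    num_labels=1,
    torch_dtype=torch.bfloat16,
    pad_token_id=12000,                        # set at load time, as suggested above
)

# With a pad token defined, sequences of different lengths can be batched;
# the classification head pools the hidden state of the last non-pad token.
batch = tokenizer(
    ["a short text", "a somewhat longer example text"],
    padding=True,
    return_tensors="pt",
)
print(llama_model(**batch).logits.shape)       # (2, 1) with num_labels=1
```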

> but the embedding matrix is initialized before you set the config.pad_token_id

Thank you so much, this really solved the problem with my code.
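To make the initialization point concrete: in current transformers, LlamaModel builds its nn.Embedding with padding_idx taken from config.pad_token_id at construction time, so assigning the config attribute after from_pretrained comes too late for that layer. A rough sketch of the difference, assuming the usual `.model.embed_tokens` layout of LlamaForSequenceClassification:

```python
import torch
from transformers import LlamaForSequenceClassification

model_path = "../Meta-Llama-3.2-1B-Instruct"  # path from the thread

# Setting the config after loading does not reach the already-built embedding layer.
late = LlamaForSequenceClassification.from_pretrained(
    model_path, num_labels=1, torch_dtype=torch.bfloat16
)
late.config.pad_token_id = 12000
print(late.model.embed_tokens.padding_idx)    # still None

# Passing pad_token_id to from_pretrained puts it in the config before
# the embedding matrix is constructed.
early = LlamaForSequenceClassification.from_pretrained(
    model_path, num_labels=1, torch_dtype=torch.bfloat16, pad_token_id=12000
)
print(early.model.embed_tokens.padding_idx)   # 12000
```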