Macaw-LLM
How to get the Whisper, CLIP, and LLaMA models used by Macaw?
I used the following code to get the pretrained models:
# Download the three pretrained backbones from the Hugging Face Hub
from transformers import CLIPModel, LlamaModel, WhisperForConditionalGeneration

clip_model = CLIPModel.from_pretrained("openai/clip-vit-base-patch16")
whisper_model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-base")
llama7b_model = LlamaModel.from_pretrained("decapoda-research/llama-7b-hf")

# Save local copies under trained_models/
clip_model.save_pretrained('trained_models/clip_model/')
whisper_model.save_pretrained('trained_models/whisper_model/')
llama7b_model.save_pretrained('trained_models/llama7b_model/')
Is this correct?
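As a quick sanity check (just a sketch, not code from the Macaw-LLM repo), the saved copies can be reloaded from the local directories above and their key dimensions printed:

# Reload each saved checkpoint from disk; paths match the snippet above.
from transformers import CLIPModel, LlamaModel, WhisperForConditionalGeneration

clip_model = CLIPModel.from_pretrained('trained_models/clip_model/')
whisper_model = WhisperForConditionalGeneration.from_pretrained('trained_models/whisper_model/')
llama7b_model = LlamaModel.from_pretrained('trained_models/llama7b_model/')

# Print the embedding sizes to confirm the expected checkpoints were saved.
print(clip_model.config.vision_config.hidden_size)  # 768 for clip-vit-base-patch16
print(whisper_model.config.d_model)                  # 512 for whisper-base
print(llama7b_model.config.hidden_size)              # 4096 for llama-7b

If those load without errors, the save step itself worked; whether these are the exact checkpoints Macaw expects is a separate question for the authors.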
I also want to know how to get the Whisper, CLIP, and LLaMA models used by Macaw.
Hello, have you been able to run this code? I encountered the following error:

assert self.head_dim * num_heads == self.embed_dim, "embed_dim must be divisible by num_heads"
AssertionError: embed_dim must be divisible by num_heads
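In case it helps: that assertion normally comes from an attention layer whose configured number of heads does not evenly divide the embedding size, so it points at a config mismatch rather than the download itself. A minimal, unofficial way to see which combination breaks is to print the standard Hugging Face config fields for the three backbones (model IDs taken from the snippet above):

# Check embed_dim % num_heads for each backbone using only the configs.
from transformers import AutoConfig

clip_cfg = AutoConfig.from_pretrained("openai/clip-vit-base-patch16")
whisper_cfg = AutoConfig.from_pretrained("openai/whisper-base")
llama_cfg = AutoConfig.from_pretrained("decapoda-research/llama-7b-hf")

checks = {
    "clip.vision": (clip_cfg.vision_config.hidden_size, clip_cfg.vision_config.num_attention_heads),
    "clip.text": (clip_cfg.text_config.hidden_size, clip_cfg.text_config.num_attention_heads),
    "whisper.encoder": (whisper_cfg.d_model, whisper_cfg.encoder_attention_heads),
    "llama-7b": (llama_cfg.hidden_size, llama_cfg.num_attention_heads),
}
for name, (embed_dim, num_heads) in checks.items():
    print(name, embed_dim, num_heads, "ok" if embed_dim % num_heads == 0 else "NOT divisible")

All four pairs divide evenly for the stock checkpoints, so if the assertion still fires, the offending num_heads is most likely being overridden somewhere in the Macaw-LLM training arguments rather than coming from these configs.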
Hi,
Have you managed to run Macaw-LLM?
I have the same issue. I found that in run_clm_lls.py, attention_heads defaults to 220. How did you solve it?
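I can't confirm where that default comes from, but since the assertion only requires embed_dim to be divisible by num_heads, one generic workaround is to pick a head count that evenly divides the hidden size you are using. A small sketch (the dimensions below are just the stock sizes of the three backbones, not values read from run_clm_lls.py):

# List head counts up to max_heads that evenly divide a given embedding size.
def valid_head_counts(embed_dim, max_heads=64):
    return [h for h in range(1, max_heads + 1) if embed_dim % h == 0]

print(valid_head_counts(4096))  # llama-7b hidden_size
print(valid_head_counts(768))   # clip-vit-base-patch16 vision hidden_size
print(valid_head_counts(512))   # whisper-base d_model

Note that none of those dimensions is divisible by 220, which would explain the AssertionError if that value really ends up being used as the head count.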
Me too. Any updates?