SpeechGPT
Performance on smaller models?
Hi,
I am trying to build a multilingual version of your model. Instead of using Llama-7B, I trained a much smaller Llama variant with 1.1B parameters on over 100k hours of German and English audio data, but it does not perform well in a multilingual setting, especially in German.
My question: could this be because the LLM has far fewer parameters than Llama-7B?
Thank You