Please add InkubaLM
InkubaLM was trained from scratch on 1.9 billion tokens of data covering five African languages, along with English and French data, for a total of 2.4 billion tokens. It would be an amazing model to have on GPT4All.
You can find model details here: InkubaLM
You can find the blog post here: InkubaLM Blogpost
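
In the meantime, a minimal sketch for anyone who wants to try the model with Hugging Face `transformers` while GPT4All support is pending. The checkpoint ID `lelapa/InkubaLM-0.4B` and the `trust_remote_code` flag are assumptions based on the public Hugging Face release; GPT4All itself would likely need a GGUF conversion of the weights for its llama.cpp backend.

```python
# Sketch: trying InkubaLM via Hugging Face transformers.
# Assumption: the public checkpoint is published as "lelapa/InkubaLM-0.4B"
# and may ship custom modeling code (hence trust_remote_code=True).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lelapa/InkubaLM-0.4B"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Simple greedy generation from a short prompt.
inputs = tokenizer("Habari za leo ni ", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```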