Training LLaMa?
In theory, the LLaMa 30B and 65B models should be much more capable than GPT-NeoX-20B.
Does OpenChatKit support LLaMa? If not, is it on the roadmap?
I appreciate that togethercomputer might not be able to release pretrained LLaMa weights due to the licence, but it'd be great if researchers could at least play with it.
same question
same request here
OpenChatKit is a properly free, open-source model under a permissive licence.
Meta's LLaMa is none of those things, and anything built on top of it won't be either.
I don't see how any combination of the two models would be helpful.