
Training LLaMa?

Open wywywywy opened this issue 2 years ago • 3 comments

In theory, the LLaMa 30b & 65b models should be much more capable than GPT-NeoX 20b.

Does OpenChatKit support LLaMa? If not, is it on the roadmap?

I appreciate that togethercomputer might not be able to release pretrained LLaMa weights due to the licence, but it'd be great if researchers could at least play with it.

wywywywy avatar Mar 14 '23 18:03 wywywywy

same question

hujunchao avatar Mar 15 '23 08:03 hujunchao

same request here

taomanwai avatar Mar 16 '23 13:03 taomanwai

OpenChatKit is a properly free, open-source model under a permissive licence.

Meta's LLaMa is none of those things, and anything built on top of it won't be either.

I don't see what kind of interaction between those two models would be helpful.

itsnotlupus avatar Mar 17 '23 03:03 itsnotlupus