LLMFarm
Spews complete nonsense after any prompt
I tried a StableLM model with llama inference. This is the answer to "hi!":
```
{<E> ========== [EXPL] | 1) {prompt}
-
- (1, "2") [/EXPL]}; template <class E, class S> struct test_pair { typedef pair<E,S> type; };
int main(void) {
#define PRINT0(x) cout << #x ": " << x << endl
//PRINTTEST
test_pair<short int, string> a;
cout << "sizeof(std::pair <char, std::basic_string
```
I have not found any way to get output that isn't total nonsense.