dr-baem


I experienced the same issue: https://github.com/go-skynet/LocalAI/issues/1114

yourtiger seems to have found a valid solution. Could someone open a pull/merge request so the issue gets fixed permanently?

The llama model whose embeddings we tried to use is: llama-2-7b-chat.ggmlv3.q4_K_M.bin

This gives me some idea of how to use MaxReceiveMessageLength. For example, one option could be to set something like `s.MaxReceiveMessageLength = 100000000` around line 176 of https://github.com/go-skynet/LocalAI/blob/master/pkg/grpc/server.go, but I'm not sure that is the right place.
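For concreteness, here is a minimal sketch of how the limit could be raised in grpc-go, where the receive limit is normally passed as a server option (`grpc.MaxRecvMsgSize`) when the server is constructed rather than assigned as a field. The listener address, the 100000000-byte value, and the service registration are placeholders; how this maps onto line 176 of server.go would need to be checked against the actual code.

```go
package main

import (
	"log"
	"net"

	"google.golang.org/grpc"
)

func main() {
	// Placeholder address; the real server in pkg/grpc/server.go
	// takes its address from LocalAI's configuration.
	lis, err := net.Listen("tcp", "127.0.0.1:50051")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	// Raise the default 4 MiB receive limit so large embedding requests
	// are not rejected. 100000000 bytes (~100 MB) mirrors the value
	// suggested above; choose a limit that matches your payloads.
	s := grpc.NewServer(
		grpc.MaxRecvMsgSize(100000000),
		grpc.MaxSendMsgSize(100000000),
	)

	// Register the backend service(s) here before serving
	// (omitted; depends on the LocalAI backend in question).

	if err := s.Serve(lis); err != nil {
		log.Fatalf("failed to serve: %v", err)
	}
}
```

If the client side also rejects large responses, the matching dial options (`grpc.WithDefaultCallOptions(grpc.MaxCallRecvMsgSize(...))`) would likely need the same treatment, but I have not verified where LocalAI creates its client connections.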