
Results 43 comments of beiller

Hey, Build.sh was meant to be run in the Docker container, but I guess you can also just run it directly. I saw this error before as well and I cannot remember...

Wow @evandarcy, you need to give us a pull request with your bullet3 work! Edit: any problems you ran into compiling with bullet3, or is it just drop-in?

It's unable to support emojis without the Unicode support fix. I have a branch, or we can wait until more work is done here. PR: https://github.com/ggerganov/llama.cpp/pull/66 Branch: https://github.com/beiller/llama.cpp/tree/feature/tokenization

This is also not working for me in RequireJS... the old version worked... `define(['lib/state-machine', 'lib/three'], function(StateMachine, THREE) { // StateMachine is undefined`. It appears to have to do with whatever shim,...
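The shim is the likely culprit: with RequireJS, scripts that do not call `define()` themselves need a `shim` entry naming the global they export, or the module argument comes through as `undefined`. A minimal sketch (the paths and the `StateMachine` global name are assumptions based on the snippet above, not a verified fix):

```javascript
// Hypothetical require.js configuration. Assumes lib/state-machine.js is a
// non-AMD script that attaches a global `StateMachine` when loaded, and that
// lib/three.js likewise attaches a global `THREE` (newer THREE builds are
// AMD-aware and may not need a shim at all).
requirejs.config({
  paths: {
    'lib/state-machine': 'lib/state-machine',
    'lib/three': 'lib/three'
  },
  shim: {
    // Tell require.js which global each non-AMD script provides, so the
    // module value passed to the define() callback is not undefined.
    'lib/state-machine': { exports: 'StateMachine' },
    'lib/three': { exports: 'THREE' }
  }
});
```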

I tried to work out how to implement Unicode support and I am not getting far. It seems to work from everything I am seeing, but yes, the output has random characters...

I found a list of unprintable tokens from ID 131 to 258. If I remove those from the vocab, a prompt can generate in Japanese, it seems, but I don't know...
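As an illustration of the idea, here is a small Python sketch that drops a token-ID range from a vocab mapping before use. The `vocab` dict and the inclusive 131-258 range are taken from the comment above; this is not llama.cpp's actual data structure:

```python
# Sketch: remove the unprintable byte-fallback tokens (IDs 131-258 per the
# comment above) from a hypothetical id -> token-string mapping.
UNPRINTABLE_IDS = range(131, 259)  # 131..258 inclusive

def strip_unprintable(vocab):
    """Return a copy of vocab with the unprintable ID range removed."""
    return {tok_id: tok for tok_id, tok in vocab.items()
            if tok_id not in UNPRINTABLE_IDS}

# Toy example: 131 and 258 fall in the range and are dropped.
vocab = {130: "a", 131: "\ufffd", 258: "\ufffd", 259: "b"}
clean = strip_unprintable(vocab)
```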

I removed "", "�", "��" from the grammar, not from a sentence that's not how it works. There is a large chunk of the "token dictionary" in the model that...

For anyone interested, here is the chunk in the 13B model file. I am not sure if all models contain the same token grammars: ``` � vocab[131] (EFBFBD) � vocab[132] (EFBFBD) �...
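Those `�` entries are U+FFFD, the Unicode replacement character, whose UTF-8 encoding is the bytes EF BF BD shown in the dump. A quick Python sketch for spotting such entries in a hypothetical list of token strings:

```python
# Sketch: find vocab entries containing U+FFFD (UTF-8 bytes EF BF BD),
# matching the hex shown in the dump above. `vocab` here is a hypothetical
# list of token strings, not llama.cpp's actual structure.
vocab = ["hello", "\ufffd", "world", "\ufffd\ufffd"]

# "\ufffd".encode("utf-8") == b"\xef\xbf\xbd", the bytes in the dump.
bad_ids = [i for i, tok in enumerate(vocab) if "\ufffd" in tok]
```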

I'm not sure you have applied the code change. I cannot try your prompt since it's an image; mind pasting it? But I think you have to check out from my repo...

Here are some more examples: `./main -m ./models/13B/ggml-model-q4_0.bin -t 8 -p $'人生の意味は'` Outputs: ``` ... main: prompt: '人生の意味は' main: number of tokens in prompt = 5 1 -> '' 30313...
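The empty-looking tokens in that output are consistent with the tokenizer splitting multi-byte UTF-8 characters into single bytes, which are not printable on their own until the bytes are merged back together. A quick check of the prompt's byte layout in Python, for illustration only:

```python
# Each character of the Japanese prompt is 3 bytes in UTF-8, so a byte-level
# tokenizer can emit tokens that are not valid UTF-8 in isolation and render
# as empty or replacement characters when printed one at a time.
prompt = "人生の意味は"
encoded = prompt.encode("utf-8")
print(len(prompt))    # 6 characters
print(len(encoded))   # 18 bytes (3 bytes per character)
print(encoded[:3])    # b'\xe4\xba\xba' -- the bytes of '人'
```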