Matthew McAllister

Results: 14 comments by Matthew McAllister

I wouldn't mind looking into it and opening a PR. I'll have to start with some more research. Eventually someone will likely rely on such a feature in production, and...

@bengarney New lines do add tokens. ~@j-f1 Spaces do not add tokens.~ The one limitation I see here is that you cannot intentionally add trailing newlines at the end of...

> I don't think this is a good change. Machine Learning models should not teach people how to use text editors.
>
> What if I want to keep whitespace...

Addressed the comments and also added a delete button, since the lack of one annoyed me. Character loading no longer supports instruct mode. Should that be implemented in this...

Huh... I thought the context size was determined when the model was trained due to the positional encoding used. (I am only a layman.) But https://github.com/ggerganov/llama.cpp/pull/78 is still useful for...
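For background on why the context length is not strictly baked in at training time: the classic fixed sinusoidal encoding is a closed-form function of the position, so it can be evaluated at positions the model never saw during training (output quality past the trained length is a separate question, and LLaMA itself uses rotary embeddings rather than this exact scheme). A minimal sketch, with the hypothetical helper `sinusoidal_position`:

```python
import math

def sinusoidal_position(pos, d_model):
    """Classic fixed sinusoidal encoding for a single position.

    Interleaves sin/cos at geometrically spaced frequencies. Because it
    is a closed-form function of `pos`, it can be evaluated at positions
    beyond any length used during training.
    """
    enc = []
    for i in range(0, d_model, 2):
        freq = 1.0 / (10000 ** (i / d_model))
        enc.append(math.sin(pos * freq))
        enc.append(math.cos(pos * freq))
    return enc[:d_model]

# The formula evaluates fine well past a "trained" length like 512:
vec = sinusoidal_position(2048, 64)
```

Whether attention still behaves well at those extended positions is an empirical matter, which is where changes like the linked PR come in.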

> Can you give a minimal repro scenario?

My bad, the repro was in the linked issue. I've updated the issue description.

> If this is only happening in Python...

@SuperCipher I just put it on my "projects" list to do down the line when I have some time and I'm not working on something else, since it's not really...

Basically, this fails if I increase n_ctx beyond the default 512, which I can tell isn't fully supported. I increased the mem_size allocated by ggml by adding to ctx_size, but...
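One reason the allocation has to grow with n_ctx is the key/value cache: one key and one value vector of the embedding width is stored per layer per context position, so that part of the memory scales linearly with context length. A back-of-the-envelope estimate (the helper `kv_cache_bytes` and the 7B-class shape numbers are illustrative assumptions, not a ggml API):

```python
def kv_cache_bytes(n_ctx, n_layer, n_embd, bytes_per_elem=2):
    """Rough size of a transformer K/V cache.

    2 tensors (K and V) * n_layer layers * n_ctx positions * n_embd
    floats each; bytes_per_elem=2 assumes fp16 storage.
    """
    return 2 * n_layer * n_ctx * n_embd * bytes_per_elem

# Assumed 7B-class shape: n_layer=32, n_embd=4096, fp16 elements.
default = kv_cache_bytes(512, 32, 4096)   # ~256 MiB at n_ctx = 512
doubled = kv_cache_bytes(1024, 32, 4096)  # doubles when n_ctx doubles
```

So doubling n_ctx roughly doubles this component of the allocation, which is why a fixed mem_size sized for 512 tokens runs out.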

~It still segfaults after 512 tokens.~ EDIT: Hold on, I might be mistaken. I haven't finished converting all the tensors yet.

OK, this works now. Fantastic, thanks for the update!