parserllm
Use context-free grammars with an LLM
Tried `examples/example.py` with a tokenizer derived from a `dict[str, int]`:

```python
from tokenizers import Tokenizer
from tokenizers.models import WordLevel

tokenizer = WordLevel(Tokenizer(str_to_int_dict))
tokenizer.eos_token_id = '\n'
```

Stack trace:

```shell
Traceback...
```
Minimal example:

```python
grammar = '''start: expression
expression: expression " " expression | terminal
terminal: "token1" | "token2"
'''.strip()

parser = Lark(grammar, parser='lalr')
p = ParserState(parser)
```

Now, this works...
Hey, I was trying to run quantized models using AutoGPTQ or ExLlama, but it did not work at all. So I am asking whether you are thinking about creating a...
I didn't want to saddle [ReLLM](https://github.com/r2d4/rellm) with the Lark dependency, and I didn't want to deal with monorepo issues. Any suggestions?