
Requesting model `orca-2-7b` raises `llm.UnknownModelError`

Open caufieldjh opened this issue 4 months ago • 5 comments

Issue c/o Bart Kleijngeld:

Using the `llm` package to call `orca-2-7b` directly works as expected:

$ llm -m orca-2-7b "Hi!"
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3.83G/3.83G [05:58<00:00, 10.7MiB/s]
Hello! How can I help you today?
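
For reference, the same lookup can also be exercised through `llm`'s Python API, which is the call `ontogpt` makes internally (see the traceback below). A minimal sketch, assuming `llm`'s documented `get_model()`/`prompt()` interface:

import llm

# Resolve the model by id or alias; this is the same call that fails
# later inside ontogpt's GPT4AllClient.
model = llm.get_model("orca-2-7b")

# Prompt it the same way the CLI invocation above does.
response = model.prompt("Hi!")
print(response.text())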

But attempting an extraction with `ontogpt` fails:

$ ontogpt extract -t drug -i example.txt -m ORCA_2_7B
Traceback (most recent call last):
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/llm/__init__.py", line 148, in get_model
    return aliases[name]
KeyError: 'orca-2-7b'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/harry/ontogpt/.venv/bin/ontogpt", line 6, in <module>
    sys.exit(main())
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/harry/ontogpt/src/ontogpt/cli.py", line 324, in extract
    ke = SPIRESEngine(
  File "<string>", line 23, in __init__
  File "/home/harry/ontogpt/src/ontogpt/engines/knowledge_engine.py", line 184, in __post_init__
    self.set_up_client(model_source=self.model_source)
  File "/home/harry/ontogpt/src/ontogpt/engines/knowledge_engine.py", line 603, in set_up_client
    self.client = GPT4AllClient(model=self.model)
  File "<string>", line 8, in __init__
  File "/home/harry/ontogpt/src/ontogpt/clients/gpt4all_client.py", line 37, in __post_init__
    self.local_model = llm.get_model(self.model)
  File "/home/harry/ontogpt/.venv/lib/python3.10/site-packages/llm/__init__.py", line 150, in get_model
    raise UnknownModelError("Unknown model: " + name)
llm.UnknownModelError: 'Unknown model: orca-2-7b'
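
The exception comes from the `llm.get_model()` call in `gpt4all_client.py`, and the `KeyError` shows that ontogpt has already translated `ORCA_2_7B` to `orca-2-7b` by that point, so the name mapping itself appears to work; the lookup inside `llm` is what fails. It can be narrowed down without ontogpt by listing the names `llm` can resolve from inside the ontogpt virtualenv. A minimal sketch, assuming `llm.get_model_aliases()` (the lookup table `get_model()` consults, per the traceback) is public API:

import llm

# Print every model id and alias registered by the plugins visible to
# this interpreter; if "orca-2-7b" is missing, the llm-gpt4all models
# are not being registered in this environment.
for name in sorted(llm.get_model_aliases()):
    print(name)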

The same version of `llm` is installed in both cases:

$ llm --version
llm, version 0.12
$ poetry show | grep llm
llm                               0.12            A CLI utility and Python ...
llm-gpt4all                       0.2             Plugin for LLM adding sup...
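
Since both environments report the same `llm` and `llm-gpt4all` versions, it may be worth confirming that the plugin is actually discovered by the interpreter the ontogpt virtualenv uses (for example by running `llm plugins` from that environment). A minimal sketch of the same check in Python, assuming `llm` discovers plugins through the `llm` setuptools entry point group, as its pluggy setup suggests:

from importlib.metadata import entry_points

# List the entry points llm would load as plugins (assumed group "llm");
# if llm-gpt4all does not appear here, it is installed but not visible
# to this interpreter.
eps = entry_points()
group = eps.select(group="llm") if hasattr(eps, "select") else eps.get("llm", [])
for ep in group:
    print(ep.name, "->", ep.value)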

caufieldjh · Feb 26 '24 21:02