
Changes to the Ollama conversation and LLM helper to facilitate a sane user-defined LLM prompt and exposed_entities

Open · josh-blake opened this pull request 1 year ago • 2 comments

Breaking change

Changes exposed_entities “names” to “aliases”. Removes the join of aliases and areas from llm.py, allowing the user to perform this join in the template if required, or otherwise refer to the aliases as a dict.
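As a rough sketch of the shape of this change (the exact entity fields below are illustrative, not Home Assistant's confirmed schema): previously llm.py pre-joined aliases and areas into a single string; afterwards each exposed entity keeps its aliases as a list, and a user who wants the joined form reproduces it themselves in the template.

```python
# Hypothetical entry under the old behaviour: aliases pre-joined into "names".
old_entity = {"names": "Ceiling Light, Main Light"}

# Hypothetical entry under the new behaviour: aliases kept structured in a
# dict, so templates can iterate over them or join them as needed.
new_entity = {"name": "Ceiling Light", "aliases": ["Main Light", "Big Light"]}

# A user who still wants the old joined string can build it explicitly:
joined = ", ".join([new_entity["name"], *new_entity["aliases"]])
print(joined)  # Ceiling Light, Main Light, Big Light
```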

Proposed change

Removes the extra boilerplate that was appended to the prompt supplied to the LLM, restricting the prompt to only what the user supplies in the config flow. Also adds exposed_entities via the LLM APIInstance object so that exposed_entities can be referenced in the prompt template, and enables Ollama prompt template parsing.
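A user-defined prompt template could then consume exposed_entities along these lines (a sketch only; the field names on each entity are assumptions for illustration, not the integration's confirmed schema):

```jinja
You are a voice assistant for Home Assistant.
Available devices:
{% for entity in exposed_entities %}
- {{ entity.name }}{% if entity.aliases %} (aliases: {{ entity.aliases | join(", ") }}){% endif %}
{% endfor %}
```

The point of the change is that the join shown here is opt-in: users who want raw structured aliases simply skip the `join` filter.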

Type of change

  • [ ] Dependency upgrade
  • [X] Bugfix (non-breaking change which fixes an issue)
  • [ ] New integration (thank you!)
  • [X] New feature (which adds functionality to an existing integration)
  • [ ] Deprecation (breaking change to happen in the future)
  • [ ] Breaking change (fix/feature causing existing functionality to break)
  • [ ] Code quality improvements to existing code or addition of tests

Additional information

  • This PR fixes or closes issue: fixes #126109
  • This PR is related to issue: #126109
  • Link to documentation pull request:

Checklist

  • [X] The code change is tested and works locally.
  • [X] Local tests pass. Your PR cannot be merged unless tests pass
  • [ ] There is no commented out code in this PR.
  • [X] I have followed the development checklist
  • [X] I have followed the perfect PR recommendations
  • [ ] The code has been formatted using Ruff (ruff format homeassistant tests)
  • [X] Tests have been added to verify that the new code works.

If user exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:

  • [ ] The manifest file has all fields filled out correctly.
    Updated and included derived files by running: python3 -m script.hassfest.
  • [ ] New or updated dependencies have been added to requirements_all.txt.
    Updated by running python3 -m script.gen_requirements_all.
  • [ ] For the updated dependencies - a link to the changelog, or at minimum a diff between library versions is added to the PR description.

To help with the load of incoming pull requests:

— josh-blake, Sep 18 '24 00:09

Please take a look at the requested changes, and use the Ready for review button when you are done, thanks :+1:

Learn more about our pull request process.

— home-assistant[bot], Sep 18 '24 00:09

Hey there @synesthesiam, mind taking a look at this pull request as it has been labeled with an integration (ollama) you are listed as a code owner for? Thanks!

Code owner commands

Code owners of ollama can trigger bot actions by commenting:

  • @home-assistant close Closes the pull request.
  • @home-assistant rename Awesome new title Renames the pull request.
  • @home-assistant reopen Reopen the pull request.
  • @home-assistant unassign ollama Removes the current integration label and assignees on the pull request, add the integration domain after the command.
  • @home-assistant add-label needs-more-information Add a label (needs-more-information, problem in dependency, problem in custom component) to the pull request.
  • @home-assistant remove-label needs-more-information Remove a label (needs-more-information, problem in dependency, problem in custom component) on the pull request.

— home-assistant[bot], Sep 18 '24 00:09

If people don't want the LLM API prompt to be added, they can unselect the option allowing their LLM to access the API.

— balloob, Nov 11 '24 01:11