llama_index
[Bug]: Code in Guidance Pydantic Program doc not working
Bug Description
Simply following the code in https://docs.llamaindex.ai/en/stable/examples/output_parsing/guidance_pydantic_program/ yields:

`ModuleNotFoundError: No module named 'guidance.llms'`
Version
llama-index = "^0.10.35"
Steps to Reproduce
Follow the linked doc in your own project.
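For context, a minimal reproduction looks like the snippet below. The failing statement is the import of the guidance OpenAI wrapper from `guidance.llms` (whether it sits in a notebook cell or inside the packaged `base.py`, that module path no longer exists on current `guidance` releases); the surrounding code is a sketch of the notebook, not an exact copy.

```python
# The guidance OpenAI wrapper used to live in guidance.llms; on current
# guidance releases this module is gone, so the next line raises
# ModuleNotFoundError: No module named 'guidance.llms'.
from guidance.llms import OpenAI as GuidanceOpenAI

from llama_index.program.guidance import GuidancePydanticProgram
```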
Relevant Logs/Tracebacks
No response
Hey there, @ubervincent! 👋 I'm Dosu, a friendly bot here to lend a hand with bugs, answer your questions, and guide you on your way to becoming a contributor. I'm like the digital helper you never knew you needed while you wait for a human maintainer to swing by. I'll be diving into your issue about the guidance in the Pydantic Program doc and will get back to you with a solution shortly. Hang tight! 🤖🛠
Likely the guidance package has been updated.

Very open to contributions to update the notebook!
To address the `ModuleNotFoundError` for `guidance.llms` and update the documentation accordingly, you'll need to make a few changes reflecting the latest package structure. The `GuidancePydanticProgram` class has been moved and is now accessible under `llama_index.legacy.program.guidance_program`. When updating the documentation, ensure it instructs users to import from this new location.
Additionally, the documentation should explain that `GuidancePydanticProgram` is designed for use with guidance LLMs, detailing how to instantiate the class with an output class of type `BaseModel`, a prompt template string, and an optional guidance LLM instance. It's also crucial to mention that the `guidance` package is required, guiding users to install it if they encounter an `ImportError`.
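For illustration, here is a rough sketch of that instantiation. It uses the integration package's import path (`llama_index.program.guidance`), and the way the guidance LLM is constructed is an assumption about recent `guidance` releases; verify the wrapper name against the `guidance` docs and the `base.py` source listed under Sources.

```python
from guidance import models as guidance_models  # assumption: guidance >= 0.1 layout
from pydantic import BaseModel

from llama_index.program.guidance import GuidancePydanticProgram


# Output schema the program should fill in.
class Song(BaseModel):
    title: str
    length_seconds: int


class Album(BaseModel):
    name: str
    artist: str
    songs: list[Song]


# Assumption: guidance.models.OpenAI replaces the old guidance.llms.OpenAI;
# check the installed guidance version for the current wrapper name.
guidance_llm = guidance_models.OpenAI("gpt-3.5-turbo")

program = GuidancePydanticProgram(
    output_cls=Album,
    prompt_template_str=(
        "Generate an example album, with an artist and a list of songs. "
        "Use the movie {{movie_name}} as inspiration."
    ),
    guidance_llm=guidance_llm,
    verbose=True,
)

output = program(movie_name="The Shining")  # returns an Album instance
print(output)
```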
For contributing these updates, refer to the `CONTRIBUTING.md` file in the LlamaIndex repository. This involves forking the repository, making your changes in the fork, and then submitting a pull request with your updates.
For more detailed guidance, including example code and further instructions on contributing, you can check the documentation and contribution guidelines directly in the repository.
Sources
- [Bug]: Failing Structured Output Extraction with Guidance Pydantic Program
- [Bug]: No module named 'llama_index.core.llms.llama_utils'
- llama-index-integrations/program/llama-index-program-guidance/llama_index/program/guidance/__init__.py
- llama-index-legacy/llama_index/legacy/program/guidance_program.py
- llama-index-integrations/program/llama-index-program-guidance/llama_index/program/guidance/base.py
- docs/docs/CONTRIBUTING.md
- llama-index-legacy/CONTRIBUTING.md
- CONTRIBUTING.md
- llama-index-core/CONTRIBUTING.md
Seems like `llama-index-program-guidance` needs updating on PyPI. I was able to get it working by copying the content of `llama-index-integrations/program/llama-index-program-guidance/llama_index/program/guidance/base.py` to my working area as a workaround. In OP's case, this is `/usr/local/lib/python3.10/dist-packages/llama_index/program/guidance/base.py`.
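After patching the file, a quick sanity check is to confirm the import path now resolves without the `guidance.llms` error:

```python
# If the workaround took effect, this import no longer raises
# ModuleNotFoundError: No module named 'guidance.llms'.
from llama_index.program.guidance import GuidancePydanticProgram

print(GuidancePydanticProgram)
```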
I am now encountering #9914 as an issue.