[Bug]: Unable to package llama-index-llms-groq in requirements.txt
Bug Description
Hello,
I added `llama-index-llms-groq` to my requirements.txt file and I'm running `sam build` to deploy to AWS, but the command appears to run indefinitely while trying to package the build.
Version
latest
Steps to Reproduce
Add `llama-index-llms-groq` to requirements.txt
Run `sam build`
Relevant Logs/Tracebacks
I'm not really sure how to fix this. I can `pip install llama-index-llms-groq` just fine locally.
Is it maybe related to overall size of your dependencies?
@logan-markewich is it a dependency issue?
When I have:
llama-index
llama-index-llms-groq
it's stuck on: `Running PythonPipBuilder:ResolveDependencies`
But when I do:
llama-index==0.11.0
llama-index-llms-groq==0.2.0
it seems to get past this stage now, but I want to use the latest versions. What is compatible?
Seems like the build succeeds with:
llama-index==0.11.0 llama-index-llms-groq==0.2.0
But can you please let me know the latest compatible versions?
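One way to find a mutually compatible pair without guessing (a sketch, not an official recommendation): let pip's resolver pick the versions in a clean local virtual environment, then pin exactly what it installed so `sam build` resolves the same set.

```shell
# Sketch: resolve compatible versions locally, then pin them.
# Assumes a Unix shell with python3 available.
python3 -m venv .venv && . .venv/bin/activate
pip install llama-index llama-index-llms-groq   # pip's resolver picks a compatible pair
pip freeze > requirements.txt                   # pin the exact working set for sam build
```

Pinning everything also tends to make `PythonPipBuilder:ResolveDependencies` faster, since pip no longer has to backtrack through candidate versions during the build.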
Another issue is that:
llama-index==0.11.0 llama-index-llms-groq==0.2.0
having both of these, the build size is too large because of all of the dependency packages as well, so it doesn't even deploy.
Hi, @AyushParikh. I'm Dosu, and I'm helping the LlamaIndex team manage their backlog. I'm marking this issue as stale.
Issue Summary
- You encountered an issue where adding `llama-index-llms-groq` to `requirements.txt` causes `sam build` to hang indefinitely during AWS deployment.
- Specifying older versions (`llama-index==0.11.0` and `llama-index-llms-groq==0.2.0`) allows the build to proceed.
- The build size remains too large due to dependencies, preventing deployment.
- @logan-markewich suggested the issue might be related to the overall size of dependencies.
Next Steps
- Please confirm if this issue is still relevant with the latest version of the LlamaIndex repository. If it is, feel free to comment to keep the discussion open.
- If there is no further activity, this issue will be automatically closed in 7 days.
Thank you for your understanding and contribution!