promptulate
Add retry mechanism to LLMFactory and BaseLLM
Related to #651
Add retry mechanism to `LLMFactory` and `LiteLLM` classes to increase robustness.
- `LLMFactory`:
  - Add `max_retry` parameter to `LLMFactory.build` method in `promptulate/llms/factory.py` (pass-through sketched after this list).
  - Pass `max_retry` parameter to `LiteLLM` constructor.
- `LiteLLM`:
  - Add `max_retry` attribute to `LiteLLM` class in `promptulate/llms/_litellm.py`.
  - Modify `_predict` method to include retry logic (see the retry sketch after this list).
- `AIChat`:
  - Add `max_retry` parameter to `AIChat` constructor in `promptulate/chat.py`.
  - Pass `max_retry` parameter to `LLMFactory.build` method.
- Tests:
  - Add tests in `tests/llms/test_factory.py` to verify the retry mechanism (an illustrative test sketch follows this list).
  - Mock `LiteLLM` to simulate failure and retry behavior.
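The core of the change is the retry loop in `LiteLLM._predict`. The snippet below is a minimal, self-contained sketch of that idea: the class name, file, and `max_retry` attribute come from the change list above, but the constructor signature, the default retry count, the back-off policy, and the `_call_provider` helper are illustrative assumptions rather than the actual promptulate code.

```python
import time


class LiteLLM:
    """Minimal sketch of the retry-aware wrapper described for promptulate/llms/_litellm.py."""

    def __init__(self, model: str, max_retry: int = 5, **kwargs):
        self.model = model
        # Upper bound on prediction attempts before the last error is re-raised.
        self.max_retry = max_retry
        self.kwargs = kwargs

    def _call_provider(self, messages):
        # Stand-in for the real provider call (e.g. litellm.completion); kept
        # abstract here so the retry flow can be read in isolation.
        raise NotImplementedError

    def _predict(self, messages):
        last_error = None
        for attempt in range(1, self.max_retry + 1):
            try:
                return self._call_provider(messages)
            except Exception as err:  # deliberately broad in this sketch
                last_error = err
                # Simple linear back-off between attempts; the actual PR may
                # retry immediately or use a different policy.
                time.sleep(attempt)
        # All attempts failed: surface the most recent provider error.
        raise last_error
```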
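The factory and chat layers only need to thread the parameter through. Continuing the sketch above, the hypothetical `build` and `__init__` signatures below show that pass-through; the real `LLMFactory.build` and `AIChat` constructors accept more arguments and do model routing, so treat these signatures as assumptions.

```python
class LLMFactory:
    """Sketch of the build() change described for promptulate/llms/factory.py."""

    @classmethod
    def build(cls, model_name: str, *, max_retry: int = 5, **kwargs) -> LiteLLM:
        # Forward the caller's retry budget to the concrete LLM wrapper.
        return LiteLLM(model=model_name, max_retry=max_retry, **kwargs)


class AIChat:
    """Sketch of the constructor change described for promptulate/chat.py."""

    def __init__(self, model: str, max_retry: int = 5, **model_config):
        # AIChat stays retry-agnostic; it only threads max_retry through to the
        # factory so the underlying LiteLLM instance owns the retry behaviour.
        self.llm = LLMFactory.build(model_name=model, max_retry=max_retry, **model_config)
```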
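For the tests, the PR mocks `LiteLLM` to simulate failures; the sketch below exercises the simplified `LiteLLM` from the first snippet in the same spirit, patching `time.sleep` so the back-off does not slow the test. Model names and test names here are illustrative, not the actual contents of `tests/llms/test_factory.py`.

```python
from unittest.mock import MagicMock, patch

import pytest


def test_predict_retries_until_success():
    # The provider call fails twice and succeeds on the third attempt; the
    # caller still gets the result and exactly three attempts were made.
    llm = LiteLLM(model="gpt-4o", max_retry=3)
    llm._call_provider = MagicMock(
        side_effect=[RuntimeError("boom"), RuntimeError("boom"), "ok"]
    )
    with patch("time.sleep"):  # skip back-off sleeps to keep the test fast
        assert llm._predict([{"role": "user", "content": "hi"}]) == "ok"
    assert llm._call_provider.call_count == 3


def test_predict_raises_after_exhausting_retries():
    # Every attempt fails, so the last error surfaces after max_retry attempts.
    llm = LiteLLM(model="gpt-4o", max_retry=2)
    llm._call_provider = MagicMock(side_effect=RuntimeError("boom"))
    with patch("time.sleep"), pytest.raises(RuntimeError):
        llm._predict([{"role": "user", "content": "hi"}])
    assert llm._call_provider.call_count == 2
```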
For more details, open the Copilot Workspace session.
Summary by CodeRabbit
- New Features
  - Introduced a `max_retry` parameter across various functionalities, enhancing user control over retry attempts during LLM initialization and chat operations.
  - Improved error handling and robustness in the `LiteLLM` prediction process with a retry mechanism.
- Bug Fixes
  - Enhanced resilience in LLM interactions by implementing retry logic for failed attempts.
- Tests
  - Added new tests to validate the retry mechanisms for the `LiteLLM` model, ensuring robust error handling.