llama-index-python
Increase llm_max_tokens value
Purpose
- LLM responses were being truncated for common prompts, so this change increases the `llm_max_tokens` value from 100 to 800 tokens to reduce the chance of truncation
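A minimal sketch of what this change amounts to, assuming a dict-based config; the key name `llm_max_tokens` comes from this PR's title, and the helper `get_max_tokens` is hypothetical, shown only to illustrate how the new default interacts with per-call overrides:

```python
# Hypothetical config illustrating this PR's change: the default token
# cap is raised from 100 to 800 to avoid truncated LLM responses.
DEFAULT_CONFIG = {
    "llm_max_tokens": 800,  # previously 100
}

def get_max_tokens(overrides=None):
    """Return the effective token cap, honoring any per-call overrides."""
    cfg = dict(DEFAULT_CONFIG)
    if overrides:
        cfg.update(overrides)
    return cfg["llm_max_tokens"]
```

Callers that never set the value now get the higher cap automatically, while callers that explicitly pass a limit keep their existing behavior.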
Does this introduce a breaking change?
[ ] Yes
[x] No
Pull Request Type
What kind of change does this Pull Request introduce?
[x] Bugfix
[ ] Feature
[ ] Code style update (formatting, local variables)
[ ] Refactoring (no functional changes, no API changes)
[ ] Documentation content changes
[ ] Other... Please describe:
How to Test
- Get the code

```
git clone [repo-address]
cd [repo-name]
git checkout [branch-name]
npm install
```

- Test the code
What to Check
Verify that the following are valid:
- ...