AGiXT
Multiple Calls to text-generation-webui
Description
When starting webui + backend + frontend (in that order) and sending the text "compose a joke about pirates" to the TASK AGENT in the web interface, it starts sending many calls to webui. I don't know whether this is a webui problem, my error, or an Agent-LLM problem, but the output in webui is: "
Traceback (most recent call last):
File "/home/mihai/text-generation-webui/modules/text_generation.py", line 272, in generate_reply
output = shared.model.generate(**generate_params)[0]
File "/home/mihai/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/mihai/.local/lib/python3.10/site-packages/transformers/generation/utils.py", line 1276, in generate
and torch.sum(inputs_tensor[:, -1] == generation_config.pad_token_id) > 0
IndexError: index -1 is out of bounds for dimension 1 with size 0
Output generated in 0.01 seconds (0.00 tokens/s, 0 tokens, context 0, seed 140062149)
127.0.0.1 - - [03/May/2023 20:25:11] "POST /api/v1/generate HTTP/1.1" 200 -
Traceback (most recent call last):
File "/home/mihai/text-generation-webui/modules/text_generation.py", line 272, in generate_reply
output = shared.model.generate(**generate_params)[0]
File "/home/mihai/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/mihai/.local/lib/python3.10/site-packages/transformers/generation/utils.py", line 1276, in generate
and torch.sum(inputs_tensor[:, -1] == generation_config.pad_token_id) > 0
IndexError: index -1 is out of bounds for dimension 1 with size 0
Output generated in 0.05 seconds (0.00 tokens/s, 0 tokens, context 0, seed 793370993)
127.0.0.1 - - [03/May/2023 20:25:11] "POST /api/v1/generate HTTP/1.1" 200 -
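The IndexError itself suggests the backend received an empty prompt: tokenizing an empty string yields an input tensor of shape (batch, 0), so `inputs_tensor[:, -1]` has nothing to index. A minimal client-side guard could refuse to send such a request before it ever reaches the model (a sketch only, with a hypothetical helper name, assuming the webui's /api/v1/generate JSON body uses `prompt` and `max_new_tokens` keys):

```python
def build_generate_payload(prompt: str, max_new_tokens: int = 200) -> dict:
    """Build the JSON body for a POST to /api/v1/generate.

    Refuses empty or whitespace-only prompts, which would otherwise
    tokenize to a zero-length input tensor and trigger the
    IndexError shown in the traceback above.
    """
    if not prompt or not prompt.strip():
        raise ValueError("refusing to send an empty prompt to the backend")
    return {"prompt": prompt, "max_new_tokens": max_new_tokens}
```

If the calling code validated the prompt this way, the backend would log one clear client-side error instead of a stream of crashing generate calls.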
Traceback (most recent call last):
/home/mihai/text-generation-webui/server.py:929 in
Steps to Reproduce the Bug
- start text-generation-webui
- start backend
- start frontend
- open http://localhost:3000/agent/XXX and select TASK AGENT
- send text "compose a joke about pirates"
Expected Behavior
To function as in previous versions; I presume it worked like v1.0.17-alpha.
Actual Behavior
It starts a sub-second communication loop with webui, and the result is always like this: "Response:
New Tasks:
[{'task_name': ''}]
Response:
Executing task 1: Develop a task list.
Response:
Task Result:
Response:
New Tasks:
[{'task_name': ''}]
Response:
Executing task 1: Develop a task list.
Response:
Task Result: "
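The repeating output suggests the agent keeps scheduling a task whose `task_name` is empty, which in turn produces the empty prompt that crashes generation, so the loop never makes progress. A hypothetical filter (a sketch only; AGiXT's actual task parser may differ) that drops blank task names before queuing would break the cycle:

```python
def drop_blank_tasks(new_tasks: list[dict]) -> list[dict]:
    """Remove entries like {'task_name': ''} from a parsed task list
    so the agent never executes (and re-queues) a task with an
    empty name. Hypothetical helper, not AGiXT's real parser.
    """
    return [t for t in new_tasks if t.get("task_name", "").strip()]
```

With that filter in place, a response of `[{'task_name': ''}]` would yield no tasks to execute instead of restarting the loop.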
Additional Context / Screenshots
No response
Operating System
- [ ] Microsoft Windows
- [ ] Apple MacOS
- [ ] Linux
- [ ] Android
- [ ] iOS
- [X] Other
Python Version
- [ ] Python <= 3.9
- [X] Python 3.10
- [ ] Python 3.11
Environment Type - Connection
- [X] Local
- [ ] Remote
Environment Type - Container
- [ ] Using Docker
- [X] Not Using Docker
Acknowledgements
- [X] My issue title is concise, descriptive, and in title casing.
- [X] I have searched the existing issues to make sure this bug has not been reported yet.
- [X] I am using the latest version of Agent-LLM.
- [X] I have provided enough information for the maintainers to reproduce and diagnose the issue.