text-generation-webui
AssertionError
Describe the bug
I've been getting this error since I updated the repo yesterday. I'm still not sure whether it's because of the update or because of something else I did, since I also tried installing "sadtalker" and "tts" in stable diffusion.
Maybe some requirements in my cache are conflicting?
(Full traceback in the Logs section below.)
Does anyone know what's wrong? I have tried updating and reinstalling bitsandbytes (since people said the Windows build was broken yesterday), but I still get the same error.
Is there an existing issue for this?
- [X] I have searched the existing issues
Reproduction
- Start oobabooga
- The UI opens successfully
- When I press "Generate", the error above appears in cmd
Screenshot
No response
Logs
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\routes.py", line 393, in run_predict
output = await app.get_blocks().process_api(
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\blocks.py", line 1108, in process_api
result = await self.call_function(
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\blocks.py", line 929, in call_function
prediction = await anyio.to_thread.run_sync(
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
return await future
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
result = context.run(func, *args)
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\utils.py", line 490, in async_iteration
return next(iterator)
File "E:\Oobaboga\oobabooga\text-generation-webui\modules\chat.py", line 228, in cai_chatbot_wrapper
for history in chatbot_wrapper(text, state):
File "E:\Oobaboga\oobabooga\text-generation-webui\modules\chat.py", line 149, in chatbot_wrapper
prompt = generate_chat_prompt(text, state, **kwargs)
File "E:\Oobaboga\oobabooga\text-generation-webui\modules\chat.py", line 42, in generate_chat_prompt
while i >= 0 and len(encode(''.join(rows))[0]) < max_length:
File "E:\Oobaboga\oobabooga\text-generation-webui\modules\text_generation.py", line 27, in encode
input_ids = shared.tokenizer.encode(str(prompt))
File "E:\Oobaboga\oobabooga\text-generation-webui\modules\llamacpp_model_alternative.py", line 37, in encode
return self.model.tokenize(string)
File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\llama_cpp\llama.py", line 108, in tokenize
assert self.ctx is not None
AssertionError
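The assertion at the bottom of the trace (`assert self.ctx is not None` in `llama_cpp/llama.py`) fires when the llama.cpp context was never created, i.e. the model failed to load earlier, and the failure only surfaces on the first tokenize call. A minimal sketch of that failure mode (the class and names here are illustrative, not the real `llama_cpp` internals):

```python
# Sketch of why a failed model load surfaces later as an AssertionError
# in tokenize(). FakeLlama is a hypothetical stand-in, not llama_cpp code.

class FakeLlama:
    def __init__(self, model_path, load_ok=True):
        # In llama-cpp-python, self.ctx stays None if llama.cpp could not
        # create a context for the model (bad path, incompatible GGML
        # format, or not enough memory to load it).
        self.ctx = object() if load_ok else None

    def tokenize(self, text):
        # This is the line the traceback points at: the load failure is
        # only reported here, on first use, as a bare AssertionError.
        assert self.ctx is not None
        # Stand-in for real tokenization: return the UTF-8 byte values.
        return list(text.encode("utf-8"))


good = FakeLlama("model.bin", load_ok=True)
print(good.tokenize("hi"))          # works: context exists

bad = FakeLlama("model.bin", load_ok=False)
try:
    bad.tokenize("hi")
except AssertionError:
    print("AssertionError: context was never created during load")
```

So the error on "Generate" is a symptom; the real problem is whatever prevented the model from loading when the webui started.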
System Info
- Windows
- 8GB VRAM Nvidia Ampere card
I'm having the same error.
Same here as well
I'm getting this with llama-2-70b-chat but not with 7b-chat.