
AssertionError

Wolvensdale opened this issue 1 year ago

Describe the bug

I've been getting this error since I updated the repos yesterday. I'm still not sure whether it's caused by the update or by something else I did, since I also tried installing "SadTalker" and "TTS" in Stable Diffusion.

Maybe some requirements are conflicting in my cache?


Does anyone know what's wrong? I have tried updating again and reinstalling bitsandbytes (since people said it was broken on Windows yesterday), but I still get the same error.
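For context on why this error appears: the assertion fires inside llama-cpp-python's `tokenize`, and `self.ctx` is only set when the model file loads successfully, so a `None` context means the earlier model load silently failed. A minimal sketch of that failure mode, using a simplified stand-in class (not the real `llama_cpp` API):

```python
class FakeLlama:
    """Simplified stand-in for llama_cpp.Llama, illustrating the failure mode."""

    def __init__(self, model_path, load_ok=True):
        # In the real library, ctx stays None when the underlying C call
        # fails to load the model file (bad path, wrong format, out of memory).
        self.ctx = object() if load_ok else None

    def tokenize(self, text):
        # Mirrors the line in the traceback (llama.py, line 108):
        assert self.ctx is not None
        return list(text.encode("utf-8"))  # dummy token ids


# A failed load produces the reported AssertionError on first use,
# i.e. the first time "Generate" is pressed -- not at startup:
model = FakeLlama("model.bin", load_ok=False)
try:
    model.tokenize("hello")
except AssertionError:
    print("AssertionError: model context was never created")
```

This is why the UI can open fine and only crash on "Generate": the load failure is deferred until the first tokenize call touches the missing context.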

Is there an existing issue for this?

  • [X] I have searched the existing issues

Reproduction

  • Start oobabooga
  • Successfully open the UI
  • Press "Generate"; the error above appears in the cmd window

Screenshot

No response

Logs

Traceback (most recent call last):
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\routes.py", line 393, in run_predict
    output = await app.get_blocks().process_api(
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\blocks.py", line 1108, in process_api
    result = await self.call_function(
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\blocks.py", line 929, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\gradio\utils.py", line 490, in async_iteration
    return next(iterator)
  File "E:\Oobaboga\oobabooga\text-generation-webui\modules\chat.py", line 228, in cai_chatbot_wrapper
    for history in chatbot_wrapper(text, state):
  File "E:\Oobaboga\oobabooga\text-generation-webui\modules\chat.py", line 149, in chatbot_wrapper
    prompt = generate_chat_prompt(text, state, **kwargs)
  File "E:\Oobaboga\oobabooga\text-generation-webui\modules\chat.py", line 42, in generate_chat_prompt
    while i >= 0 and len(encode(''.join(rows))[0]) < max_length:
  File "E:\Oobaboga\oobabooga\text-generation-webui\modules\text_generation.py", line 27, in encode
    input_ids = shared.tokenizer.encode(str(prompt))
  File "E:\Oobaboga\oobabooga\text-generation-webui\modules\llamacpp_model_alternative.py", line 37, in encode
    return self.model.tokenize(string)
  File "E:\Oobaboga\oobabooga\installer_files\env\lib\site-packages\llama_cpp\llama.py", line 108, in tokenize
    assert self.ctx is not None
AssertionError
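Since the assertion means the model never loaded, a quick sanity check on the model file can confirm whether the load failure is on the file side. The sketch below is an assumption-laden diagnostic, not part of the webui: the magic values are what llama.cpp of that era used for its GGML-family formats ("ggml", "ggmf", "ggjt" as little-endian uint32); verify them against your installed llama.cpp version.

```python
import os
import struct

# Assumed llama.cpp magic values (verify against your llama.cpp version):
# "ggml" = 0x67676d6c, "ggmf" = 0x67676d66, "ggjt" = 0x67676a74.
KNOWN_MAGICS = {0x67676D6C, 0x67676D66, 0x67676A74}


def check_model_file(path):
    """Return (ok, reason) for a candidate GGML model file.

    A file that fails these checks will also fail to load in llama.cpp,
    which leaves Llama.ctx as None and produces the AssertionError above
    on the first tokenize() call.
    """
    if not os.path.isfile(path):
        return False, "file not found"
    if os.path.getsize(path) < 4:
        return False, "file too small to be a model"
    with open(path, "rb") as f:
        magic = struct.unpack("<I", f.read(4))[0]
    if magic not in KNOWN_MAGICS:
        return False, "unrecognized magic 0x{:08x}".format(magic)
    return True, "magic bytes look like a GGML file"
```

Running this on the model path before launching the UI can distinguish a bad file (incomplete download, or a model converted for a newer/older llama.cpp than the installed llama-cpp-python supports) from a problem in the webui itself.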

System Info

- Windows
- 8GB VRAM Nvidia Ampere card

Wolvensdale avatar Apr 14 '23 13:04 Wolvensdale

I'm having the same error

KanAvR avatar Apr 23 '23 05:04 KanAvR

Same here as well

theFisher86 avatar May 11 '23 04:05 theFisher86

I'm getting this with llama-2-70b-chat but not with 7b-chat

Voyajer avatar Jul 27 '23 10:07 Voyajer