
v3.9.1 is broken for me -- KeyError: 'reasoning_effort'

Open · ghnp5 opened this issue 5 months ago · 1 comment

Crashes when trying to complete a chat (using OpenAI API):

```
21:35:17-558712 INFO     Loading the extension "openai"
21:35:19-203107 INFO     OpenAI-compatible API URL: http://0.0.0.0:5000

INFO:     172.18.0.2:50808 - "POST /v1/chat/completions HTTP/1.0" 500 Internal Server Error
Exception in ASGI application
Traceback (most recent call last):
  File "/app/portable_env/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/portable_env/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/portable_env/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
    raise exc
  File "/app/portable_env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/app/portable_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/app/portable_env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/app/portable_env/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
    response = await f(request)
               ^^^^^^^^^^^^^^^^
  File "/app/portable_env/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/portable_env/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/user_data/extensions/openai/script.py", line 139, in openai_chat_completions
    response = OAIcompletions.chat_completions(to_dict(request_data), is_legacy=is_legacy)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/extensions/openai/completions.py", line 487, in chat_completions
    return deque(generator, maxlen=1).pop()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/extensions/openai/completions.py", line 239, in chat_completions_common
    prompt = generate_chat_prompt(user_input, generate_params, _continue=continue_)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/modules/chat.py", line 179, in generate_chat_prompt
    reasoning_effort=state['reasoning_effort']
                     ~~~~~^^^^^^^^^^^^^^^^^^^^
KeyError: 'reasoning_effort'
```

Parameters:

```
--listen --nowebui --api --extensions api --model gemma-3-12b-it-qat-UD-Q6_K_XL --loader llama.cpp --ctx-size 32768 --threads 10 --threads-batch 10 --batch-size 256 --streaming-llm
```

Using Docker with the Portable Install.

v3.8 is fine (except that I had to install extra stuff, as described in #7175)

ghnp5 · Aug 11 '25 21:08

I ran into this with the API and an exl2 model on v3.16. It seems to work after I do this:

```python
# modules/chat.py, L113-115
enable_thinking=state['enable_thinking'] if 'enable_thinking' in state else None,
reasoning_effort=state['reasoning_effort'] if 'reasoning_effort' in state else None,
```

and

```python
# modules/text_generation.py, L353-354
if 'static_cache' in state and state['static_cache']:
    generate_params['cache_implementation'] = 'static'
```
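The guards above can also be written with `dict.get`, which returns `None` (or a supplied default) instead of raising `KeyError` when the key is absent. A minimal sketch of that pattern, assuming `state` behaves like a plain dict (the sample dict below is illustrative, not taken from text-generation-webui):

```python
# Hypothetical request state: 'reasoning_effort' is missing,
# as with older API payloads that triggered the crash above.
state = {'enable_thinking': True}

enable_thinking = state.get('enable_thinking')       # present -> True
reasoning_effort = state.get('reasoning_effort')     # absent -> None, no KeyError
use_static_cache = state.get('static_cache', False)  # absent -> explicit default

generate_params = {}
if use_static_cache:
    generate_params['cache_implementation'] = 'static'
```

Either form avoids the crash; `dict.get` just keeps the call sites shorter than repeating `x if 'x' in state else None`.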

bars0um · Oct 23 '25 18:10