
bug: environment not as string

sun7390 opened this issue 2 years ago • 6 comments

Describe the bug

Starting the LLM server fails with: Error caught while starting LLM Server: environment can only contain strings

To reproduce

No response

Logs

Traceback (most recent call last):
  File "D:\conda\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "D:\conda\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "D:\conda\Scripts\openllm.exe\__main__.py", line 7, in <module>
    sys.exit(cli())
  File "D:\conda\lib\site-packages\click\core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
    rv = self.invoke(ctx)
  File "D:\conda\lib\site-packages\click\core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "D:\conda\lib\site-packages\click\core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "D:\conda\lib\site-packages\click\core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "D:\conda\lib\site-packages\click\core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "D:\conda\lib\site-packages\openllm\cli.py", line 381, in wrapper
    return func(*args, **attrs)
  File "D:\conda\lib\site-packages\openllm\cli.py", line 354, in wrapper
    return_value = func(*args, **attrs)
  File "D:\conda\lib\site-packages\openllm\cli.py", line 329, in wrapper
    return f(*args, **attrs)
  File "D:\conda\lib\site-packages\click\decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "D:\conda\lib\site-packages\openllm\cli.py", line 837, in model_start
    server.start(env=start_env, text=True, blocking=True)
  File "D:\conda\lib\site-packages\bentoml\server.py", line 190, in start
    return _Manager()
  File "D:\conda\lib\site-packages\bentoml\server.py", line 163, in __init__
    self.process = subprocess.Popen(
  File "D:\conda\lib\subprocess.py", line 971, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "D:\conda\lib\subprocess.py", line 1440, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
TypeError: environment can only contain strings

Environment

1

System information (Optional)

GPU 4080

sun7390 avatar Jun 24 '23 03:06 sun7390
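
The traceback shows the failure happening inside _winapi.CreateProcess, which on Windows requires every key and value of the env mapping given to subprocess.Popen to be a str. So the start_env dict that OpenLLM passes to server.start(...) presumably contains at least one non-string value. Below is a minimal, standalone sketch of that failure mode on Windows; the OPENLLMDEVDEBUG key and integer value are only an illustrative guess, not taken from OpenLLM's actual start_env.

```python
import subprocess
import sys

# Minimal sketch of the failure (Windows only): _winapi.CreateProcess rejects
# any env mapping whose keys or values are not str. The key/value here is a
# hypothetical example, not OpenLLM's real environment.
bad_env = {"OPENLLMDEVDEBUG": 3}  # int value instead of "3"

subprocess.Popen([sys.executable, "-c", "pass"], env=bad_env)
# TypeError: environment can only contain strings
```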

what models are you using?

aarnphm avatar Jun 24 '23 03:06 aarnphm

All models. When I run the 'openllm start opt' command, the error occurs.

sun7390 avatar Jun 24 '23 04:06 sun7390

Can you send the whole trace of openllm start opt --debug?

aarnphm avatar Jun 24 '23 07:06 aarnphm

Enabling debug mode for current BentoML session
Error caught while starting LLM Server:
environment can only contain strings
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\b\anaconda3\envs\ol\Scripts\openllm.exe\__main__.py", line 7, in <module>
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\core.py", line 1055, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\core.py", line 760, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\openllm\cli.py", line 381, in wrapper
    return func(*args, **attrs)
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\openllm\cli.py", line 354, in wrapper
    return_value = func(*args, **attrs)
                   ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\openllm\cli.py", line 329, in wrapper
    return f(*args, **attrs)
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\click\decorators.py", line 26, in new_func
    return f(get_current_context(), *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\openllm\cli.py", line 837, in model_start
    server.start(env=start_env, text=True, blocking=True)
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\bentoml\server.py", line 190, in start
    return _Manager()
           ^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\site-packages\bentoml\server.py", line 163, in __init__
    self.process = subprocess.Popen(
                   ^^^^^^^^^^^^^^^^^
  File "C:\Users\b\anaconda3\envs\ol\Lib\subprocess.py", line 1024, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "C:\Users\b\anaconda3\envs\ol\Lib\subprocess.py", line 1509, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: environment can only contain strings

byteshadow avatar Jun 24 '23 17:06 byteshadow
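
Until a fix lands, a workaround is to make sure every value in the environment mapping is a string before it reaches subprocess.Popen. The helper below is only an illustration of that sanitization step, not the actual change made in OpenLLM or PR #68:

```python
import os

def stringify_env(env):
    """Hypothetical helper: coerce keys and values to str and drop None values,
    so Windows CreateProcess accepts the mapping."""
    return {str(k): str(v) for k, v in env.items() if v is not None}

# e.g. merge the current environment with extra settings, all as strings
start_env = stringify_env({**os.environ, "OPENLLMDEVDEBUG": 3})
assert all(isinstance(v, str) for v in start_env.values())
```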

Can you try OPENLLMDEVDEBUG=3 openllm start opt?

aarnphm avatar Jun 25 '23 00:06 aarnphm
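
Note that the inline VAR=value form in the suggestion above is POSIX-shell syntax; in cmd.exe or PowerShell the variable has to be set separately (set OPENLLMDEVDEBUG=3, or $env:OPENLLMDEVDEBUG = "3") before running openllm start opt. Alternatively, the CLI can be launched from Python with the variable injected as a string, which sidesteps the shell syntax; this is just one way to do it:

```python
import os
import subprocess

# Run the CLI with OPENLLMDEVDEBUG set, passed as a string so the Windows
# CreateProcess restriction seen above is not triggered by this variable.
env = {**os.environ, "OPENLLMDEVDEBUG": "3"}
subprocess.run(["openllm", "start", "opt"], env=env, check=False)
```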

Hey there, https://github.com/bentoml/OpenLLM/pull/68 will address this issue. I will do a patch release promptly.

aarnphm avatar Jun 25 '23 07:06 aarnphm

This has been patched in 0.1.14

aarnphm avatar Jun 26 '23 02:06 aarnphm