
Streaming model docker not working

Open Hamlet626 opened this issue 1 year ago • 1 comment

I tried to dockerize a streaming model by following the streaming-server example. I built the image with "mlserver build streaming_model/ -t stream_ml_service" and then ran it with "docker run -it --rm -p 8080:8080 stream_ml_service". But it raised:

Traceback (most recent call last):
  File "/opt/conda/bin/mlserver", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 269, in main
    root()
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 24, in wrapper
    return asyncio.run(f(*args, **kwargs))
  File "/opt/conda/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 47, in start
    server = MLServer(settings)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/server.py", line 32, in __init__
    self._metrics_server = MetricsServer(self._settings)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/metrics/server.py", line 26, in __init__
    self._app = self._get_app()
  File "/opt/conda/lib/python3.10/site-packages/mlserver/metrics/server.py", line 30, in _get_app
    app.add_route(self._settings.metrics_endpoint, self._endpoint.handle_metrics)
  File "/opt/conda/lib/python3.10/site-packages/starlette/applications.py", line 166, in add_route
    self.router.add_route(
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 833, in add_route
    route = Route(
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 226, in __init__
    assert path.startswith("/"), "Routed paths must start with '/'"
AssertionError: Routed paths must start with '/'

Does anyone know how to dockerize a streaming model? Thanks!

Hamlet626 avatar Sep 14 '24 20:09 Hamlet626

Hi @Hamlet626, I reproduced the issue with the configuration you provided. The problem is the metrics_endpoint setting. Use the config below for settings.json:

{
  "debug": false,
  "parallel_workers": 0,
  "gzip_enabled": false,
  "metrics_endpoint": "/metrics"
}

After this, the application started working.
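For anyone wondering why this fails the way it does: the traceback ends in Starlette's route registration, which asserts that every route path begins with "/". If metrics_endpoint resolves to a value without a leading slash, MLServer passes that value straight to Starlette's add_route and the assertion fires at startup. A minimal standalone sketch of that check (a hypothetical helper mimicking the assertion in starlette/routing.py, not MLServer's actual code):

```python
def add_route(path: str) -> str:
    """Sketch of Starlette's route-path validation: paths must start with '/'."""
    assert path.startswith("/"), "Routed paths must start with '/'"
    return path

# A properly prefixed endpoint registers fine.
add_route("/metrics")

# A value missing the leading slash reproduces the startup error.
try:
    add_route("metrics")
except AssertionError as exc:
    print(exc)  # Routed paths must start with '/'
```

This is why explicitly setting "metrics_endpoint": "/metrics" in settings.json (as in the fix above) avoids the crash: the path handed to Starlette then satisfies the assertion.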

shivakrishnaah avatar Jan 20 '25 18:01 shivakrishnaah