
densenet161 example

Open divastar opened this issue 3 years ago • 2 comments

🐛 Describe the bug

Hi, the default densenet161 example worked, but when I tried to run it with ts/torch_handler/densenet_handler.py it seemed to get stuck on the request:

curl http://127.0.0.1:8080/predictions/densenet161 -T examples/image_classifier/kitten.jpg

Is this the right way to archive — using --handler ts/torch_handler/image_classifier.py instead of the default from the example, --handler image_classifier?

torch-model-archiver --model-name densenet161 --version 1.0 --model-file examples/image_classifier/densenet_161/model.py --serialized-file ../densnet/densenet161-8d451a50.pth --handler ts/torch_handler/image_classifier.py --extra-files examples/image_classifier/index_to_name.json
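For context, TorchServe treats the --handler value differently depending on whether it is a path to a .py file or a bare name. A minimal sketch of that distinction (an illustrative assumption based on the behavior of ts/model_loader.py, not TorchServe's actual code; the function name resolve_handler is my own):

```python
import os

def resolve_handler(handler):
    """Illustrative sketch: how a --handler value might be interpreted.
    (Assumption based on observed TorchServe behavior, not its real code.)"""
    module_name, ext = os.path.splitext(os.path.basename(handler))
    if ext == ".py":
        # A path to a .py file: the file itself must be packaged into the
        # .mar archive so the worker can import it by its bare module name.
        return ("custom-file", module_name)
    # A bare name: resolved against the built-in handlers that ship with
    # TorchServe under the ts.torch_handler package.
    return ("built-in", "ts.torch_handler." + handler)

print(resolve_handler("ts/torch_handler/image_classifier.py"))  # custom-file route
print(resolve_handler("image_classifier"))                      # built-in route
```

On this reading, passing the built-in name `image_classifier` is the intended route for the stock example; passing a .py path makes sense only for a custom handler that gets bundled into the .mar.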

divastar avatar Sep 19 '22 15:09 divastar

@divastar The image_classifier handler, as shown in the example https://github.com/pytorch/serve/tree/master/examples/image_classifier/densenet_161, is more complete. I'll find out more about ts/torch_handler/densenet_handler and get back to you.

agunapal avatar Sep 19 '22 19:09 agunapal

thank you

divastar avatar Sep 20 '22 09:09 divastar

Please re-open if the issue still exists.

agunapal avatar Sep 26 '22 16:09 agunapal

Hi. This time, after running:

wget https://download.pytorch.org/models/densenet161-8d451a50.pth
torch-model-archiver --model-name densenet161 --version 1.0 --model-file examples/image_classifier/densenet_161/model.py --serialized-file densenet161-8d451a50.pth --handler image_classifier --extra-files examples/image_classifier/index_to_name.json
mkdir model_store
mv densenet161.mar model_store/
torchserve --start --model-store model_store --models densenet161=densenet161.mar
curl http://127.0.0.1:8080/predictions/densenet161 -T examples/image_classifier/kitten.jpg

I am getting no response, and the log says:

W-9000-densenet161_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'image_classifier'
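Since a .mar file is an ordinary zip archive, one quick sanity check when a worker dies with an import error like this is to list what actually got packaged. A small helper (the function name is my own; the file names in the test/comments are just what torch-model-archiver typically produces):

```python
import zipfile

def mar_contents(mar_path):
    """List the files inside a .mar archive (a .mar is a plain zip).
    If --handler was given as a .py path, that file should appear in this
    listing; a built-in handler name like 'image_classifier' will instead
    appear only as the "handler" field in MAR-INF/MANIFEST.json."""
    with zipfile.ZipFile(mar_path) as z:
        return z.namelist()
```

Comparing the MANIFEST.json "handler" field against this listing usually shows at a glance whether the worker will try to import a packaged file or a built-in ts.torch_handler module.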

2022-10-03T13:14:15,231 [INFO ] pool-3-thread-1 TS_METRICS - GPUMemoryUsed.Megabytes:2486|#Level:Host,device_id:0|#hostname:LAPTOP-24B8FP3F,timestamp:1664792055
2022-10-03T13:14:15,231 [INFO ] pool-3-thread-1 TS_METRICS - GPUUtilization.Percent:1|#Level:Host,device_id:0|#hostname:LAPTOP-24B8FP3F,timestamp:1664792055
2022-10-03T13:14:15,232 [INFO ] pool-3-thread-1 TS_METRICS - MemoryAvailable.Megabytes:9941.36328125|#Level:Host|#hostname:LAPTOP-24B8FP3F,timestamp:1664792055
2022-10-03T13:14:15,232 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUsed.Megabytes:22627.3125|#Level:Host|#hostname:LAPTOP-24B8FP3F,timestamp:1664792055
2022-10-03T13:14:15,233 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUtilization.Percent:69.5|#Level:Host|#hostname:LAPTOP-24B8FP3F,timestamp:1664792055
2022-10-03T13:14:15,505 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Backend worker process died.
2022-10-03T13:14:15,507 [INFO ] nioEventLoopGroup-5-1 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2022-10-03T13:14:15,511 [DEBUG] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-10-03T13:14:15,506 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-10-03T13:14:15,508 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\lib\site-packages\ts\model_loader.py", line 100, in load
2022-10-03T13:14:15,511 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     module, function_name = self._load_handler_file(handler)
2022-10-03T13:14:15,512 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\lib\site-packages\ts\model_loader.py", line 162, in _load_handler_file
2022-10-03T13:14:15,512 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name)
2022-10-03T13:14:15,513 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\lib\importlib\__init__.py", line 126, in import_module
2022-10-03T13:14:15,513 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     return _bootstrap._gcd_import(name[level:], package, level)
2022-10-03T13:14:15,514 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
2022-10-03T13:14:15,514 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
2022-10-03T13:14:15,515 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
2022-10-03T13:14:15,516 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - ModuleNotFoundError: No module named 'image_classifier'
2022-10-03T13:14:15,517 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -
2022-10-03T13:14:15,517 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - During handling of the above exception, another exception occurred:
2022-10-03T13:14:15,518 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -
2022-10-03T13:14:15,518 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-10-03T13:14:15,518 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\Lib\site-packages\ts\model_service_worker.py", line 210, in <module>
2022-10-03T13:14:15,519 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     worker.run_server()
2022-10-03T13:14:15,519 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\Lib\site-packages\ts\model_service_worker.py", line 181, in run_server
2022-10-03T13:14:15,520 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     self.handle_connection(cl_socket)
2022-10-03T13:14:15,520 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\Lib\site-packages\ts\model_service_worker.py", line 139, in handle_connection
2022-10-03T13:14:15,521 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     service, result, code = self.load_model(msg)
2022-10-03T13:14:15,521 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\Lib\site-packages\ts\model_service_worker.py", line 104, in load_model
2022-10-03T13:14:15,521 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     service = model_loader.load(
2022-10-03T13:14:15,522 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\lib\site-packages\ts\model_loader.py", line 102, in load
2022-10-03T13:14:15,528 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     module = self._load_default_handler(handler)
2022-10-03T13:14:15,528 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\lib\site-packages\ts\model_loader.py", line 167, in _load_default_handler
2022-10-03T13:14:15,530 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -     module = importlib.import_module(module_name, "ts.torch_handler")
2022-10-03T13:14:15,531 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG -   File "C:\Users\k\anaconda3\envs\segment\lib\importlib\__init__.py", line 126, in import_module
2022-10-03T13:14:15,512 [DEBUG] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
java.lang.InterruptedException: null
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1679) ~[?:?]
    at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:435) ~[?:?]
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:189) [model-server.jar:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
    at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:833) [?:?]
2022-10-03T13:14:15,527 [WARN ] W-9000-densenet161_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: densenet161, error: Worker died.
2022-10-03T13:14:15,528 [DEBUG] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-densenet161_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2022-10-03T13:14:15,529 [WARN ] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-densenet161_1.0-stderr
2022-10-03T13:14:15,530 [WARN ] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-densenet161_1.0-stdout
2022-10-03T13:14:15,531 [INFO ] W-9000-densenet161_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-densenet161_1.0-stderr
2022-10-03T13:14:15,553 [INFO ] W-9000-densenet161_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-densenet161_1.0-stdout
2022-10-03T13:14:15,532 [INFO ] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 1 seconds.
2022-10-03T13:14:16,549 [DEBUG] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\k\anaconda3\envs\segment\python.exe, C:\Users\k\anaconda3\envs\segment\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9000]
2022-10-03T13:14:17,532 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Listening on port: None
2022-10-03T13:14:17,536 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - [PID]20116
2022-10-03T13:14:17,538 [DEBUG] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-densenet161_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2022-10-03T13:14:17,538 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Torch worker started.
2022-10-03T13:14:17,540 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Python runtime: 3.10.4
2022-10-03T13:14:17,541 [INFO ] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9000
2022-10-03T13:14:17,545 [INFO ] W-9000-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1664792057545
2022-10-03T13:14:17,545 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9000).
2022-10-03T13:14:17,555 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - model_name: densenet161, batchSize: 1
2022-10-03T13:14:17,854 [INFO ] W-9000-densenet161_1.0-stdout MODEL_LOG - Backend worker process died.

(The restarted worker dies with the same ModuleNotFoundError traceback and is retried, first after 1 second and then after 2 seconds; the crash/restart cycle repeats. Duplicated interleaved log lines have been removed above for readability.)

divastar avatar Oct 03 '22 10:10 divastar