2024-10-25T07:03:53,563 [DEBUG] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mirnet_tensorrt_1.0 State change WORKER_STOPPED -> WORKER_STARTED
2024-10-25T07:03:53,564 [INFO ] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /home/model-server/tmp/.ts.sock.9000
2024-10-25T07:03:53,565 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - Connection accepted: /home/model-server/tmp/.ts.sock.9000.
2024-10-25T07:03:53,565 [DEBUG] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req.cmd LOAD repeats 1 to backend at: 1729839833565
2024-10-25T07:03:53,565 [INFO ] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - Looping backend response at: 1729839833565
2024-10-25T07:03:53,572 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - model_name: mirnet_tensorrt, batchSize: 1
2024-10-25T07:03:54,087 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - Enabled tensor cores
2024-10-25T07:03:54,087 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - OpenVINO is not enabled
2024-10-25T07:03:54,087 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - proceeding without onnxruntime
2024-10-25T07:03:54,087 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - Torch TensorRT not enabled
2024-10-25T07:03:57,465 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - [10/25/2024-07:03:57] [TRT] [E] IRuntime::deserializeCudaEngine: Error Code 1: Serialization (Serialization assertion plan->header.pad == expectedPlatformTag failed. Platform specific tag mismatch detected. TensorRT plan files are only supported on the target runtime platform they were created on.)
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - Backend worker process died.
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -   File "/home/venv/lib/python3.9/site-packages/ts/model_service_worker.py", line 301, in <module>
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -     worker.run_server()
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -   File "/home/venv/lib/python3.9/site-packages/ts/model_service_worker.py", line 268, in run_server
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -     self.handle_connection(cl_socket)
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -   File "/home/venv/lib/python3.9/site-packages/ts/model_service_worker.py", line 196, in handle_connection
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -     service, result, code = self.load_model(msg)
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -   File "/home/venv/lib/python3.9/site-packages/ts/model_service_worker.py", line 133, in load_model
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -     service = model_loader.load(
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -   File "/home/venv/lib/python3.9/site-packages/ts/model_loader.py", line 143, in load
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -     initialize_fn(service.context)
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -   File "/home/model-server/tmp/models/72395cf309a44c1485d61404a9b2a673/custom_handler.py", line 26, in initialize
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG -     self.context = self.engine.create_execution_context()
2024-10-25T07:03:57,467 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout MODEL_LOG - AttributeError: 'NoneType' object has no attribute 'create_execution_context'
2024-10-25T07:03:57,467 [INFO ] epollEventLoopGroup-5-10 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2024-10-25T07:03:57,467 [DEBUG] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2024-10-25T07:03:57,467 [DEBUG] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died., startupTimeout:120sec
java.lang.InterruptedException: null
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1681) ~[?:?]
	at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:435) ~[?:?]
	at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:234) [model-server.jar:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
	at java.lang.Thread.run(Thread.java:840) [?:?]
2024-10-25T07:03:57,467 [WARN ] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: mirnet_tensorrt, error: Worker died.
2024-10-25T07:03:57,467 [DEBUG] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-mirnet_tensorrt_1.0 State change WORKER_STARTED -> WORKER_STOPPED
2024-10-25T07:03:57,467 [WARN ] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - Auto recovery failed again
2024-10-25T07:03:57,468 [INFO ] W-9000-mirnet_tensorrt_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 55 seconds.
2024-10-25T07:03:57,511 [INFO ] W-9000-mirnet_tensorrt_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-mirnet_tensorrt_1.0-stdout
2024-10-25T07:03:57,511 [INFO ] W-9000-mirnet_tensorrt_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-mirnet_tensorrt_1.0-stderr
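Reading the log: the real failure is the `[TRT] [E]` serialization error ("Platform specific tag mismatch") — the `.plan` file was built on a different GPU/TensorRT platform than the one serving it, so `IRuntime::deserializeCudaEngine` returns None instead of an engine. The `AttributeError` at `custom_handler.py` line 26 is only the downstream symptom: the handler calls `create_execution_context()` on that None. The underlying fix is to rebuild the engine on the serving machine; the handler can also fail with a clearer message. Below is a minimal sketch of such a guard — `load_engine` and the injected `runtime` parameter are illustrative, not the actual handler from the log; the runtime is assumed to behave like `tensorrt.Runtime`.

```python
# Sketch of a defensive engine-loading helper for a TorchServe custom
# handler. Assumption: `runtime` behaves like tensorrt.Runtime, i.e. it
# exposes deserialize_cuda_engine(bytes) and returns None on failure.
# Passing the runtime in keeps this testable without a GPU installed.

def load_engine(runtime, plan_path):
    """Deserialize a TensorRT plan file, failing loudly if it is invalid."""
    with open(plan_path, "rb") as f:
        plan = f.read()
    engine = runtime.deserialize_cuda_engine(plan)
    if engine is None:
        # deserialize_cuda_engine signals failure (e.g. the platform-tag
        # mismatch in the log) by returning None rather than raising, so
        # the caller must check explicitly before using the engine.
        raise RuntimeError(
            f"Failed to deserialize TensorRT engine from {plan_path}. "
            "Plan files only work on the GPU and TensorRT version they "
            "were built with; rebuild the engine on this machine."
        )
    return engine
```

With this guard, the worker would die with an actionable `RuntimeError` naming the plan file instead of the opaque `NoneType` traceback above. The rebuild itself can be done on the serving host, for example with TensorRT's `trtexec` tool (`trtexec --onnx=model.onnx --saveEngine=model.plan`), then repackaging the `.mar` with the new plan.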