2025-01-13 17:19:30.473927780 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:965 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 9.* and CUDA 12.*. Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
ONNX model tested and results look good!
[01/13/2025-17:19:40] [TRT] [W] UNSUPPORTED_STATE: Skipping tactic 0 due to insufficient memory on requested size of 1803374592 detected for tactic 0x0000000000000000.
[01/13/2025-17:19:40] [TRT] [E] IBuilder::buildSerializedNetwork: Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[/blocks/blocks.10/attn/Gather_2_output_0[Constant].../blocks/blocks.11/Reshape_1 + /Transpose]}.)
Traceback (most recent call last):
  File "/home/mzcar/accelerated/test_script.py", line 74, in <module>
    engine = backend.prepare(onnx_model, device='CUDA:0')
  File "/home/mzcar/miniconda3/envs/mzinferx/lib/python3.10/site-packages/onnx_tensorrt/backend.py", line 236, in prepare
    return TensorRTBackendRep(model, device, **kwargs)
  File "/home/mzcar/miniconda3/envs/mzinferx/lib/python3.10/site-packages/onnx_tensorrt/backend.py", line 92, in __init__
    self._build_engine()
  File "/home/mzcar/miniconda3/envs/mzinferx/lib/python3.10/site-packages/onnx_tensorrt/backend.py", line 132, in _build_engine
    raise RuntimeError("Failed to build TensorRT engine from network")
RuntimeError: Failed to build TensorRT engine from network
I am able to export the model to ONNX, but converting it to a TensorRT engine fails. Could anyone help me out here?
Thanks!
deo-abhijit changed the title from "Error while creating tensorrt engine for Vision Transformer" to "Error while creating tensorrt engine for SAM Vision Transformer", then to "Error while creating tensorrt engine for SAM Vision Transformer using timm library" (Jan 13, 2025).
Error Log:
TensorRT Version:
Code to reproduce: