Support on Jetson? #67

Open
Mayuresh999 opened this issue Oct 18, 2024 · 1 comment

Comments

@Mayuresh999

Hey @cyrusbehr, can you guide me through running YOLOv8 inference with TensorRT C++ on a Jetson?
I have a Jetson Orin NX 16 GB with the following config:
Jetpack - 5.1.4
CUDA - 11.4.315
cuDNN - 8.6.0.166
TensorRT - 8.5.2.2
OpenCV - 4.6.0 with CUDA
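
For completeness, a minimal check to confirm which TensorRT build the code actually compiles and links against (the NV_TENSORRT_* macros come from NvInferVersion.h, and getInferLibVersion() reports the loaded runtime's version):

#include <NvInfer.h>        // getInferLibVersion()
#include <NvInferVersion.h> // NV_TENSORRT_MAJOR / MINOR / PATCH
#include <cstdio>

int main() {
    // Version this translation unit was compiled against.
    std::printf("Compiled against TensorRT %d.%d.%d\n",
                NV_TENSORRT_MAJOR, NV_TENSORRT_MINOR, NV_TENSORRT_PATCH);
    // Version of the libnvinfer loaded at runtime, encoded as
    // major*1000 + minor*100 + patch, e.g. 8502 for 8.5.2.
    std::printf("Runtime libnvinfer version: %d\n", getInferLibVersion());
    return 0;
}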

@rpribau

rpribau commented Dec 5, 2024

Hi, the problem may be the TensorRT version you're running. I had the same issues running the code, since most of it is written against the latest TensorRT version, especially when creating the .engine file. If I remember correctly, the main error occurs in the engine.h file of the TensorRT CPP Library, at this line in particular:

else if (tensorDataType == nvinfer1::DataType::kFP8) {
    throw std::runtime_error("Error, model has unsupported output type");
}

TensorRT didn't add kFP8 support until 8.6.x, so on your 8.5.2 install that enumerator doesn't exist and the build fails. I suggest updating your JetPack to 6.0, which ships a newer TensorRT. If you're still having issues, I'm willing to put together an edit of this code and try to make it work, since I'm currently working with this library for my own project, but I can't promise to have it done within a week or so.
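
If staying on JetPack 5.x is a requirement, one option is to version-guard that branch so the kFP8 comparison is only compiled on TensorRT >= 8.6. A rough sketch, using a standalone helper rather than the library's exact engine.h logic (the supported-type list here is illustrative, not the library's actual set):

#include <NvInfer.h>        // nvinfer1::DataType
#include <NvInferVersion.h> // NV_TENSORRT_MAJOR / NV_TENSORRT_MINOR

// kFP8 only exists from TensorRT 8.6 onward, so the comparison is
// compiled out on older releases such as 8.5.2 (JetPack 5.x).
bool isSupportedOutputType(nvinfer1::DataType t) {
    switch (t) {
    case nvinfer1::DataType::kFLOAT:
    case nvinfer1::DataType::kHALF:
        return true; // illustrative supported set
#if NV_TENSORRT_MAJOR > 8 || (NV_TENSORRT_MAJOR == 8 && NV_TENSORRT_MINOR >= 6)
    case nvinfer1::DataType::kFP8:
        return false; // unsupported output type; the caller can throw
#endif
    default:
        return false;
    }
}

With a guard like that, the same source should build against both TensorRT 8.5 and 8.6+.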
