GPU nvidia #15770
Replies: 10 comments 10 replies
-
The docs tell you how to include models that are downloaded. Have you tried this? https://docs.frigate.video/configuration/object_detectors#generate-models
-
From the documentation I don't understand how to set up the configuration to work with NVIDIA. Is that how it's done? Everything is very vague and unclear, and there are no simple examples.
-
detectors:

2025-01-02 09:14:55.336672830 [INFO] Preparing Frigate...
-
"The model used for TensorRT must be preprocessed on the same hardware platform that they will run on. This means that each user must run additional setup to generate a model file for the TensorRT library. A script is included that will build several common models." It says here that you need to generate the models on the same device, and that there is a script that can do this, but there is no example of how to run it, or of which models are needed and in what format. Can you give me an example of the configuration?
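A minimal sketch of that generation step, based on the generate-models page linked above: the build is triggered by an environment variable on the Frigate container, not by invoking the script by hand. The image tag and model name below are assumptions taken from the docs and from the yolov7-320 model discussed in this thread:

```yaml
# docker-compose.yml (fragment) -- sketch, not a drop-in.
# The TensorRT-enabled image builds the models listed in YOLO_MODELS on
# first start and caches them under /config/model_cache/tensorrt/.
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:stable-tensorrt
    environment:
      - YOLO_MODELS=yolov7-320
    # ...the rest of the usual Frigate service definition (GPU
    # reservation, volumes, ports) stays as in the standard install docs.
```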
-
I think I'm very stupid... it doesn't work, so I think I just don't understand the documentation. Maybe you can tell me specifically? Just linking to the documentation doesn't help me solve the problem; I've read it 200 times already. I need a concrete example.
-
I'm just stuck at the step of making detection work on the GPU; everything else works well. Detection also works well on the CPU, it just doesn't work on the GPU. I bought a video card, but now I regret choosing NVIDIA instead of Intel, since on Intel this is done just by writing a few lines in the config, while for NVIDIA, as I understand it, that doesn't work. And I'm very disappointed that I have to do a lot of things I don't understand in order to make the detector work on NVIDIA.
-
Before I can provide the error log, I need to understand how to correctly set up the detector in the configuration. So far, all I understand is that I need to add these:
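For reference, a sketch of the two config sections the TensorRT detector docs describe. The path and input dimensions here assume the yolov7-320 model; treat them as an example under those assumptions, not a definitive config:

```yaml
# frigate config.yml (fragment) -- sketch assuming yolov7-320
detectors:
  tensorrt:
    type: tensorrt
    device: 0  # GPU index, relevant if the host has more than one NVIDIA card

model:
  # the .trt file that model generation produces inside the container
  path: /config/model_cache/tensorrt/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320   # must match the input size of the generated model
  height: 320
```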
-
Judging by the log, the yolov7-320.trt model cannot be found.
-
I couldn't figure out the problem. I thought the problem was in the configuration file, and it didn't occur to me that I needed to specify the model name when starting Docker; then it automatically generates everything needed. Thank you.
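To spell out the resolution: the model name passed to the container and the model path in the Frigate config have to agree, otherwise the .trt file is never generated and the "cannot be found" error appears. A sketch, assuming the yolov7-320 model from this thread:

```yaml
# Docker side (environment on the frigate container):
#   YOLO_MODELS=yolov7-320        <- requests generation of this model
#
# Frigate config side: point at the file that generation produces
model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
  width: 320
  height: 320
```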
-
I need help making the detector work on an NVIDIA GPU. The documentation says TensorRT should be used everywhere, but it requires models, and I've been struggling for 4 days to build models and they still don't work. How do I make Frigate work on NVIDIA? Or is it possible to download ready-made working models for NVIDIA somewhere?