Sliced Prediction question: How to prevent detection in a bigger area than the slice #1081
Makarand-F4F asked this question in Q&A · Unanswered · 0 replies
Hello,
I am very new to SAHI and have just recently started using it with a trained Detectron2 model. I have specified the sliced prediction and the model as below. The saved model that I am specifying was trained on 200 × 200 pixel images, and I am using the same slice_height and slice_width for prediction.
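For context, the slicing itself is just a sliding window over the image. This is a minimal standalone sketch (not SAHI's actual implementation) of how 200 × 200 windows with an overlap ratio tile a larger image; the function name and overlap default are my own:

```python
def slice_windows(img_w, img_h, slice_w=200, slice_h=200, overlap=0.2):
    """Compute (xmin, ymin, xmax, ymax) slice windows over an image.

    Windows step by slice size minus overlap; edge windows are shifted
    back so every slice is full-size when the image allows it.
    """
    step_x = int(slice_w * (1 - overlap))
    step_y = int(slice_h * (1 - overlap))
    windows = []
    y = 0
    while y < img_h:
        x = 0
        while x < img_w:
            xmax = min(x + slice_w, img_w)
            ymax = min(y + slice_h, img_h)
            windows.append((max(xmax - slice_w, 0), max(ymax - slice_h, 0),
                            xmax, ymax))
            if xmax >= img_w:
                break
            x += step_x
        if ymax >= img_h:
            break
        y += step_y
    return windows

# A 400 × 200 image with 200 × 200 slices and 20% overlap yields three windows.
print(slice_windows(400, 200))
```

Each detection made inside a window is then shifted back into full-image coordinates before the results are combined.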
Now, when I run the predictions, I expect the model to perform object detection on each slice and then combine the results. Sometimes, however, I see a prediction that spans multiple slices and covers a much bigger tile. For examples, see the images below. The first image shows the correct small rectangles detected in each 200 × 200 pixel slice.
In this example image, you can see my detections as expected
In this image, you can see a much larger bounding box around an object detected across an area much bigger than 200 × 200. How do I prevent this?
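As far as I understand SAHI, two things can produce boxes larger than a single slice: `get_sliced_prediction` may also run a standard full-image prediction and merge it in (controlled by a flag such as `perform_standard_pred`, if my reading of the API is right), and the postprocess step greedily merges overlapping detections from adjacent slices. Below is a toy sketch of the latter, using intersection-over-smaller-area (IOS) matching, which I believe is SAHI's default metric; the function names and the 0.5 threshold here are illustrative, not SAHI's code:

```python
def ios(a, b):
    """Intersection area over the smaller box's area, for boxes (x1, y1, x2, y2)."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    smaller = min((a[2] - a[0]) * (a[3] - a[1]),
                  (b[2] - b[0]) * (b[3] - b[1]))
    return inter / smaller if smaller else 0.0

def greedy_merge(boxes, thresh=0.5):
    """Merge boxes whose IOS exceeds thresh into their union box."""
    merged = []
    for box in boxes:
        for i, m in enumerate(merged):
            if ios(box, m) >= thresh:
                # Union of the two boxes can span multiple slices.
                merged[i] = (min(m[0], box[0]), min(m[1], box[1]),
                             max(m[2], box[2]), max(m[3], box[3]))
                break
        else:
            merged.append(box)
    return merged

# Two detections of the same object in adjacent slices (full-image coords):
# their overlap region triggers a merge into one wider box.
print(greedy_merge([(150, 50, 210, 110), (170, 50, 260, 110)]))
```

If this merging is the culprit, the obvious knobs to try are raising `postprocess_match_threshold`, switching `postprocess_match_metric` from IOS to IOU, changing `postprocess_type` (e.g. to plain NMS), or disabling the full-image pass; these parameter names are my assumptions about the SAHI API, so check the current docs.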