
Model compatibility #994

Open
1 task done
rogerlette opened this issue Jan 23, 2025 · 5 comments
Labels
exports (Model exports: ONNX, TensorRT, TFLite, etc.), HUB (Ultralytics HUB issues), question (Further information is requested)

Comments

@rogerlette

Search before asking

Question

Hello,

  • I'm using YoloDotNet 2.0 to run a model in C#.
  • YoloDotNet 2.0 works well with the ready-to-use sample model, pre-trained on the COCO dataset.
  • I exported my own model (YOLOv8s) to ONNX via Ultralytics HUB.
  • It does not work with YoloDotNet 2.0.
  • I tried setting the opset to 17, 15, etc.; it still does not work.
  • It seems the PyTorch version may be the problem (2.5.1 for my own model, 2.1.0 for the sample model).

How can I export or convert my own model with PyTorch 2.1.0, or otherwise make it compatible with YoloDotNet 2.0?

https://www.nuget.org/packages/YoloDotNet/2.2.0#readme-body-tab

Additional

No response

@rogerlette rogerlette added the question Further information is requested label Jan 23, 2025
@UltralyticsAssistant UltralyticsAssistant added exports Model exports (ONNX, TensorRT, TFLite, etc.) HUB Ultralytics HUB issues labels Jan 23, 2025
@UltralyticsAssistant
Member

👋 Hello @rogerlette, thank you for raising this issue about Ultralytics HUB 🚀! We’re excited to help you with your YOLO journey. Please visit our HUB Docs to learn more.

It sounds like you're encountering an issue related to model export compatibility with YoloDotNet 2.0. To assist us in understanding the situation better and providing the most helpful guidance, could you kindly provide the following details?

  1. The specific steps you've taken to export your YOLOv8s model in ONNX format via the HUB, including the export settings.
  2. Any error messages or logs that come up when attempting to use your exported ONNX model with YoloDotNet 2.0.
  3. An example ONNX file or a minimum reproducible example (MRE), as outlined in our troubleshooting guide, to help us replicate and investigate the issue.

Please note that models exported via the HUB are generally compatible with common deployment frameworks, but YoloDotNet 2.0 may have specific or additional requirements. If you believe the PyTorch version is contributing to the issue, let us know the full pipeline details so we can verify and guide you accordingly 😊.

An Ultralytics engineer will review this issue and provide further assistance soon. Thank you for your patience and for being part of the Ultralytics community! 🚀

@rogerlette rogerlette changed the title Model comptaibility Model comptatibility Jan 23, 2025
@rogerlette
Author

OK, an AI is assisting me ;)

  1. I used Ultralytics HUB to export my model, via the "Deploy" tab and the ONNX button.
  2. Yes: "Model not compatible".
  3. You can find the models here: model_cccccc.onnx is the one that does not work, yolov8s.onnx is the one that works: https://we.tl/t-s9mzHrObSL

@pderrenger
Member

@rogerlette hello! 😊

Thank you for sharing your observations and the models for reference. Based on your description, it seems like you’re facing compatibility issues between the ONNX model exported from Ultralytics HUB and YoloDotNet 2.0. Let’s break this down and explore potential solutions:

1. Compatibility of ONNX Models

  • The difference in compatibility might arise due to the PyTorch version used during model export or specific ONNX features introduced in the newer versions.
  • YoloDotNet 2.0 may support ONNX models up to a specific opset version, and newer models exported with higher opset versions may cause issues.

2. Steps to Troubleshoot

  • Check Opset Version: When exporting your yolov8s model from Ultralytics HUB, explicitly set the opset to the version supported by YoloDotNet (e.g., opset=12 or opset=15) during the export. This can often resolve compatibility issues.

    from ultralytics import YOLO
    model = YOLO("yolov8s.pt")
    model.export(format="onnx", opset=12)  # Export to ONNX with a specific opset

    If you’re using Ultralytics HUB, ensure you select the appropriate export settings via the Deploy tab.

  • Simplify the ONNX Model: When exporting from the HUB or CLI, enable the simplify flag. This reduces the model complexity and may improve compatibility.

    model.export(format="onnx", simplify=True)
  • Verify Model with ONNX Runtime: Use ONNX Runtime to confirm if the exported model works independently. You can install ONNX Runtime and run a quick inference test:

    import onnxruntime as ort
    ort_session = ort.InferenceSession("model_cccccc.onnx")
    print("Model loaded successfully!")

3. Potential Adjustments in YoloDotNet

  • Check whether YoloDotNet 2.0 supports YOLOv8 models. If it is limited to earlier versions like YOLOv5, you may need to adjust the model architecture or use a different inference library that supports YOLOv8 ONNX models.

4. Next Steps

  • Re-export your model using the steps above and test it with YoloDotNet 2.0.
  • If you’re still encountering issues, it might be worth testing your model with other ONNX inference libraries (e.g., ONNX Runtime) to confirm if the issue lies with the model or YoloDotNet.

Let me know how it goes! I’m here to assist further if needed. 🚀

@rogerlette
Author

Hello, is there a human here to answer questions?

  1. Sure, that's what I think: using Netron I can see that the working model I sent you (yolov8s.onnx) was built with PyTorch 2.1.0, while the model I export with Ultralytics HUB uses PyTorch 2.5.1.

  2. I did try, and tried again, different opset versions; it does not work any better. I also tried the simplified model with different opsets. Every time I used ONNX Runtime to check the model, the test passed.

  3. YoloDotNet is compatible with YOLOv8s; I'm sure of it because the sample model I sent you (yolov8s.onnx) is a YOLOv8s model, as its name suggests.

  4. That's exactly what I did.

So it's still not working. Thanks for your help.

@pderrenger
Member

Hello @rogerlette,

Thank you for the detailed follow-up! I can see you've been quite thorough in your troubleshooting, so let's narrow this down further to try to resolve the issue. Here's a breakdown based on your observations:


Key Observations:

  1. PyTorch Version Difference: The working model (yolov8s.onnx) uses PyTorch 2.1.0, while your HUB-exported model uses PyTorch 2.5.1. This might result in differences in the ONNX graph representation, potentially causing compatibility issues with YoloDotNet.
  2. ONNX Runtime Validation Passes: Since ONNX Runtime successfully loads and validates the HUB-exported model, the issue seems specific to YoloDotNet's compatibility with the exported ONNX graph.
  3. YoloDotNet Compatibility Confirmed: YoloDotNet supports YOLOv8 models, so the issue is likely tied to the differences in export configurations or ONNX graph structure between the two models.

Suggested Steps to Resolve:

1. Force Export with PyTorch 2.1.0:

  • Since the working model (yolov8s.onnx) was likely exported using PyTorch 2.1.0, you can replicate this environment by downgrading PyTorch in a local setup.
  • Use the following commands to set up the environment:

    pip install torch==2.1.0 torchvision --index-url https://download.pytorch.org/whl/cu117
    pip install ultralytics

    Then re-export the model:

    from ultralytics import YOLO

    model = YOLO("path/to/your_model.pt")
    model.export(format="onnx", opset=12, simplify=True)  # Use opset 12 or 15
  • Test this newly exported model with YoloDotNet.

2. Compare ONNX Graphs:

  • Use Netron to compare the structure of the working model (yolov8s.onnx) with the HUB-exported model (model_cccccc.onnx). Focus on:
    • Operator versions and types.
    • Differences in input/output dimensions or tensor names.
  • If you notice any differences, it could be a clue as to what YoloDotNet is struggling with.

3. Re-export with HUB Options:

  • While exporting through Ultralytics HUB, ensure the following:
    • Set the opset version explicitly (e.g., opset 12 or 15).
    • Enable the "Simplify" option during export to reduce graph complexity.
  • Test the updated ONNX export with YoloDotNet.

4. Debug YoloDotNet:

  • If the export steps above do not resolve the issue, the problem might be specific to how YoloDotNet parses the ONNX model. You could:
    • Check YoloDotNet's documentation or source code for any constraints on ONNX operator versions or graph structures.
    • Share specific error messages or logs from YoloDotNet when loading the HUB-exported model to better understand the failure.

Final Notes:

If the issue persists after trying these steps, I recommend reaching out to the YoloDotNet maintainers to clarify any specific ONNX graph requirements or constraints they might have. The HUB-exported model passes ONNX Runtime validation, which indicates that it adheres to ONNX standards.

Let me know how it goes or if you need further assistance! 😊
