First of all, thank you so much for this wonderful library. As I followed the example, I wondered how it would be possible to convert unimplemented models to ONNX. I saw that someone converted one in discussion #160 (comment).
Hi @NMZ0429, thanks for opening this ticket.

We've added support for yolov5x and yolov5x6 in #261. I haven't added their definitions in `models/__init__.py` yet, since we haven't uploaded the translated models (we'll upload the translated PyTorch checkpoints this week), but you can find them in `models/yolo.py`:

https://github.com/zhiqwang/yolov5-rt-stack/blob/3485ea144fec7b2857d0d2e0d4ff329959e77027/yolort/models/yolo.py#L31-L32

And you can now load yolov5x6 as follows (see `notebooks/export-onnx-inference-onnxruntime.ipynb`):

```python
from yolort.models import YOLOv5

# 'yolov5x6.pt' is downloaded from https://github.com/ultralytics/yolov5/releases/download/v6.0/yolov5x6.pt
ckpt_path_from_ultralytics = "yolov5x6.pt"
model = YOLOv5.load_from_yolov5(ckpt_path_from_ultralytics, score_thresh=0.25)
model.eval()

img_path = "test/assets/bus.jpg"
predictions = model.predict(img_path)
```

Besides, we provide a CLI tool to export ONNX models trained with ultralytics/yolov5 in https://github.com/zhiqwang/yolov5-rt-stack/blob/main/tools/export_model.py; I guess you could use that script to export the ONNX model as well.