v0.3.0 Linux GPU: TensorRT

Released by @DefTruth on 14 Oct 04:56 · 5 commits to main since this release · d4af41c

New Features: NVIDIA GPU Inference support via TensorRT

🎉🎉 TensorRT: Boost inference performance on NVIDIA GPUs via TensorRT.

Run `bash ./build.sh tensorrt` to build lite.ai.toolkit with TensorRT support, then test YOLOv5 with the code below. NOTE: lite.ai.toolkit needs TensorRT 10.x (or later) and CUDA 12.x (or later). Please check build.sh, tensorrt-linux-x86_64-install.zh.md, test_lite_yolov5.cpp, and NVIDIA/TensorRT for more details.

```c++
// trtexec --onnx=yolov5s.onnx --saveEngine=yolov5s.engine
auto *yolov5 = new lite::trt::cv::detection::YOLOV5(engine_path);
```
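As a fuller sketch, the snippet above can be extended into an end-to-end detection program. This is an assumption-based example: the `detect` signature, the `lite::types::Boxf` type, and `lite::utils::draw_boxes_inplace` mirror lite.ai.toolkit's ONNXRuntime examples and may differ in the TensorRT backend; the file paths are placeholders.

```cpp
// Hedged sketch of YOLOv5 TensorRT inference with lite.ai.toolkit.
// Assumptions: detect() and lite::types::Boxf follow the toolkit's
// ONNXRuntime-style API; engine/image paths are placeholders.
#include "lite/lite.h"

int main() {
  std::string engine_path = "yolov5s.engine"; // built via trtexec above
  std::string img_path = "test.jpg";          // hypothetical input image

  // Load the serialized TensorRT engine.
  auto *yolov5 = new lite::trt::cv::detection::YOLOV5(engine_path);

  // Run GPU inference and collect detected boxes.
  std::vector<lite::types::Boxf> detected_boxes;
  cv::Mat img_bgr = cv::imread(img_path);
  yolov5->detect(img_bgr, detected_boxes);

  // Draw results and save (assumed utility from the toolkit).
  lite::utils::draw_boxes_inplace(img_bgr, detected_boxes);
  cv::imwrite("result.jpg", img_bgr);

  delete yolov5;
  return 0;
}
```

See test_lite_yolov5.cpp in the repository for the authoritative version of this flow.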
| Class  | Class           | Class           | Class      | Class      | System | Engine   |
| ------ | --------------- | --------------- | ---------- | ---------- | ------ | -------- |
| YOLOv5 | YOLOv6          | YOLOv8          | YOLOv8Face | YOLOv5Face | Linux  | TensorRT |
| YOLOX  | YOLOv5BlazeFace | StableDiffusion | /          | /          | Linux  | TensorRT |

What's Changed

New Contributors

Full Changelog: v0.2.0...v0.3.0-rc1