BiRefNet TensorRT

中文 (Chinese README)


C++ inference of BiRefNet based on TensorRT.

Example images: Source, DichotomousImage-Gray, DichotomousImage-Pseudo

📑 Records

  • 2024-08-27: Add BiRefNet TensorRT Version.

⏱️ Performance

The inference time includes the pre-processing and post-processing stages:

Device (System) | Model | Model Input (W×H) | Image Resolution (W×H) | Inference Time (ms)
RTX 3080 (Windows 11) | BiRefNet-general-bb_swin_v1_tiny-epoch_232.pth | 1920x1080 | (1920x2)x1080 | 130
RTX A5500 (Ubuntu) | BiRefNet-general-bb_swin_v1_tiny-epoch_232.pth | 3577x2163 | (3577x2)x2163 | 120
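
The pre-processing counted in these timings is essentially resizing the frame to the model's input resolution and normalizing it before the copy to the GPU. Below is a minimal OpenCV sketch of that stage; the RGB channel order and the ImageNet mean/std values are assumptions carried over from the upstream BiRefNet repo rather than settings read from this project's source, and the input size should match whatever size the ONNX model was exported with.

    // Hedged sketch of typical BiRefNet-style pre-processing with OpenCV.
    // The ImageNet mean/std and the RGB channel order are assumptions taken
    // from the upstream BiRefNet training code, not from this repository.
    #include <opencv2/opencv.hpp>
    #include <vector>

    std::vector<float> preprocess(const cv::Mat& bgr, int inputW, int inputH) {
        cv::Mat rgb, resized, f32;
        cv::cvtColor(bgr, rgb, cv::COLOR_BGR2RGB);           // model expects RGB
        cv::resize(rgb, resized, cv::Size(inputW, inputH));  // scale to the model input
        resized.convertTo(f32, CV_32FC3, 1.0 / 255.0);       // [0, 255] -> [0, 1]

        const float mean[3] = {0.485f, 0.456f, 0.406f};      // ImageNet mean (assumed)
        const float stdv[3] = {0.229f, 0.224f, 0.225f};      // ImageNet std (assumed)

        // Repack HWC -> CHW while normalizing; CHW is the layout the engine expects.
        std::vector<float> blob(3 * inputW * inputH);
        for (int c = 0; c < 3; ++c)
            for (int y = 0; y < inputH; ++y)
                for (int x = 0; x < inputW; ++x)
                    blob[(c * inputH + y) * inputW + x] =
                        (f32.at<cv::Vec3f>(y, x)[c] - mean[c]) / stdv[c];
        return blob;
    }

Post-processing is the inverse step: copy the single-channel mask back from the GPU, scale it to 0-255, and resize it to the original resolution.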

🛠️ Libraries

  1. Install TensorRT following the official TensorRT installation guide.

    Click here for Windows guide
    1. Download the TensorRT zip file that matches the Windows version you are using.
    2. Choose where you want to install TensorRT. The zip file will install everything into a subdirectory called TensorRT-10.x.x.x. This new subdirectory will be referred to as <installpath> in the steps below.
    3. Unzip the TensorRT-10.x.x.x.Windows10.x86_64.cuda-x.x.zip file to the location that you chose, where:
    • 10.x.x.x is your TensorRT version
    • cuda-x.x is CUDA version 12.4, 11.8, or 12.0
    4. Add the TensorRT library files to your system PATH. To do so, copy the DLL files from <installpath>/lib to your CUDA installation directory, for example, C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\vX.Y\bin, where vX.Y is your CUDA version. The CUDA installer should have already added the CUDA path to your system PATH.

    Click here for installing TensorRT on Linux.

  2. Download and install any recent OpenCV for Windows.

  3. Modify the TensorRT and OpenCV paths in CMakeLists.txt:

    # Find and include OpenCV
    set(OpenCV_DIR "your path to OpenCV")
    find_package(OpenCV REQUIRED)
    include_directories(${OpenCV_INCLUDE_DIRS})
    
    # Set TensorRT path if not set in environment variables
    set(TENSORRT_DIR "your path to TensorRT")
    
  4. Build the project using the following commands or cmake-gui (Windows).

    1. Windows:
       mkdir build
       cd build
       cmake ..
       cmake --build . --config Release
    2. Linux (not tested):
       mkdir build
       cd build && mkdir out_dir
       cmake ..
       make
  5. Finally, copy the OpenCV DLL files, such as opencv_world490.dll and opencv_videoio_ffmpeg490_64.dll, into the <BiRefNet_install_path>/build/Release folder.

🤖 Model Preparation

Perform the following steps to create an ONNX model:

  1. Download the pretrained model and install BiRefNet:

    git clone https://github.com/ZhengPeng7/BiRefNet.git
    cd BiRefNet
    
    # create a new conda environment
    conda create -n BiRefNet python=3.8
    conda activate BiRefNet
    pip install torch torchvision
    pip install opencv-python
    pip install onnx
    
    pip install -r requirements.txt
    
    # copy the model and conversion scripts to the root of BiRefNet
    cp path_to_BiRefNet-general-bb_swin_v1_tiny-epoch_232.pth . 
    cp cpp/py pth2onnx.py .
    cp cpp/py deform_conv2d_onnx_exporter.py .
  2. Export the model to ONNX format using pth2onnx.py.

    python pth2onnx.py

Tip

You can modify the size of the input and output images, e.g. 512×512.

🚀 Quick Start

C++

  • Step 1: Create an engine from an ONNX model and save it:
trtexec --onnx=BiRefNet-general-bb_swin_v1_tiny-epoch_232.onnx --saveEngine=BiRefNet-general-bb_swin_v1_tiny-epoch_232.engine

Note

If you want to accelerate inference, you can pass the --fp16 flag to trtexec to build the engine in half precision.

  • Step 2: Deserialize the engine. Once you've built your engine, subsequent runs simply load the engine file (a minimal C++ sketch of this step follows the examples below):
BiRefNet.exe <engine> <input image or video>

Example:

# infer image
BiRefNet.exe BiRefNet-general-bb_swin_v1_tiny-epoch_232.engine test.jpg
# infer a folder of images
BiRefNet.exe BiRefNet-general-bb_swin_v1_tiny-epoch_232.engine data
# infer video
BiRefNet.exe BiRefNet-general-bb_swin_v1_tiny-epoch_232.engine test.mp4 
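
For reference, deserializing the engine in Step 2 follows the standard TensorRT C++ API pattern shown below. This is an illustrative sketch, not this repository's actual source: the logger class, file name, and error handling are placeholders, and buffer allocation plus the actual inference call are omitted.

    // Minimal sketch: read a serialized engine from disk and deserialize it.
    // Illustrative only; names and error handling are placeholders.
    #include <NvInfer.h>
    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <vector>

    // TensorRT requires an ILogger implementation.
    class Logger : public nvinfer1::ILogger {
        void log(Severity severity, const char* msg) noexcept override {
            if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
        }
    };

    int main() {
        // Read the engine file produced by trtexec.
        std::ifstream file("BiRefNet-general-bb_swin_v1_tiny-epoch_232.engine", std::ios::binary);
        std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                               std::istreambuf_iterator<char>());

        Logger logger;
        nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
        nvinfer1::ICudaEngine* engine =
            runtime->deserializeCudaEngine(blob.data(), blob.size());
        nvinfer1::IExecutionContext* context = engine->createExecutionContext();

        // ... allocate device buffers, copy the pre-processed input in,
        //     run inference on a CUDA stream, and copy the mask back ...

        delete context;
        delete engine;
        delete runtime;
        return 0;
    }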

👏 Acknowledgement

This project is based on the following projects: