
How can inference be run at a smaller size instead of 1024? #107

Open
juntaosun opened this issue Oct 16, 2024 · 6 comments
Labels
help wanted Extra attention is needed

Comments

@juntaosun

BiRefNet_lite's default input size is 1024x1024.
If you run inference on a smaller image (below 1024), it gets upscaled to 1024.

Question:
Does the model support inference at smaller input sizes such as 768x768 or 512x512? How can this be changed?
Would a smaller size also speed things up? Thanks!

@ZhengPeng7
Owner

Yes, of course it's supported. Just modify the resize preprocessing in the inference data loader.
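The change above can be sketched as follows. This is a minimal, hypothetical illustration (the function and constant names are not from the repo): the only thing that matters is the target size passed to the resize step of the inference preprocessing.

```python
from PIL import Image

# Hypothetical sketch, not BiRefNet's actual loader code: lower the
# target resolution of the resize step from 1024x1024 to 512x512.
TARGET_SIZE = (512, 512)  # instead of (1024, 1024)

def preprocess(path):
    img = Image.open(path).convert("RGB")
    # Resize to the smaller inference resolution before any
    # normalization / tensor conversion the pipeline does next.
    return img.resize(TARGET_SIZE, Image.BILINEAR)
```

The same idea applies if the repo uses `torchvision.transforms.Resize` in a `Compose` pipeline: change the size argument there.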

@juntaosun
Author

@ZhengPeng7 I successfully exported ONNX with BiRefNet_pth2onnx.ipynb.

But I ran into a new problem:
(1) Although the ONNX export declares dynamic inputs, the input size is still fixed at inference time; the model cannot handle dynamic input shapes.


input = torch.randn(1, 3, 512, 512).to(device)

torch.onnx.export(
    net,
    (input,),
    file_name,
    verbose=False,
    opset_version=17,
    input_names=['input_image'],
    output_names=['output_image'],
    dynamic_axes={  # declare dynamic axes at export time
        "input_image":  {0: 'b', 1: '3', 2: "h", 3: "w"},
        "output_image": {0: 'b', 1: '1', 2: "h", 3: "w"},
    },
)

When running inference at any other size, it errors out saying the input must be the fixed size of 512 (the dynamic_axes have no effect).
[E:onnxruntime:, sequential_executor.cc:516 onnxruntime::ExecuteKernel] Non-zero status code returned while running Split node. Name:'/decoder/Split_64' Status Message: Cannot split using values in 'split' attribute. Axis=-1 Input shape={3,1024,1024} NumOutputs=1 Num entries in 'split' (must equal number of outputs) was 1 Sum of sizes in 'split' (must equal size of selected axis) was 512

I hope this can be fixed so that dynamic ONNX input sizes are supported, thanks!
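Until dynamic axes work, one workaround is to treat the exported graph as fixed-size: resize every input to the export resolution before running the ONNX session, then resize the predicted mask back to the source size. A minimal sketch (the helper names are hypothetical; the `input_image`/`output_image` names come from the export call above):

```python
import numpy as np
from PIL import Image

EXPORT_SIZE = (512, 512)  # the fixed size baked into the exported graph

def to_model_input(img: Image.Image) -> np.ndarray:
    """Resize to the export resolution and lay out as NCHW float32."""
    resized = img.resize(EXPORT_SIZE, Image.BILINEAR)
    arr = np.asarray(resized, dtype=np.float32) / 255.0
    return arr.transpose(2, 0, 1)[None]  # HWC -> 1x3xHxW

def mask_to_original_size(mask: np.ndarray, size: tuple) -> Image.Image:
    """Resize a 1xHxW mask in [0, 1] back to the source image size."""
    m = Image.fromarray((mask.squeeze() * 255).astype(np.uint8))
    return m.resize(size, Image.BILINEAR)
```

With an `onnxruntime.InferenceSession` named `sess`, usage would look like `mask, = sess.run(['output_image'], {'input_image': to_model_input(img)})` followed by `mask_to_original_size(mask[0], img.size)` (normalization details depend on how the model was trained, so adjust `to_model_input` accordingly).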

@ZhengPeng7
Owner

Thanks for pointing this out and trying it! To be honest, I'm not very familiar with ONNX either. I just took a look, and the approach seems to be exactly what you described; from the error, it looks like a problem with torch.split(). I'll try to fix it when I have time, but please don't count on it for now... I'll @ you here if there's any progress.
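The suspected cause can be reproduced in isolation. This is a minimal stand-in module, not BiRefNet's actual decoder: during ONNX export (which traces the model), a split size computed from `x.shape` evaluates to a plain Python int and is baked into the graph as a constant, so the resulting `Split` node rejects any other width even when `dynamic_axes` marks that axis as dynamic.

```python
import torch

class HalfSplit(torch.nn.Module):
    # Hypothetical repro, not the real decoder: x.shape[-1] // 2 is a
    # plain Python int during tracing (512 for a 1024-wide example
    # input), so torch.onnx.export records a constant 'split' size.
    def forward(self, x):
        left, right = torch.split(x, x.shape[-1] // 2, dim=-1)
        return left

m = HalfSplit()
y = m(torch.randn(1, 3, 4, 1024))  # eager mode works for any width
```

A shape-agnostic alternative that usually survives export with dynamic axes is `torch.chunk(x, 2, dim=-1)`, which expresses "two equal parts" rather than a concrete size.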

@minushuang

Yes, of course it's supported. Just modify the resize preprocessing in the inference data loader.

Hello @ZhengPeng7, must the inference input size be the same as the training input size, or can they differ?
E.g., 1600x1200 for inference when training used 1024x1024.

@ZhengPeng7
Owner

Sure, they can differ. But if you use ONNX, there may be restrictions on this when exporting the ONNX models.

@alfausa1

Hi,
I was wondering if there are any updates on this. I’m particularly interested in support for dynamic_axes when converting to ONNX. Looking forward to any news—thanks in advance!

@ZhengPeng7 ZhengPeng7 added the help wanted Extra attention is needed label Nov 19, 2024