How can I run inference at a smaller size instead of 1024? #107
Comments
Yes, of course that's supported. Just change the resize preprocessing in the data loader used for inference.
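A minimal sketch of what that preprocessing change could look like, assuming a torch-style pipeline with ImageNet normalization; the `preprocess` helper and the target size are illustrative, not the repo's actual code:

```python
import torch
import torch.nn.functional as F

# ImageNet normalization constants, as commonly used by BiRefNet-style models.
MEAN = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
STD = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

def preprocess(img, size=(512, 512)):
    """Resize a (3, H, W) float tensor in [0, 1] to `size` and normalize.

    `size` is the knob to change: (1024, 1024) is the default; smaller
    values such as (768, 768) or (512, 512) trade accuracy for speed.
    """
    x = F.interpolate(img.unsqueeze(0), size=size,
                      mode="bilinear", align_corners=False)
    return (x - MEAN) / STD
```

The model output is then typically resized back to the original image resolution for the final mask.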
@ZhengPeng7 I successfully exported an ONNX model with BiRefNet_pth2onnx.ipynb, but ran into a new problem:
inference at any other size fails with an error saying the input must be a fixed size of 512 (dynamic_axes has no effect). It would be great if this could be fixed so the ONNX model supports dynamic input sizes, thanks~
Thanks for pointing this out and trying it! I'm actually not very familiar with ONNX either. I just searched, and the approach seems to be exactly what you described; from the error it seems...
Hello @ZhengPeng7, must the inference input size be the same as the training input size, or can they differ?
Sure, they can differ. But if you use ONNX, there might be some restrictions when exporting the ONNX models.
Hi,
The default input size of BiRefNet_lite is 1024x1024.
If you run inference on a smaller image (below 1024), it gets resized up to 1024.
Questions:
Does the model support inference at smaller input sizes such as 768x768 or 512x512? How do I modify it?
Would reducing the size also improve speed? Thanks~