_trt.shape overflow when converting from PyTorch to TRT #927
I've generated the TRT engine successfully through ONNX.
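For context, a minimal sketch of what that ONNX route can look like; the model, input shape, file names, and trtexec flags are illustrative stand-ins, not details from the original report:

```python
import torch
from torchvision.models import resnet18

# Stand-in model and input; the reporter's actual model is not public here.
model = resnet18().eval().cuda()
x = torch.randn(1, 3, 224, 224).cuda()

# Step 1: export the PyTorch model to ONNX.
torch.onnx.export(
    model, x, "model.onnx",
    opset_version=17,
    input_names=["input"],
    output_names=["output"],
)

# Step 2: build a TensorRT engine from the ONNX file, e.g. with trtexec:
#   trtexec --onnx=model.onnx --saveEngine=model.engine --fp16
```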
Thanks for reaching out, and apologies for the delay. Glad to hear ONNX worked for you. FYI — not sure if this is what you did, but you can also run the conversion through torch2trt's ONNX path. In case the issue re-appears and I need to reproduce it, is this a publicly available model? Best,
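Possibly what the reply above is pointing at is torch2trt's ONNX-based conversion path; a minimal sketch, assuming the `use_onnx` flag that ships with torch2trt 0.5.0 (the model and input are stand-ins):

```python
import torch
from torchvision.models import resnet18
from torch2trt import torch2trt

# Stand-in model and input for illustration only.
model = resnet18().eval().cuda()
x = torch.randn(1, 3, 224, 224).cuda()

# use_onnx=True routes the conversion through the ONNX parser instead of the
# per-operator Python converters, which avoids the '_trt' attribute bookkeeping.
model_trt = torch2trt(model, [x], use_onnx=True)
```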
@jaybdub
Before the operator conversion, some input tensors already had an attribute called '_trt'.
To deal with it, I deleted these incorrect '_trt' attributes manually in the related converter functions in “native_converters.py”.
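For illustration, a minimal sketch of that kind of cleanup, assuming plain torch.Tensor inputs and that the stale field is exactly the `_trt` attribute torch2trt attaches during tracing; this mirrors the manual edit described above rather than a supported API:

```python
import torch

def clear_stale_trt(tensors):
    """Drop leftover '_trt' attributes from input tensors before a converter runs.

    Manual workaround only; '_trt' is the field torch2trt attaches to tensors
    it has already mapped to TensorRT layers.
    """
    for t in tensors:
        if isinstance(t, torch.Tensor) and hasattr(t, "_trt"):
            del t._trt
```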
However, I encountered some errors when doing this, and the message says:
Of course, the TRT module was not generated successfully, and when I tried to save its state dict, it said:
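For reference, the save/load pattern that torch2trt documents looks roughly like this; `model_trt` is assumed to be a successfully converted module, which is exactly the step that fails here:

```python
import torch
from torch2trt import TRTModule

# Assumes 'model_trt' is a successfully converted module (see the sketch above);
# saving fails in this report precisely because the conversion never completed.
torch.save(model_trt.state_dict(), "model_trt.pth")

# Reloading later into a fresh TRTModule:
model_trt = TRTModule()
model_trt.load_state_dict(torch.load("model_trt.pth"))
```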
Finally, here's my environment:
PyTorch 2.3.0+cu121
torch2trt 0.5.0
TensorRT 8.6.1.6 (I also tried TensorRT 10.0.1.6; after renaming some converting methods, I hit the same issue)
Since this is my first issue on GitHub, I'd really appreciate it if someone could help or offer a clue to solving it!