Internal Error failure of TensorRT 8.6.1 when running tensorrt.OnnxParser.parse_from_file on GPU NVIDIA GeForce RTX 3060 #3728
Labels: triaged (issue has been triaged by maintainers)
Comments
dspyz-matician changed the title on Mar 21, 2024, fixing the typo "Intenral Error" → "Internal Error".
Looks like a torch-onnx model; it exports an invalid model.
Could you please file a bug against pytorch? Thanks!
Sorry I took so long to get around to this: pytorch/pytorch#123353
Closing since there has been no activity for more than 3 weeks, per our policy. Thanks, all!
Description
I have a simple model that I saved to ONNX, and the OnnxParser then failed with an internal error (ICE):
[03/21/2024-12:12:47] [TRT] [E] ModelImporter.cpp:774: --- End node ---
[03/21/2024-12:12:47] [TRT] [E] ModelImporter.cpp:777: ERROR: ModelImporter.cpp:195 In function parseGraph:
[6] Invalid Node - /Where
[graphShapeAnalyzer.cpp::checkCalculationStatusSanity::1503] Error Code 2: Internal Error (Assertion !isPartialWork(p.second.outputExtents) failed. )
Code is below; the ONNX file is attached.
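
For reference, the model's structural validity can be checked, and the failing Where node located, with the onnx Python package. This is a minimal sketch; the file name is assumed from the attached archive:

```python
import onnx

# File name assumed from the attached traversability.zip archive.
model = onnx.load("traversability.onnx")

# Structural validation: raises onnx.checker.ValidationError if the graph is malformed.
onnx.checker.check_model(model)

# Locate the Where node that the TensorRT parser rejects.
for node in model.graph.node:
    if node.op_type == "Where":
        print(node.name, list(node.input), list(node.output))
```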
Environment
TensorRT Version: 8.6.1
NVIDIA GPU: NVIDIA GeForce RTX 3060
NVIDIA Driver Version: 550.54.14
CUDA Version: 12.4
CUDNN Version: 8.9.7
Operating System: Debian 11
Python Version (if applicable): 3.9.2
PyTorch Version (if applicable): 2.2.1+cu121
Relevant Files
Model link:
traversability.zip
Steps To Reproduce
Commands or scripts:
Generating the onnx model:
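
A minimal sketch of a typical export; the model body, input shape, and opset version are illustrative assumptions, not the reporter's actual script:

```python
import torch

# Hypothetical stand-in for the reporter's model; the real architecture is in
# the attached traversability.zip and is not reproduced here.
class TraversabilityNet(torch.nn.Module):
    def forward(self, x):
        # torch.where typically lowers to an ONNX Where node like the one
        # the parser rejects.
        return torch.where(x > 0, x, torch.zeros_like(x))

model = TraversabilityNet().eval()
dummy_input = torch.randn(1, 3, 64, 64)  # shape assumed for illustration

torch.onnx.export(
    model,
    dummy_input,
    "traversability.onnx",
    opset_version=17,  # opset assumed
    input_names=["input"],
    output_names=["output"],
)
```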
Converting the onnx model:
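
The conversion step, using the TensorRT 8.6 Python API, would look roughly like the sketch below; the ONNX file name is assumed from the attached archive:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.ERROR)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

# parse_from_file is where the internal error above is raised.
if not parser.parse_from_file("traversability.onnx"):
    for i in range(parser.num_errors):
        print(parser.get_error(i))
```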
Can this model run on other frameworks?:
It works just fine when running it directly from Python:
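
"Directly from Python" presumably means onnxruntime or the original PyTorch module; a minimal sketch using onnxruntime, with the input name and shape assumed:

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "traversability.onnx", providers=["CPUExecutionProvider"]
)

# Input name and shape assumed; the real ones can be read from session.get_inputs().
dummy = np.random.randn(1, 3, 64, 64).astype(np.float32)
outputs = session.run(None, {"input": dummy})
print([o.shape for o in outputs])
```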