[e2e][onnx][model] onnx.Expand #636
Maybe this is an issue with the importer? Relevant lines of imported IR:

```mlir
%305 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__1> : tensor<si64>} : () -> !torch.vtensor<[],si64>
%648 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__246> : tensor<si64>} : () -> !torch.vtensor<[],si64>
%2001 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__1054> : tensor<1xsi64>} : () -> !torch.vtensor<[1],si64>
%2002 = torch.operator "onnx.Unsqueeze"(%2000, %2001) : (!torch.vtensor<[2,?,?],f32>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[1,2,?,?],f32>
%2003 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__1055> : tensor<1xsi64>} : () -> !torch.vtensor<[1],si64>
%2004 = torch.operator "onnx.Unsqueeze"(%648, %2003) : (!torch.vtensor<[],si64>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[1],si64>
%2005 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__1056> : tensor<1xsi64>} : () -> !torch.vtensor<[1],si64>
%2006 = torch.operator "onnx.Unsqueeze"(%305, %2005) : (!torch.vtensor<[],si64>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[1],si64>
%2007 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__1057> : tensor<1xsi64>} : () -> !torch.vtensor<[1],si64>
%2008 = torch.operator "onnx.Unsqueeze"(%305, %2007) : (!torch.vtensor<[],si64>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[1],si64>
%2009 = torch.operator "onnx.Constant"() {torch.onnx.value = dense_resource<__1058> : tensor<1xsi64>} : () -> !torch.vtensor<[1],si64>
%2010 = torch.operator "onnx.Unsqueeze"(%305, %2009) : (!torch.vtensor<[],si64>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[1],si64>
%2011 = torch.operator "onnx.Concat"(%2004, %2006, %2008, %2010) {torch.onnx.axis = 0 : si64} : (!torch.vtensor<[1],si64>, !torch.vtensor<[1],si64>, !torch.vtensor<[1],si64>, !torch.vtensor<[1],si64>) -> !torch.vtensor<[4],si64>
%2039 = torch.operator "onnx.Shape"(%2011) : (!torch.vtensor<[4],si64>) -> !torch.vtensor<[1],si64>
%2040 = torch.operator "onnx.ConstantOfShape"(%2039) {torch.onnx.value = dense_resource<__1071> : tensor<1xsi64>} : (!torch.vtensor<[1],si64>) -> !torch.vtensor<[?],si64>
%2041 = torch.operator "onnx.Expand"(%2002, %2040) : (!torch.vtensor<[1,2,?,?],f32>, !torch.vtensor<[?],si64>) -> !torch.vtensor<[],f32>
{-#
  dialect_resources: {
    builtin: {
      __1: "0x080000000100000000000000",
      __246: "0x080000000100000000000000",
      __1055: "0x080000000000000000000000",
      __1056: "0x080000000000000000000000",
      __1057: "0x080000000000000000000000",
      __1058: "0x080000000000000000000000",
      __1071: "0x080000000100000000000000",
    }
  }
#-}
```

I tried updating the opset version to see if there is any new shape inference for these ops, but I still encountered the same issue. It seems like a weird expand anyway, unless `__1071` actually represents 2 and the dynamic dims of `%2002` are actually either 1 or 2.
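For reference, the `dense_resource` blobs above can be decoded by hand. A minimal Python sketch, assuming the printed hex string starts with a 4-byte little-endian alignment word (as MLIR's resource printer emits) followed by little-endian `si64` payload data:

```python
import struct

def decode_i64_blob(hex_str):
    """Decode an MLIR dense_resource hex blob into a list of int64 values.

    Assumes the first 4 bytes are a little-endian alignment word and the
    remainder is little-endian int64 data (an assumption about the printed
    blob format, not something the IR dump states explicitly).
    """
    raw = bytes.fromhex(hex_str[2:])          # strip the "0x" prefix
    payload = raw[4:]                          # skip the alignment word
    count = len(payload) // 8                  # number of int64 elements
    return list(struct.unpack(f"<{count}q", payload))

# Blobs copied from the dialect_resources section above.
blobs = {
    "__1":    "0x080000000100000000000000",
    "__246":  "0x080000000100000000000000",
    "__1055": "0x080000000000000000000000",
    "__1071": "0x080000000100000000000000",
}
for name, hx in blobs.items():
    print(name, decode_i64_blob(hx))
```

Under that assumption, each blob decodes to a single scalar (0 or 1), which is one way to sanity-check what `ConstantOfShape` is filling `%2040` with.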
It looks like I am able to unblock this issue with the change in torch-mlir PR#3280. I'm not sure if there is any reason not to include data propagation with onnx shape inference. If there is some reason to generally avoid using it, I can change it to a command-line flag for the onnx importer.
Signature to be supported in the RAFT_vaiq_int8 model:

```mlir
%2041 = torch.operator "onnx.Expand"(%2002, %2040) : (!torch.vtensor<[1,2,?,?],f32>, !torch.vtensor<[?],si64>) -> !torch.vtensor<[],f32>
```

The following signature also exists in the RAFT_vaiq_int8 model, but it should already be supported by the current onnx.Expand implementation:

```mlir
%2062 = torch.operator "onnx.Expand"(%2061, %2058) : (!torch.vtensor<[?,?],si64>, !torch.vtensor<[2],si64>) -> !torch.vtensor<[?,?],si64>
```
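For context on why the rank-0 result type (`!torch.vtensor<[],f32>`) on the first `onnx.Expand` looks suspicious: ONNX `Expand` follows numpy-style broadcasting, so the output rank is the maximum of the input rank and the length of the shape operand. A small numpy sketch illustrating this (the concrete sizes here are stand-ins, not values taken from the model):

```python
import numpy as np

# Expand(x, shape) broadcasts x against a tensor of the given shape.
# Even when the shape operand has fewer entries than x has dims, the
# result keeps x's rank -- it can never collapse to a rank-0 tensor.
x = np.zeros((1, 2, 3, 4), dtype=np.float32)   # stand-in for %2002
target = np.ones((1,), dtype=np.float32)        # stand-in for Expand's shape [1]
out = x * target                                # numpy-style broadcast
print(out.shape)
```

So regardless of what `%2040` holds at runtime, a rank-4 input should produce a rank-4 (or higher) result, which suggests the imported result type is the product of lost shape information rather than actual Expand semantics.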
To reproduce this error: