
Non-zero status code returned while running Trilu node. #27

Open
zhanweiw opened this issue Sep 3, 2023 · 4 comments

Comments

@zhanweiw

zhanweiw commented Sep 3, 2023

I ran this code on my Lenovo T14s (x64, Windows 11) with the DirectML execution provider and got the error below. Could you help me figure out the cause? Thanks in advance!
https://github.com/cassiebreviu/StableDiffusion/blob/direct-ML-EP

Microsoft.ML.OnnxRuntime.OnnxRuntimeException
  HResult=0x80131500
  Message=[ErrorCode:RuntimeException] Non-zero status code returned while running Trilu node. Name:'Trilu_233' Status Message: D:\a\_work\1\s\onnxruntime\core\providers\dml\DmlExecutionProvider\src\MLOperatorAuthorImpl.cpp(2340)\onnxruntime.DLL!00007FF9AB78E70A: (caller: 00007FF9AB78DC5B) Exception(3) tid(7118) 80070057 The parameter is incorrect.

  Source=Microsoft.ML.OnnxRuntime
  StackTrace:
   at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
   at Microsoft.ML.OnnxRuntime.InferenceSession.RunImpl(RunOptions options, IntPtr[] inputNames, IntPtr[] inputValues, IntPtr[] outputNames, DisposableList`1 cleanupList)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs, IReadOnlyCollection`1 outputNames, RunOptions options)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs, IReadOnlyCollection`1 outputNames)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Run(IReadOnlyCollection`1 inputs)
   at StableDiffusion.ML.OnnxRuntime.TextProcessing.TextEncoder(Int32[] tokenizedInput, StableDiffusionConfig config) in C:\Source\StableDiffusion-DML\StableDiffusion.ML.OnnxRuntime\TextProcessing.cs:line 85
   at StableDiffusion.ML.OnnxRuntime.TextProcessing.PreprocessText(String prompt, StableDiffusionConfig config) in C:\Source\StableDiffusion-DML\StableDiffusion.ML.OnnxRuntime\TextProcessing.cs:line 12
   at StableDiffusion.ML.OnnxRuntime.UNet.Inference(String prompt, StableDiffusionConfig config) in C:\Source\StableDiffusion-DML\StableDiffusion.ML.OnnxRuntime\UNet.cs:line 74
   at StableDiffusion.Program.Main(String[] args) in C:\Source\StableDiffusion-DML\StableDiffusion\Program.cs:line 37
@zhanweiw zhanweiw changed the title The branch 'direct-ML-EP' config should be changed to 'ExecutionProvider.DirectML'. The branch 'direct-ML-EP' config should be changed to use 'ExecutionProvider.DirectML'. Sep 3, 2023
@zhanweiw zhanweiw changed the title The branch 'direct-ML-EP' config should be changed to use 'ExecutionProvider.DirectML'. Non-zero status code returned while running Trilu node. Sep 3, 2023
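
For context, the original issue title pointed at a candidate fix: the configuration on the direct-ML-EP branch should target the DirectML execution provider. A minimal sketch of what that change might look like (the property and enum member names here are assumptions inferred from the title, not confirmed against the repo):

```csharp
// Sketch only: StableDiffusionConfig's actual member names may differ.
var config = new StableDiffusionConfig
{
    // Select DirectML instead of the default provider (e.g. CUDA or CPU).
    ExecutionProviderTarget = StableDiffusionConfig.ExecutionProvider.DirectML,
};
```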
@cassiebreviu
Owner

This looks like a version issue with support for that operator. What versions of ONNX Runtime and DirectML are you using?

@songshizhao

Using the same NuGet packages in your app project as in the .dll (class library) project solved this for me.

@dionfoster

I am also receiving this error message, running on Windows 11.

> use same nuget pkg in your prj like .dll prj solved this.

What does this mean? Here are the versions of the package references for the solution:

  • Microsoft.ML - 2.0.1
  • Microsoft.ML.OnnxRuntime.DirectML - 1.14.1
  • Microsoft.ML.OnnxRuntime.Managed - 1.14.1

@songshizhao

> What does this mean? Here are the versions of the package references for the solution:

Your app project references a .dll project [like this]. Add the package references listed above to your app project as well, even if they don't seem necessary there.
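
In other words, when the app project references the class library project, the app's own .csproj should also carry the same ONNX Runtime package references, so the native DirectML binaries end up in the app's output directory. A sketch of the app project file, using the version numbers listed above (the project path is illustrative, taken from the stack trace):

```xml
<!-- App .csproj: duplicate the library's package references here,
     in addition to the project reference to the .dll project. -->
<ItemGroup>
  <ProjectReference Include="..\StableDiffusion.ML.OnnxRuntime\StableDiffusion.ML.OnnxRuntime.csproj" />
  <PackageReference Include="Microsoft.ML" Version="2.0.1" />
  <PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.14.1" />
  <PackageReference Include="Microsoft.ML.OnnxRuntime.Managed" Version="1.14.1" />
</ItemGroup>
```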

Development

No branches or pull requests

4 participants