Can't use gemm! methods with Metal #423
Comments
Metal.jl only implements well-known array interfaces. For batched matmul, Metal.jl only provides the low-level API.
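As a hedged sketch of the distinction above (assuming a recent Metal.jl where `MtlArray` participates in the generic `LinearAlgebra` interfaces): the high-level path dispatches to GPU kernels, while BLAS-style calls that take raw host pointers do not.

```julia
using Metal, LinearAlgebra

A = MtlArray(rand(Float32, 4, 4))
B = MtlArray(rand(Float32, 4, 4))
C = similar(A)

# The generic interface dispatches to Metal's GPU-backed methods:
mul!(C, A, B)

# By contrast, code that reaches for the BLAS-style gemm! obtains an
# MtlPtr{Float32} from the device array, which cannot be converted to
# the host Ptr{Float32} that CPU BLAS expects.
```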
Oh, I see. I was expecting that all Metal-related specialized methods would be grouped in this package. What's worse, I've encountered some inconsistent behavior: my small transformer network sometimes produces NaNs in its outputs and sometimes runs just fine. I've tracked the problem down to the same operation. I'm not sure that I'll continue trying to make the Metal backend work for me; I'll probably stick to CUDA for now, but let me know if I can help you with some information.
That's probably #381. I think we can close this issue then? It's probably worth opening an issue on NeuralAttentionlib.jl for Metal.jl support, either in the form of a package extension there that calls the low-level API.
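Such a package extension might look roughly like this (module name, file layout, and the extended function are all illustrative, not the actual NeuralAttentionlib.jl structure):

```julia
# ext/NeuralAttentionlibMetalExt.jl -- hypothetical extension module,
# loaded automatically when both packages are in the environment
# (requires a matching [weakdeps]/[extensions] entry in Project.toml).
module NeuralAttentionlibMetalExt

using NeuralAttentionlib, Metal

# Sketch: add MtlArray methods that route batched matmul to Metal's
# kernels instead of the CPU BLAS gemm! path. The function extended
# here is a placeholder, not a confirmed NeuralAttentionlib API:
# NeuralAttentionlib.matmul(a::MtlArray, b::MtlArray) = ...

end
```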
I'm trying to run some small transformer models on my Mac and I'm getting an error that it's not possible to convert `::Metal.MtlPtr{Float32}` to `::Ptr{Float32}`, which happens in the `gemm!` operation. Here is the full log from the minimal example.
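A sketch along these lines likely reproduces the conversion error (assuming the failure comes from handing device arrays to CPU BLAS; this is not the poster's actual minimal example):

```julia
using Metal, LinearAlgebra

A = MtlArray(rand(Float32, 8, 8))
B = MtlArray(rand(Float32, 8, 8))
C = MtlArray(zeros(Float32, 8, 8))

# CPU BLAS expects host pointers (Ptr{Float32}); taking a pointer to a
# device-resident MtlArray yields an MtlPtr{Float32} instead, so this
# call should raise the conversion error described above.
BLAS.gemm!('N', 'N', 1.0f0, A, B, 0.0f0, C)
```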