MethodError on length(::DiffResults.ImmutableDiffResult{1, Float64, Tuple{Float64}}) #265

Open
penelopeysm opened this issue Oct 25, 2024 · 0 comments

penelopeysm commented Oct 25, 2024

Hello, I'm here to report another possible bug found upstream in Turing. In the following example, f differentiates just fine, but g doesn't.

import ReverseDiff

# Pulling the scalar out with x[] and broadcasting exp: differentiates fine.
f(x) = exp.(x[])
f([1.0])
ReverseDiff.gradient(f, [1.0])

# Broadcasting exp over a 0-dimensional reshape of the input: works on a
# plain Vector, but ReverseDiff.gradient throws the MethodError below.
g(x) = exp.(reshape(vec(x), ()))
g([1.0])
ReverseDiff.gradient(g, [1.0])

Edit: h(x) = exp.(fill(x[], ())) works fine too, but j(x) = exp.(reshape(x, ())) doesn't.
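Written out as a runnable snippet, those two variants are (the works/fails behaviour is exactly as described above; nothing beyond the reported calls is assumed):

import ReverseDiff

# 0-dimensional array built with fill: differentiates fine.
h(x) = exp.(fill(x[], ()))
ReverseDiff.gradient(h, [1.0])

# 0-dimensional reshape of the tracked array itself: fails with the same
# MethodError as g.
j(x) = exp.(reshape(x, ()))
ReverseDiff.gradient(j, [1.0])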

Stack trace

julia> ReverseDiff.gradient(g, [1.0])
ERROR: MethodError: no method matching length(::DiffResults.ImmutableDiffResult{1, Float64, Tuple{Float64}})
The function `length` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  length(::Cmd)
   @ Base process.jl:716
  length(::Base.MethodSpecializations)
   @ Base reflection.jl:1317
  length(::Core.SimpleVector)
   @ Base essentials.jl:933
  ...

Stacktrace:
  [1] _similar_shape(itr::DiffResults.ImmutableDiffResult{1, Float64, Tuple{Float64}}, ::Base.HasLength)
    @ Base ./array.jl:652
  [2] _collect(cont::UnitRange{…}, itr::DiffResults.ImmutableDiffResult{…}, ::Base.HasEltype, isz::Base.HasLength)
    @ Base ./array.jl:711
  [3] collect(itr::DiffResults.ImmutableDiffResult{1, Float64, Tuple{Float64}})
    @ Base ./array.jl:705
  [4] broadcastable(x::DiffResults.ImmutableDiffResult{1, Float64, Tuple{Float64}})
    @ Base.Broadcast ./broadcast.jl:707
  [5] broadcasted
    @ ./broadcast.jl:1318 [inlined]
  [6] broadcast(f::ReverseDiff.ForwardOptimize{…}, x::ReverseDiff.TrackedArray{…})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/elementwise.jl:237
  [7] broadcast
    @ ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/elementwise.jl:198 [inlined]
  [8] _materialize
    @ ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/broadcast.jl:265 [inlined]
  [9] materialize
    @ ~/.julia/packages/ReverseDiff/p1MzG/src/derivatives/broadcast.jl:273 [inlined]
 [10] g(x::ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}})
    @ Main ./REPL[6]:1
 [11] ReverseDiff.GradientTape(f::typeof(g), input::Vector{…}, cfg::ReverseDiff.GradientConfig{…})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/tape.jl:199
 [12] gradient(f::Function, input::Vector{Float64}, cfg::ReverseDiff.GradientConfig{ReverseDiff.TrackedArray{…}})
    @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/gradients.jl:22
 [13] top-level scope
    @ REPL[8]:1
Some type information was truncated. Use `show(err)` to see complete types.
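Judging from frames [3]–[4], the broadcast machinery ends up holding a DiffResults.ImmutableDiffResult rather than a number or array, so it falls through to Base's generic fallback Base.Broadcast.broadcastable(x) = collect(x), and collect in turn needs length, which ImmutableDiffResult doesn't define. A minimal sketch of that failure mode, assuming nothing about ReverseDiff internals and using a hypothetical placeholder type:

# Hypothetical stand-in for a type (like ImmutableDiffResult) that implements
# neither the AbstractArray/Number interface nor `length`.
struct NoLength end

# Base's generic fallback `broadcastable(x) = collect(x)` assumes
# `Base.HasLength()` and therefore calls `length(x)`, producing the same
# kind of MethodError as in the trace above.
Base.Broadcast.broadcastable(NoLength())
# ERROR: MethodError: no method matching length(::NoLength)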

Version info

(ppl) pkg> st
Status `~/ppl/Project.toml`
  [37e2e3b7] ReverseDiff v1.15.3

julia> versioninfo()
Julia Version 1.11.1
Commit 8f5b7ca12ad (2024-10-16 10:53 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: macOS (arm64-apple-darwin22.4.0)
  CPU: 10 × Apple M1 Pro
  WORD_SIZE: 64
  LLVM: libLLVM-16.0.6 (ORCJIT, apple-m1)
Threads: 1 default, 0 interactive, 1 GC (on 8 virtual cores)