gradient.check: how does it work? #176
-
Hi folks! It's mostly in the title: I was trying to consult the documentation about what is going on when `gradient.check` reports a problem. As a follow-up question, is it inherently problematic to have a `reltol` lower than machine precision? Here are the model details:
Replies: 1 comment 8 replies
-
If the gradient is not zero (within some tolerance) for some parameters, the model has not properly converged. This can happen for a variety of reasons, for example a poor model fit in one way or another. With many parameters it tends to be difficult to get the gradient below the tolerance for all of them, and whether that is really problematic (in my experience) depends on the number and type of parameters it applies to. You can check which parameters are not playing nice by inspecting `model$TMBfn$gr()`, using the names from the `model$TMBfn$par` object. `reltol` values lower than machine accuracy, as indicated in your post, will not achieve anything.
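To make that suggestion concrete: the idea is to evaluate the gradient at the fitted parameters and flag, by name, the entries that exceed the tolerance. Below is a minimal sketch in Python rather than R, and it is not gllvm/TMB code: the two-rate Poisson model, the parameter names, and the tolerance are all made up for illustration. The closing lines also illustrate why a relative tolerance below machine epsilon cannot be met.

```python
import sys

# Hypothetical stand-in for a fitted model: two independent Poisson rates,
# so the parameter vector has named entries (like model$TMBfn$par would).
data = {"lambda1": [2, 4, 3, 5], "lambda2": [1, 0, 2, 1]}

# The MLE of a Poisson rate is the sample mean; "lambda2" is perturbed
# slightly to mimic an optimiser that stopped short of convergence.
estimates = {
    "lambda1": sum(data["lambda1"]) / len(data["lambda1"]),
    "lambda2": sum(data["lambda2"]) / len(data["lambda2"]) + 0.05,
}

def gradient(name):
    """d/d(lambda) of the Poisson log-likelihood: sum(x)/lambda - n."""
    x, lam = data[name], estimates[name]
    return sum(x) / lam - len(x)

# The check itself: report parameters whose gradient exceeds the tolerance.
tol = 1e-2
flagged = {name: g for name in estimates if abs(g := gradient(name)) > tol}
print(flagged)  # only "lambda2" exceeds the tolerance

# Relatedly, a relative tolerance below machine epsilon cannot be met:
# floating point cannot even represent relative changes that small.
print(1.0 + sys.float_info.epsilon / 2 == 1.0)  # True
```

The same pattern applies in R: pair the gradient vector with the parameter names and look at which entries stand out, rather than judging convergence from the overall tolerance alone.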
gradient.check

`gradient.check` uses the gradient, i.e. the vector of first derivatives of the likelihood, to check whether the model has converged. At the maximum the likelihood surface is flat, so the gradient should be zero.
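That flatness can be shown numerically. The sketch below is plain Python, not package code, and the Poisson likelihood is just a convenient example: a finite-difference derivative of the log-likelihood is approximately zero at the maximiser and clearly non-zero away from it.

```python
import math

def loglik(lam, x=(2, 4, 3, 5)):
    """Poisson log-likelihood up to an additive constant."""
    return sum(x) * math.log(lam) - len(x) * lam

def deriv(f, t, h=1e-6):
    """Central finite-difference approximation of f'(t)."""
    return (f(t + h) - f(t - h)) / (2 * h)

mle = sum((2, 4, 3, 5)) / 4      # the sample mean maximises the likelihood
print(deriv(loglik, mle))        # ~0: the surface is flat at the maximum
print(deriv(loglik, mle + 0.5))  # about -0.5: not converged at this point
```

A gradient check does exactly this comparison for every parameter at once, against some tolerance.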