Automatic Differentiation
 

◆ grad() [5/5]

inline void stan::math::grad(var v,
                             Eigen::Matrix<var, Eigen::Dynamic, 1>& x,
                             Eigen::VectorXd& g)

Propagates the chain rule to calculate gradients, starting from the specified variable.

Resizes the gradient vector g to match the size of x.

The grad() function does not itself recover any memory. Use recover_memory() or recover_memory_nested() to recover memory.

Parameters
    [in]   v   Value of function being differentiated
    [in]   x   Variables being differentiated with respect to
    [out]  g   Gradient, d/dx v, evaluated at x.

Definition at line 24 of file grad.hpp.
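A minimal usage sketch, assuming the Stan Math umbrella header <stan/math.hpp> and the recover_memory() call mentioned above; the test function f and the sample values are illustrative only:

#include <stan/math.hpp>
#include <Eigen/Dense>
#include <iostream>

int main() {
  using stan::math::var;

  // Independent variables x = (1.5, 2.5), allocated on the autodiff stack.
  Eigen::Matrix<var, Eigen::Dynamic, 1> x(2);
  x << 1.5, 2.5;

  // Function being differentiated: f(x) = x0^2 + 3 * x1.
  var f = x(0) * x(0) + 3 * x(1);

  // Propagate the chain rule from f; g is resized to x.size().
  Eigen::VectorXd g;
  stan::math::grad(f, x, g);

  std::cout << "df/dx0 = " << g(0) << "\n";  // 2 * x0 = 3
  std::cout << "df/dx1 = " << g(1) << "\n";  // 3

  // grad() does not free the autodiff memory; recover it explicitly.
  stan::math::recover_memory();
}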