log_softmax.hpp
#ifndef STAN_MATH_OPENCL_REV_LOG_SOFTMAX_HPP
#define STAN_MATH_OPENCL_REV_LOG_SOFTMAX_HPP
#ifdef STAN_OPENCL

#include <stan/math/opencl/kernel_generator.hpp>
#include <stan/math/opencl/prim/log_softmax.hpp>
#include <stan/math/rev/core.hpp>

namespace stan {
namespace math {

/**
 * Returns log_softmax of the given argument (reverse-mode
 * specialization for OpenCL kernel generator expressions).
 *
 * @tparam T type of the argument
 * @param A argument
 * @return log_softmax of the argument
 */
template <typename T,
          require_all_kernel_expressions_and_none_scalar_t<T>* = nullptr>
inline var_value<matrix_cl<double>> log_softmax(const var_value<T>& A) {
  return make_callback_var(
      log_softmax(A.val()), [A](vari_value<matrix_cl<double>>& res) mutable {
        // Reverse pass: accumulate the log-softmax adjoint into A.
        A.adj() += res.adj() - sum(res.adj()) * exp(res.val());
      });
}

}  // namespace math
}  // namespace stan

#endif
#endif
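The adjoint update in the callback follows from the log-softmax Jacobian. A short derivation of why it is correct (standard reverse-mode algebra, not part of the original file):

\[
y_i = x_i - \log\sum_k e^{x_k}
\quad\Rightarrow\quad
\frac{\partial y_i}{\partial x_j} = \delta_{ij} - \operatorname{softmax}(x)_j,
\]
\[
\bar{x}_j = \sum_i \bar{y}_i \left(\delta_{ij} - \operatorname{softmax}(x)_j\right)
          = \bar{y}_j - e^{y_j} \sum_i \bar{y}_i .
\]

Since exp(res.val()) equals softmax(A.val()), this is exactly A.adj() += res.adj() - sum(res.adj()) * exp(res.val()), evaluated entirely as OpenCL kernel generator expressions on the device.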
matrix_cl
Represents an arithmetic matrix on the OpenCL device.
Definition matrix_cl.hpp:47
var_value<plain_type_t<T>> make_callback_var(T&& value, F&& functor)
Creates a new var initialized with a callback_vari with a given value and reverse-pass callback functor.
auto log_softmax(const T& x)
Return the log softmax of the specified vector or container of vectors.
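For orientation, a minimal usage sketch of how this overload is reached from user code. It assumes a build with STAN_OPENCL defined and a configured OpenCL device; to_matrix_cl, from_matrix_cl, sum, grad, and recover_memory are existing Stan Math APIs, but the snippet itself is illustrative and not part of this header.

#include <stan/math.hpp>
#include <iostream>

int main() {
  Eigen::VectorXd x_host(3);
  x_host << 1.0, 2.0, 3.0;

  // Copy the input to the device and wrap it as an autodiff variable.
  stan::math::var_value<stan::math::matrix_cl<double>> x
      = stan::math::to_matrix_cl(x_host);

  // Dispatches to the OpenCL reverse-mode overload in this header.
  stan::math::var_value<stan::math::matrix_cl<double>> y
      = stan::math::log_softmax(x);

  // Reduce to a scalar target and run the reverse pass.
  stan::math::var lp = stan::math::sum(y);
  lp.grad();

  // Bring the accumulated adjoints back to the host and print them.
  Eigen::MatrixXd x_adj = stan::math::from_matrix_cl(x.adj());
  std::cout << x_adj << std::endl;

  stan::math::recover_memory();
  return 0;
}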