mushi.loss_functions.dkl
- dkl(E, X)
Generalized Kullback-Leibler divergence, a Bregman divergence (ignores constant term)
\[\sum_{i,j} \left(X_{i, j} \log \frac{X_{i, j}}{\mathbb{E}[X_{i, j}]} - X_{i, j} + \mathbb{E}[X_{i, j}] \right)\]
- Parameters
  - E (ndarray) – expectation \(\mathbb{E}[\mathbf X]\)
  - X (ndarray) – data \(\mathbf X\)
- Return type
  float64
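
For intuition, a minimal NumPy/SciPy sketch of the formula above (`dkl_sketch` is a hypothetical name, not the library's implementation, which may differ, e.g., in how zero entries of X are handled):

```python
import numpy as np
from scipy.special import xlogy  # xlogy(x, y) = x * log(y), defined as 0 where x == 0

def dkl_sketch(E: np.ndarray, X: np.ndarray) -> np.float64:
    """Sketch of generalized Kullback-Leibler divergence of data X from expectation E."""
    # xlogy zeroes the X * log(X / E) term wherever X == 0, matching the
    # convention 0 * log 0 = 0 for count data
    return np.sum(xlogy(X, X / E) - X + E)

# Example usage: the divergence is nonnegative, and 0 iff X == E elementwise
X = np.array([[3.0, 0.0], [2.0, 5.0]])
E = np.array([[2.5, 0.5], [2.0, 4.0]])
print(dkl_sketch(E, X))
```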