Compute the element-wise Kullback-Leibler Divergence between two probability distributions, predictions and targets.
The Kullback-Leibler Divergence is a measure of how one probability distribution diverges from a second, expected probability distribution. For discrete probability distributions P and Q, the KL divergence is defined as:

$$D_{KL}(P \,\|\, Q) = \sum_{i} P(i) \log\frac{P(i)}{Q(i)}$$
In this problem, you will compute the element-wise KL divergence before the summation step. That is, for each element:

$$\text{output}[i] = \text{targets}[i] \cdot \log\frac{\text{targets}[i]}{\text{predictions}[i]}$$
Note that when targets[i] is 0, the contribution to the KL divergence is 0 (by convention, using the limit $\lim_{x \to 0^{+}} x \log x = 0$).
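For instance, with hypothetical values targets[i] = 0.5 and predictions[i] = 0.25, and taking log to be the natural logarithm:

$$\text{output}[i] = 0.5 \cdot \log\frac{0.5}{0.25} = 0.5 \cdot \log 2 \approx 0.3466,$$

while any element with targets[i] = 0 contributes 0 regardless of predictions[i].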
Input:
- predictions of size N, representing a probability distribution Q (all values > 0 and sum to 1)
- targets of size N, representing a probability distribution P (all values ≥ 0 and sum to 1)

Output:
- output of size N, where output[i] contains the element-wise KL divergence contribution

Your solution must handle the case when targets[i] is 0 correctly (the contribution should be 0).
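As a rough illustration, a minimal CUDA kernel for this computation could look like the sketch below. It assumes float data, one thread per element, and hypothetical names (kl_div_elementwise, n, d_predictions, d_targets, d_output) that are not part of the problem's required interface.

```cuda
#include <cuda_runtime.h>
#include <math.h>

// One thread per element: output[i] = targets[i] * log(targets[i] / predictions[i]),
// with a zero target contributing 0 (lim x->0+ of x*log x = 0).
__global__ void kl_div_elementwise(const float* predictions,
                                   const float* targets,
                                   float* output,
                                   int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float p = targets[i];      // P(i), may be 0
        float q = predictions[i];  // Q(i), guaranteed > 0
        output[i] = (p > 0.0f) ? p * logf(p / q) : 0.0f;
    }
}

// Example launch (host side), assuming device pointers and element count n
// are already allocated and populated:
//   int threads = 256;
//   int blocks  = (n + threads - 1) / threads;
//   kl_div_elementwise<<<blocks, threads>>>(d_predictions, d_targets, d_output, n);
```

Guarding the zero-target case explicitly avoids evaluating log(0), which would otherwise produce -inf and propagate NaN through the multiplication.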
Solutions are written in a CUDA C++ environment.