Kullback-Leibler (KL) Divergence
Often in machine learning, we are interested in quantifying the difference between the actual and the predicted probability distributions. We can do this with KL divergence, which calculates a score measuring how much one probability distribution diverges from another.
The KL divergence between two distributions Q and P is often stated using the following notation:

KL(P || Q)

where the "||" operator indicates "divergence", i.e., P's divergence from Q.
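Concretely, for discrete distributions, the score is the sum over all events x of the probability of x under P times the log-ratio of the two distributions:

KL(P || Q) = Σₓ P(x) · log( P(x) / Q(x) )

Using the natural log gives the result in nats; log base 2 gives bits. The divergence is 0 only when P and Q agree everywhere, and note that it is not symmetric: KL(P || Q) generally differs from KL(Q || P).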
Code Implementation
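A minimal sketch in Python of the discrete formula above, cross-checked against SciPy's element-wise relative entropy. The distributions `p` and `q` are illustrative values chosen here for the example, not from any particular dataset:

```python
import numpy as np
from scipy.special import rel_entr

# Two example discrete distributions over the same three events
# (illustrative values; any pair of distributions summing to 1 works).
p = np.array([0.10, 0.40, 0.50])  # "actual" distribution P
q = np.array([0.80, 0.15, 0.05])  # "predicted" distribution Q

def kl_divergence(p, q):
    """KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x)), in nats."""
    return np.sum(p * np.log(p / q))

print(f"KL(P || Q) = {kl_divergence(p, q):.4f} nats")
print(f"KL(Q || P) = {kl_divergence(q, p):.4f} nats")  # asymmetric: differs from above

# Cross-check: rel_entr computes P(x) * log(P(x) / Q(x)) element-wise
print(f"SciPy check: {np.sum(rel_entr(p, q)):.4f} nats")
```

This sketch assumes q is strictly positive wherever p is; if Q assigns zero probability to an event that P does not, the divergence is infinite, so production code typically clips or smooths the distributions first.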