From: "Nick Cox" <[email protected]>
To: <[email protected]>
Subject: st: RE: Searching for Kullback-Leiber divergence
Date: Sun, 9 May 2010 17:15:24 +0100
I suspect that you will need to program it yourself.
The correct spelling is Leibler, not Leiber, but even running -findit-
with correct author names finds nothing. (Mind you, it doesn't find
-multgof-, either.)
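For what it's worth, here is a very rough sketch of one way to start: simulate two samples, put both onto a common set of bins, and sum p*ln(p/q) over the bins. The variable names, the seed, and the choice of 20 equal-width bins are purely illustrative, and crude binning is no substitute for a proper density-based estimate.

* Rough sketch only: discretised Kullback-Leibler divergence
* between two samples, using 20 equal-width bins over the pooled range.
clear
set obs 1000
set seed 12345
generate x1 = rnormal(0, 1)      // illustrative "comparison" sample
generate x2 = rnormal(0.5, 1.2)  // illustrative "reference" sample

* pooled range of the two samples
summarize x1, meanonly
local lo = r(min)
local hi = r(max)
summarize x2, meanonly
local lo = min(`lo', r(min))
local hi = max(`hi', r(max))

* assign each observation to one of `k' common bins
local k = 20
local w = (`hi' - `lo') / `k'
generate byte bin1 = min(floor((x1 - `lo') / `w') + 1, `k')
generate byte bin2 = min(floor((x2 - `lo') / `w') + 1, `k')

* sum p * ln(p/q) over bins with positive mass in both samples
tempname KL
scalar `KL' = 0
forvalues j = 1/`k' {
    quietly count if bin1 == `j'
    local p = r(N) / _N
    quietly count if bin2 == `j'
    local q = r(N) / _N
    if `p' > 0 & `q' > 0 {
        scalar `KL' = `KL' + `p' * ln(`p' / `q')
    }
}
display as text "discretised KL divergence: " as result `KL'

With real data you would substitute your own variables for x1 and x2, and a kernel-density-based estimate would usually be preferable to equal-width bins.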
Nick
[email protected]
Michael C. Morrison
Tirthankar Chakravarty advised that I look into -multigof- for the
Kullback-Leiber divergence. Thanks for the response, but -multigof- is
not what I'm looking for.
The Kullback-Leiber divergence is sometimes referred to as 'relative
entropy' or 'cross entropy'. The measure I need summarizes the effect of
location and shape changes on the overall relative distribution of two
continuous distributions. It has a simple interpretation in terms of the
relative distribution and is decomposable into location, shape, and
other components.
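To be concrete, writing F and F0 for the comparison and reference distributions, f and f0 for their densities, and g for the relative density (notation only for this message), the quantity I mean is

\[
D(F \,\|\, F_0)
  = \int f(y)\,\ln\frac{f(y)}{f_0(y)}\,dy
  = \int_0^1 g(r)\,\ln g(r)\,dr,
\qquad
g(r) = \frac{f\bigl(F_0^{-1}(r)\bigr)}{f_0\bigl(F_0^{-1}(r)\bigr)},
\]

so the divergence is just the negative of the entropy of the relative density, which is where the interpretation in terms of the relative distribution comes from.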
I have -reldist-. It does a great job of plotting relative and
cumulative PDFs, location/shape shifts, and polarization coefficients,
but it doesn't provide a measure of the overall distributional
difference between two distributions. That's where the Kullback-Leiber
divergence comes to the rescue; its advantage is that it is
decomposable.
Hope this clarifies what I'm searching for.