
Estimation of entropies and divergences via nearest neighbours

In: Tatra Mountains Mathematical Publications, vol. 39, no. 1
Nikolai Leonenko, Luc Pronzato, Vippal Savani
Details:
Year, pages: 2008, 265–273
About the article:
We extend the results in [L. F. Kozachenko, N. N. Leonenko: {\em On statistical estimation of entropy of random vector}, Problems Inform. Transmission \textbf{23} (1987), 95–101; translated from Problemy Peredachi Informatsii 23 (1987), 9–16 (in Russian)] and [M. N. Goria, N. N. Leonenko, V. V. Mergel, P. L. Novi Inverardi: {\em A new class of random vector entropy estimators and its applications in testing statistical hypotheses}, J. Nonparametr. Statist. \textbf{17} (2005), 277–297] and show how $k$th nearest-neighbor distances in a sample of $N$ i.i.d. vectors distributed with the probability density $f$ can be used to estimate consistently R
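To illustrate the kind of estimator the abstract refers to, here is a rough one-dimensional sketch in the spirit of the Kozachenko–Leonenko $k$th nearest-neighbour entropy estimator, $\hat H = \psi(N) - \psi(k) + \ln V_d + \frac{d}{N}\sum_i \ln \varepsilon_i$, where $\varepsilon_i$ is the distance from sample point $i$ to its $k$th nearest neighbour and $V_d$ is the volume of the unit ball. This is an assumed, simplified reading for $d = 1$ with a brute-force neighbour search, not the paper's exact construction:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    # Digamma at a positive integer: psi(n) = -gamma + sum_{j=1}^{n-1} 1/j
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, n))

def knn_entropy_1d(sample, k=1):
    # Kozachenko-Leonenko-style Shannon entropy estimate for a 1-D sample.
    # Assumes all sample values are distinct (epsilon_i > 0).
    n = len(sample)
    log_vd = math.log(2.0)  # volume of the unit ball in R^1 is 2
    log_eps_sum = 0.0
    for i, x in enumerate(sample):
        # Brute-force kth nearest-neighbour distance (O(N^2) overall)
        dists = sorted(abs(x - y) for j, y in enumerate(sample) if j != i)
        log_eps_sum += math.log(dists[k - 1])
    return digamma_int(n) - digamma_int(k) + log_vd + log_eps_sum / n

random.seed(0)
xs = [random.uniform(0.0, 1.0) for _ in range(500)]
# The uniform density on [0, 1] has differential entropy 0, so the
# estimate should be close to 0 for a moderate sample size.
print(knn_entropy_1d(xs, k=3))
```

Larger $k$ trades variance for bias; the paper's interest is precisely in how such $k$th-neighbour statistics yield consistent estimators beyond the $k = 1$ case.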
How to cite:
ISO 690:
Leonenko, N., Pronzato, L., Savani, V. 2008. Estimation of entropies and divergences via nearest neighbours. In Tatra Mountains Mathematical Publications, vol. 39, no. 1, pp. 265–273. ISSN 1210-3195.

APA:
Leonenko, N., Pronzato, L., Savani, V. (2008). Estimation of entropies and divergences via nearest neighbours. Tatra Mountains Mathematical Publications, 39(1), 265–273. ISSN 1210-3195.