# High-dimensional $p$-norms

Affiliations:

1. LSTA - Laboratoire de Statistique Théorique et Appliquée
2. LPMA - Laboratoire de Probabilités et Modèles Aléatoires
3. DMA - Département de Mathématiques et Applications
4. CLASSIC - Computational Learning, Aggregation, Supervised Statistical Inference, and Classification (DMA - Département de Mathématiques et Applications, ENS Paris - École normale supérieure - Paris, Inria Paris-Rocquencourt)
5. Applied Economics and Statistics

Abstract: Let $\mathbf{X} = (X_1, \ldots, X_d)$ be an $\mathbb{R}^d$-valued random vector with i.i.d. components, and let $\Vert\mathbf{X}\Vert_p = \left(\sum_{j=1}^d |X_j|^p\right)^{1/p}$ be its $p$-norm, for $p > 0$. Letting $d$ tend to infinity has surprising consequences for $\Vert\mathbf{X}\Vert_p$, which may dramatically affect high-dimensional data processing. This effect is usually referred to as the *distance concentration phenomenon* in the computational learning literature. Despite growing interest in this important question, previous work has essentially characterized the problem through numerical experiments and incomplete mathematical statements. In the present paper, we solidify some of the arguments that previously appeared in the literature and offer new insights into the phenomenon.
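The distance concentration phenomenon can be illustrated with a minimal simulation (not from the paper): for i.i.d. components, the relative spread of $\Vert\mathbf{X}\Vert_p$ across random draws shrinks as $d$ grows, so in high dimension all points sit at nearly the same norm. The choice of uniform components, $p = 2$, and the sample size below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 2.0  # illustrative choice; any p > 0 exhibits the same effect

def relative_spread(d, n=2000):
    """Ratio of std to mean of the p-norm over n random vectors in R^d."""
    # n i.i.d. vectors with Uniform(0, 1) components
    X = rng.uniform(size=(n, d))
    norms = (np.abs(X) ** p).sum(axis=1) ** (1.0 / p)
    return norms.std() / norms.mean()

# The relative spread decreases roughly like d^{-1/2}
for d in (10, 100, 10000):
    print(d, relative_spread(d))
```

As $d$ increases, the printed ratio collapses toward zero: distances between random points become nearly indistinguishable, which is the effect the abstract warns can affect high-dimensional data processing.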

Keywords: $p$-norms, high dimension, asymptotic statistics

Authors: Gérard Biau, David Mason

Source: https://hal.archives-ouvertes.fr/