Estimation and testing under sparsity : École d'été de probabilités de Saint-Flour XLV - 2015 / Sara van de Geer

Corporate main author: École d'été de probabilités de Saint-Flour, 45, 2015, Author
Co-author: Geer, Sara A. van de, 1958-, Author
Document type: Conference proceedings
Series: Lecture notes in mathematics, 2159
Language: English.
Country: Switzerland.
Publisher: Cham : Springer, 2016
Description: 1 vol. (XIII-274 p.) : fig. ; 24 cm
ISBN: 9783319327730.
ISSN: 0075-8434.
Bibliography: pp. 267-269. Index.
MSC subjects: 60-02, Research exposition (monographs, survey articles) pertaining to probability theory
60F05, Limit theorems in probability theory, Central limit and other weak theorems
60F17, Limit theorems in probability theory, Functional limit theorems; invariance principles
62J07, Statistics - Linear inference, regression, Ridge regression; shrinkage estimators (Lasso)
62J12, Statistics - Linear inference, regression, Generalized linear models (logistic models)
Online: SpringerLink - abstract | zbMath | MSN
Holdings
Item type: Congrès
Current library: CMI, Salle 1
Call number: Ecole STF
Status: Available
Barcode: 12441-01


The book deals with models of high-dimensional data, that is, models in which the number of parameters to be estimated is larger than the number of observations available for estimation. Such models are now very important: thanks to significant technological advances, large volumes of observations can be, and often are, recorded (through the internet, cameras, smartphones, etc.). In addition, the parameter set may be sparse, meaning that the number of truly relevant parameters is smaller than the number of observations, although it is not known beforehand how many there are. An important technique for parameter estimation in such high-dimensional models is the Lasso method. The book uses this method as the starting point and the basis for understanding the other methods presented and discussed, such as those inducing structured sparsity or low rank, or those based on more general loss functions. The book provides several examples and illustrations of the methods presented, and each of its 17 chapters ends with a problem section. Thus it can be used as a textbook for students, mainly at the postgraduate level. (zbMath)
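For reference, a minimal sketch of the Lasso estimator in the linear regression model Y = X\beta^0 + \varepsilon (the notation below follows common conventions and is not necessarily the book's):

\hat{\beta} := \arg\min_{\beta \in \mathbb{R}^p} \left\{ \frac{1}{n} \| Y - X\beta \|_2^2 + 2\lambda \| \beta \|_1 \right\},

where Y is the n-vector of responses, X is the n \times p design matrix with p possibly much larger than n, and \lambda > 0 is a tuning parameter. The \ell_1-penalty shrinks many coefficients to exactly zero, which is how the estimator induces sparsity in high-dimensional settings.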
