Statistical learning theory and stochastic optimization : Ecole d'été de probabilités de Saint-Flour XXXI-2001 / Olivier Catoni ; Jean Picard

Corporate main author: École d'été de probabilités de Saint-Flour, 31, 2001 (Author)
Co-author: Catoni, Olivier, 1965- (Author)
Secondary author: Picard, Jean, 1959- (Scientific editor)
Document type: Conference proceedings
Series: Lecture notes in mathematics, 1851
Language: English. Country: Germany.
Publisher: Berlin : Springer, 2004
Description: 1 vol. (VIII-272 p.) : ill. ; 24 cm
ISBN: 9783540225720. ISSN: 0075-8434.
Bibliography: p. 261-265. Index.
MSC subjects:
62B10, Statistics - Sufficiency and information, Statistical aspects of information-theoretic topics
62G05, Statistics - Nonparametric inference, Nonparametric estimation
62G07, Statistics - Nonparametric inference, Density estimation
62H30, Statistics - Multivariate analysis, Classification and discrimination; cluster analysis; mixture models
90C15, Mathematical programming, Stochastic programming
Online: Springerlink
Holdings

Item type: Conference proceedings
Current library: CMI, Salle 1
Call number: Ecole STF
Status: Available
Barcode: 07804-01

This volume is the second part of the École d'été de probabilités de Saint-Flour XXXI-2001. The first part was published as volume 1837 of the same series.


Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results. (Source: back cover)
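
The back-cover text names entropy and Gibbs measures without displaying them. As an illustration only, a minimal sketch of the standard objects behind this vocabulary (the notation π, ρ, r, R, n is assumed here and the McAllester-type inequality is a generic example of a PAC-Bayesian bound, not a formula quoted from Catoni's text):

    % Illustrative sketch, not taken from the book: Gibbs posterior and a
    % McAllester-type PAC-Bayesian bound for a loss bounded by 1.
    \[
      \rho_{\beta}(d\theta)
      = \frac{\exp\{-\beta\, r(\theta)\}\, \pi(d\theta)}
             {\int_{\Theta} \exp\{-\beta\, r(\theta')\}\, \pi(d\theta')},
    \]
    % \pi : prior on the parameter set \Theta, r(\theta) : empirical risk on an
    % n-sample, \beta > 0 : inverse temperature.
    \[
      \int R \, d\rho \;\le\; \int r \, d\rho
      + \sqrt{\frac{\operatorname{KL}(\rho \,\|\, \pi) + \log(2\sqrt{n}/\epsilon)}{2n}},
    \]
    % with probability at least 1-\epsilon, simultaneously for all posteriors \rho;
    % R : true risk, KL : Kullback-Leibler divergence (the "entropy" of the blurb).

The book's own non-asymptotic bounds are sharper and more general; the display above only fixes the shape of the objects the blurb alludes to.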
