Information theory, inference, and learning algorithms / David J. C. MacKay

Main author: MacKay, David J. C., 1967-, Author
Document type: E-book
Language: English
Publisher: Cambridge University Press, Cambridge, 2003
ISBN: 9780521642989
MSC subject: 94-01, Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory
62F10, Statistics - Parametric inference, Point estimation
62H30, Statistics - Multivariate analysis, Classification and discrimination; cluster analysis; mixture models
68P30, Computer science - Theory of data, Coding and information theory
Online: on the author's website
No physical items for this record

This textbook is geared toward upper-level undergraduate and graduate students in engineering, science, mathematics, and computer science. Prerequisites are a good working knowledge of calculus, probability theory, and linear algebra. This is not just any textbook on information theory. Besides the usual topics covered in a conventional course on information theory (Shannon theory, practical solutions to communication problems, etc.), it brings in Bayesian data modelling, Monte Carlo methods, variational methods, clustering algorithms, and neural networks. The author takes the view that information theory and machine learning belong together, and he convinces the reader of this by continually showing the connections between seemingly unrelated topics. As Bob McEliece is quoted: “An instant classic. ... You'll want two copies of this astonishing book, one for the office and one for the fireside at home.” (Zentralblatt)
