Mathematical Foundations of Infinite-Dimensional Statistical Models
In nonparametric and high-dimensional statistical models, the classical Gauss–Fisher–Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed over the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models (hypothesis testing, estimation and confidence sets) is presented within the minimax paradigm of decision theory. This includes not only the basic theory of convolution kernel and projection estimation but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. A final chapter develops the theory of adaptive inference in nonparametric models, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
- Describes the theory of statistical inference in statistical models with an infinite-dimensional parameter space
- Develops a mathematically coherent and objective approach to statistical inference
- Much of the material arises from courses taught by the authors at beginning and advanced graduate levels; each section ends with exercises
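As a purely illustrative aside (not taken from the book), the sketch below shows the kind of convolution kernel estimator treated in the chapter on linear nonparametric estimators: a Gaussian kernel density estimator in Python. The Silverman rule-of-thumb bandwidth is an assumed default for the example only; the book's treatment of bandwidth selection (e.g. Lepski's method) is considerably more refined.

```python
import numpy as np

def gaussian_kde(x_eval, data, bandwidth=None):
    """Convolution kernel density estimate with a Gaussian kernel.

    If no bandwidth is given, Silverman's rule of thumb is used
    (an illustrative default, not a prescription from the book).
    """
    data = np.asarray(data, dtype=float)
    n = data.size
    if bandwidth is None:
        bandwidth = 1.06 * data.std(ddof=1) * n ** (-1 / 5)
    # f_hat(x) = (1 / (n h)) * sum_i K((x - X_i) / h), with K the standard normal pdf
    u = (np.asarray(x_eval, dtype=float)[:, None] - data[None, :]) / bandwidth
    kernel_vals = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return kernel_vals.mean(axis=1) / bandwidth

# Example: estimate the density of a standard normal sample on a grid.
rng = np.random.default_rng(0)
sample = rng.standard_normal(500)
grid = np.linspace(-4, 4, 201)
density = gaussian_kde(grid, sample)
```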
Product details
June 2021, Paperback
ISBN: 9781108994132
704 pages
251 × 176 × 36 mm
1.27 kg
Table of Contents
- Preface
- 1. Nonparametric statistical models
- 2. Gaussian processes
- 3. Empirical processes
- 4. Function spaces and approximation theory
- 5. Linear nonparametric estimators
- 6. The minimax paradigm
- 7. Likelihood-based procedures
- 8. Adaptive inference
- References
- Author Index
- Index