E-book, English, 538 pages
Linear Models and the Relevant Distributions and Matrix Algebra
David Harville
Series: Chapman & Hall/CRC Texts in Statistical Science
Publication year: 2018
ISBN: 978-1-351-26466-2
Publisher: Taylor & Francis
Format: EPUB
Copy protection: Adobe DRM
Linear Models and the Relevant Distributions and Matrix Algebra provides in-depth and detailed coverage of the use of linear statistical models as a basis for parametric and predictive inference. It can be a valuable reference, a primary or secondary text in a graduate-level course on linear models, or a resource used (in a course on mathematical statistics) to illustrate various theoretical concepts in the context of a relatively complex setting of great practical importance.
Features:
- Provides extensive, relatively self-contained coverage of matrix algebra, presented in a meaningful context
- Provides thorough coverage of the relevant statistical distributions, including spherically and elliptically symmetric distributions
- Includes extensive coverage of multiple-comparison procedures (and of simultaneous confidence intervals), including procedures for controlling the k-FWER and the FDR
- Provides thorough coverage (complete with detailed and highly accessible proofs) of results on the properties of various linear-model procedures, including those of least squares estimators and those of the F test
- Features the use of real data sets for illustrative purposes
- Includes many exercises
David Harville served for 10 years as a mathematical statistician in the Applied Mathematics Research Laboratory of the Aerospace Research Laboratories at Wright-Patterson AFB, Ohio, 20 years as a full professor in Iowa State University’s Department of Statistics, where he now has emeritus status, and 7 years as a research staff member of the Mathematical Sciences Department of IBM’s T.J. Watson Research Center. He has considerable relevant experience, having taught M.S.- and Ph.D.-level courses in linear models, served as thesis advisor to 10 Ph.D. graduates, and authored or co-authored two books and more than 80 research articles. His work has been recognized through his election as a Fellow of the American Statistical Association and of the Institute of Mathematical Statistics and as a member of the International Statistical Institute.
Further Information & Material
Preface
1 Introduction
Linear Statistical Models
Regression Models
Classificatory Models
Hierarchical Models and Random-Effects Models
Statistical Inference
An Overview
2 Matrix Algebra: A Primer
The Basics
Partitioned Matrices and Vectors
Trace of a (Square) Matrix
Linear Spaces
Inverse Matrices
Ranks and Inverses of Partitioned Matrices
Orthogonal Matrices
Idempotent Matrices
Linear Systems
Generalized Inverses
Linear Systems Revisited
Projection Matrices
Quadratic Forms
Determinants
Exercises
Bibliographic and Supplementary Notes
3 Random Vectors and Matrices
Expected Values
Variances, Covariances, and Correlations
Standardized Version of a Random Variable
Conditional Expected Values and Conditional Variances and Covariances
Multivariate Normal Distribution
Exercises
Bibliographic and Supplementary Notes
4 The General Linear Model
Some Basic Types of Linear Models
Some Specific Types of Gauss-Markov Models (With Examples)
Regression
Heteroscedastic and Correlated Residual Effects
Multivariate Data
Exercises
Bibliographic and Supplementary Notes
5 Estimation and Prediction: Classical Approach
Linearity and Unbiasedness
Translation Equivariance
Estimability
The Method of Least Squares
Best Linear Unbiased or Translation-Equivariant Estimation of Estimable Functions (Under the G-M Model)
Simultaneous Estimation
Estimation of Variability and Covariability
Best (Minimum-Variance) Unbiased Estimation
Likelihood-Based Methods
Prediction
Exercises
Bibliographic and Supplementary Notes
6 Some Relevant Distributions and Their Properties
Chi-Square, Gamma, Beta, and Dirichlet Distributions
Noncentral Chi-Square Distribution
Central and Noncentral F Distributions
Central, Noncentral, and Multivariate t Distributions
Moment Generating Function of the Distribution of One or More Quadratic Forms or Second-Degree Polynomials (in a Normally Distributed Random Vector)
Distribution of Quadratic Forms or Second-Degree Polynomials (in a Normally Distributed Random Vector): Chi-Squareness
The Spectral Decomposition, With Application to the Distribution of Quadratic Forms
More on the Distribution of Quadratic Forms or Second-Degree Polynomials (in a Normally Distributed Random Vector)
Exercises
Bibliographic and Supplementary Notes
7 Confidence Intervals (or Sets) and Tests of Hypotheses
"Setting the Stage": Response Surfaces in the Context of a Specific Application and
in General
Augmented G-M Model
The F Test (and Corresponding Confidence Set) and the S Method
Some Optimality Properties
One-Sided t Tests and the Corresponding Confidence Bounds
The Residual Variance: Confidence Intervals and Tests
Multiple Comparisons and Simultaneous Confidence Intervals: Some Enhancements
Prediction
Exercises
Bibliographic and Supplementary Notes
References
Index