Multivariate Analysis

Multivariate analysis deals with observations on more than one variable where there is some inherent interdependence between the variables. Most available books on the subject concentrate on either the theoretical or the data-analytic approach. This book not only combines these two approaches but also emphasizes modern developments, so, although primarily designed as a textbook for final-year undergraduates and postgraduate students in mathematics and statistics, certain sections will commend themselves to research workers.

Broadly speaking, the first half of the book contains direct extensions of univariate ideas and techniques, including exploratory data analysis, distribution theory and problems of inference. The remaining chapters concentrate on specifically multivariate problems which have no meaningful analogues in the univariate case. Topics covered include econometrics, principal component analysis, factor analysis, canonical correlation analysis, discriminant analysis, cluster analysis, multidimensional scaling and directional data.

Several new methods of presentation are used: for example, the data matrix is emphasized throughout, a density-free approach is given to normal theory, tests are constructed using both the likelihood ratio principle and the union intersection principle, and graphical methods are used in explanation.

The reader is assumed to have a basic knowledge of mathematical statistics at undergraduate level, together with an elementary understanding of linear algebra. There are, however, appendices which provide a sufficient background of matrix algebra, a summary of univariate statistics and some statistical tables.