Linear Regression

'Lecture Notes in Statistics'. Softcover reprint of the original 1st ed. 2003. Book. Language: English.
Paperback
Book

€ 176.49

incl. VAT
Free shipping
Available immediately

Product details

Title: Linear Regression
Author: Jürgen Groß

ISBN: 3540401784
EAN: 9783540401780
'Lecture Notes in Statistics'.
Softcover reprint of the original 1st ed. 2003.
Book.
Language: English.
Springer Berlin Heidelberg

25 July 2003 - Paperback - 412 pages

Description

The book covers the basic theory of linear regression models and presents a comprehensive survey of different estimation techniques as alternatives and complements to least squares estimation. The relationship between different estimators is clearly described and categories of estimators are worked out in detail. Proofs are given for the most relevant results, and the presented methods are illustrated with the help of numerical examples and graphics. Special emphasis is placed on practicability, and possible applications are discussed. The book is rounded off by an introduction to the basics of decision theory and an appendix on matrix algebra.
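The least squares estimation the description refers to can be sketched in a few lines. The following is a minimal illustration in Python/NumPy (the book's own example analysis, in Appendix C, uses R); the data and all variable names here are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n observations, k regressors plus an intercept column.
n, k = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Ordinary least squares estimator, beta_hat = (X'X)^{-1} X'y,
# computed via a least squares solver for numerical stability.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # close to beta_true
```

With only mild noise, the estimate recovers the true coefficients closely; the book's early chapters develop exactly when and in what sense this estimator is optimal (Gauss-Markov theorem) and when it becomes unreliable (collinearity).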

Table of Contents

Part I: Point Estimation and Linear Regression

1 Fundamentals
  1.1 Linear Models: 1.1.1 Application of Linear Models; 1.1.2 Types of Linear Models; 1.1.3 Proceeding with Linear Models; 1.1.4 A Preliminary Example
  1.2 Decision Theory and Point Estimation: 1.2.1 Decision Rule; 1.2.2 Non-operational Decision Rule; 1.2.3 Loss and Risk; 1.2.4 Choosing a Decision Rule; 1.2.5 Admissibility; 1.2.6 Squared Error Loss; 1.2.7 Matrix Valued Squared Error Loss; 1.2.8 Alternative Loss Functions
  1.3 Problems

2 The Linear Regression Model
  2.1 Assumptions
  2.2 Ordinary Least Squares Estimation: 2.2.1 The Principle of Least Squares; 2.2.2 Coefficient of Determination R²; 2.2.3 Predictive Loss; 2.2.4 Least Squares Variance Estimator; 2.2.5 Properties of the Ordinary Least Squares Estimator; 2.2.6 Properties Under Normality
  2.3 Optimality of Least Squares Estimation: 2.3.1 Linear Unbiased Estimation; 2.3.2 Gauss-Markov Theorem; 2.3.3 Normality Assumption; 2.3.4 Admissibility
  2.4 Unreliability of Least Squares Estimation: 2.4.1 Estimation of the Covariance Matrix; 2.4.2 Unbiased Versus Biased Estimation; 2.4.3 Collinearity; 2.4.4 Consistency; 2.4.5 Biased Estimation
  2.5 Inadmissibility of the Ordinary Least Squares Estimator: 2.5.1 The Reparameterized Regression Model; 2.5.2 Risk Comparison of Least Squares and Stein Estimator; 2.5.3 An Example for Stein Estimation; 2.5.4 Admissibility
  2.6 Problems

Part II: Alternatives to Least Squares Estimation

3 Alternative Estimators
  3.1 Restricted Least Squares Estimation: 3.1.1 The Principle of Restricted Least Squares; 3.1.2 The Parameter Space; 3.1.3 Properties of Restricted Least Squares Estimator; 3.1.4 Risk Comparison of Restricted and Ordinary Least Squares Estimator; 3.1.5 Pretest Estimation
  3.2 Other Types of Restriction: 3.2.1 Stochastic Linear Restrictions; 3.2.2 Inequality Restrictions; 3.2.3 Elliptical Restrictions
  3.3 Principal Components Estimator: 3.3.1 Preliminary Considerations; 3.3.2 Properties of the Principal Components Estimator; 3.3.3 Drawbacks of the Principal Components Estimator; 3.3.4 The Marquardt Estimator
  3.4 Ridge Estimator: 3.4.1 Preliminary Considerations; 3.4.2 Properties of the Linear Ridge Estimator; 3.4.3 The Choice of the Ridge Parameter; 3.4.4 Standardization; 3.4.5 Ridge and Restricted Least Squares Estimator; 3.4.6 Ridge and Principal Components Estimator; 3.4.7 Jackknife Modified Ridge Estimator; 3.4.8 Iteration Estimator; 3.4.9 An Example for Ridge Estimation
  3.5 Shrinkage Estimator: 3.5.1 Preliminary Considerations; 3.5.2 Risk Comparison to Ordinary Least Squares; 3.5.3 The Choice of the Shrinkage Parameter; 3.5.4 Direction Modified Shrinkage Estimators
  3.6 General Ridge Estimator: 3.6.1 A Class of Estimators; 3.6.2 Risk Comparison of General Ridge and Ordinary Least Squares Estimator
  3.7 Linear Minimax Estimator: 3.7.1 Preliminary Considerations; 3.7.2 Inequality Restrictions; 3.7.3 Linear Minimax Solutions; 3.7.4 Alternative Approaches; 3.7.5 Admissibility
  3.8 Linear Bayes Estimator: 3.8.1 Preliminary Considerations; 3.8.2 Characterization of Linear Bayes Estimators; 3.8.3 Non-Operational Bayes Solutions; 3.8.4 A-priori Assumptions
  3.9 Robust Estimator: 3.9.1 Preliminary Considerations; 3.9.2 Weighted Least Squares Estimation; 3.9.3 The l1 Estimator; 3.9.4 M Estimator; 3.9.5 Robust Ridge Estimator
  3.10 Problems

4 Linear Admissibility
  4.1 Preliminary Considerations
  4.2 Linear Admissibility in the Non-Restricted Model: 4.2.1 Linear Admissibility in the Simple Mean Shift Model; 4.2.2 Characterization of Linearly Admissible Estimators; 4.2.3 Ordinary Least Squares and Linearly Admissible Estimator; 4.2.4 Linear Transforms of Ordinary Least Squares Estimator; 4.2.5 Linear Admissibility of Known Estimators; 4.2.6 Shrinkage Property and Linear Admissibility; 4.2.7 Convex Combination of Estimators; 4.2.8 Linear Bayes Estimator
  4.3 Linear Admissibility Under Linear Restrictions: 4.3.1 The Assumption of a Full Rank Restriction Matrix; 4.3.2 Restricted Estimator; 4.3.3 Characterization of Linearly Admissible Estimators
  4.4 Linear Admissibility Under Elliptical Restrictions: 4.4.1 Characterization of Linearly Admissible Estimators; 4.4.2 Linear Admissibility of Certain Linear Estimators; 4.4.3 Admissible Improvements Over Ordinary Least Squares
  4.5 Problems

Part III: Miscellaneous Topics

5 The Covariance Matrix of the Error Vector
  5.1 Estimation of the Error Variance: 5.1.1 The Sample Variance; 5.1.2 Nonnegative Unbiased Estimation; 5.1.3 Optimality of the Least Squares Variance Estimator; 5.1.4 Non-Admissibility of the Least Squares Variance Estimator
  5.2 Non-Scalar Covariance Matrix: 5.2.1 Preliminary Considerations; 5.2.2 The Transformed Model; 5.2.3 Two-Stage Estimation
  5.3 Occurrence of Non-Scalar Covariance Matrices: 5.3.1 Seemingly Unrelated Regression; 5.3.2 Heteroscedastic Errors; 5.3.3 Equicorrelated Errors; 5.3.4 Autocorrelated Errors
  5.4 Singular Covariance Matrices
  5.5 Equality of Ordinary and Generalized Least Squares
  5.6 Problems

6 Regression Diagnostics
  6.1 Selecting Independent Variables: 6.1.1 Mallows' Cp; 6.1.2 Stepwise Regression; 6.1.3 Alternative Criteria
  6.2 Assessing Goodness of Fit
  6.3 Diagnosing Collinearity: 6.3.1 Variance Inflation Factors; 6.3.2 Scaled Condition Indexes
  6.4 Inspecting Residuals: 6.4.1 Normal Quantile Plot; 6.4.2 Residuals Versus Fitted Values Plot; 6.4.3 Further Residual Plots
  6.5 Finding Influential Observations: 6.5.1 Leverage; 6.5.2 Influential Observations; 6.5.3 Collinearity-Influential Observations
  6.6 Testing Model Assumptions: 6.6.1 Preliminary Considerations; 6.6.2 Testing for Heteroscedasticity; 6.6.3 Testing for Autocorrelation; 6.6.4 Testing for Non-Normality; 6.6.5 Testing for Non-Linearity; 6.6.6 Testing for Outlier
  6.7 Problems

A Matrix Algebra
  A.1 Preliminaries: A.1.1 Matrices and Vectors; A.1.2 Elementary Operations; A.1.3 Rank of a Matrix; A.1.4 Subspaces and Matrices; A.1.5 Partitioned Matrices; A.1.6 Kronecker Product; A.1.7 Moore-Penrose Inverse
  A.2 Common Pitfalls
  A.3 Square Matrices: A.3.1 Specific Square Matrices; A.3.2 Trace and Determinant; A.3.3 Eigenvalue and Eigenvector; A.3.4 Vector and Matrix Norm; A.3.5 Definiteness
  A.4 Symmetric Matrix: A.4.1 Eigenvalues; A.4.2 Spectral Decomposition; A.4.3 Rayleigh Ratio; A.4.4 Definiteness
  A.5 Löwner Partial Ordering

B Stochastic Vectors
  B.1 Expectation and Covariance
  B.2 Multivariate Normal Distribution
  B.3 χ² Distribution
  B.4 F Distribution

C An Example Analysis with R
  C.1 Problem and Goal
  C.2 The Data
  C.3 The Choice of Variables: C.3.1 The Full Model; C.3.2 Stepwise Regression; C.3.3 Collinearity Diagnostics
  C.4 Further Diagnostics: C.4.1 Residuals; C.4.2 Influential Observations
  C.5 Prediction

References
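To give a flavor of the alternatives to least squares surveyed in Part II, the linear ridge estimator of Section 3.4, beta_hat(k) = (X'X + kI)^(-1) X'y, can be sketched as follows. This is illustrative Python/NumPy code with made-up data, not code from the book:

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Linear ridge estimator: solves (X'X + k I) beta = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(1)
# Two nearly collinear regressors, the situation where ridge helps most.
x1 = rng.normal(size=50)
X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=50)])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=50)

ols = ridge_estimator(X, y, 0.0)    # k = 0 reduces to ordinary least squares
ridge = ridge_estimator(X, y, 1.0)  # k > 0 shrinks the estimate
print(np.linalg.norm(ols), np.linalg.norm(ridge))
```

For any k > 0 the ridge estimate has a smaller Euclidean norm than the least squares estimate; the choice of the ridge parameter k and the resulting risk trade-offs are treated in detail in Sections 3.4.2 and 3.4.3.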