
You can reach us by phone call or text message at the mobile numbers below:


09117307688
09117179751

If there is no answer, contact support by text message.

Unlimited access: for registered users

Money-back guarantee: if the description does not match the book

Support: from 7 AM to 10 PM

Download the book Using Multivariate Statistics (5th Edition)

Book details

Category: Mathematical statistics
Edition: 5
Authors:
Series:
ISBN: 0205459382, 9780205459384
Publisher:
Publication year: 2006
Number of pages: 1008
Language: English
File format: PDF (converted to PDF, EPUB, or AZW3 at the user's request)
File size: 41 MB

Book price (toman): 42,000



Keywords related to the book Using Multivariate Statistics (5th Edition): Mathematics, Probability theory and mathematical statistics, Mathematical statistics, Applied mathematical statistics





If you would like the book Using Multivariate Statistics (5th Edition) converted to PDF, EPUB, AZW3, MOBI, or DJVU, let the support team know and they will convert the file for you.

Please note that Using Multivariate Statistics (5th Edition) is the original-language (English) edition, not a Persian translation. The International Library website offers only original-language books and does not provide books translated into or written in Persian.


About the book Using Multivariate Statistics (5th Edition)

A guide to statistical techniques: using the book; Review of univariate and bivariate statistics; Cleaning up your act: screening data prior to analysis; Multiple regression; Analysis of covariance; Multivariate analysis of variance and covariance; Profile analysis: the multivariate approach to repeated measures; Discriminant analysis; Logistic regression; Survival/failure analysis; Canonical correlation; Principal components and factor analysis; Structural equation modeling; Multilevel modeling; Multiway frequency analysis; Overview of the general linear model; Time-series analysis (available online at www.ablongman.com/achnick5e); A skimpy introduction to matrix algebra; Research designs for complete examples; Statistical tables.
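The book itself walks through these techniques using SPSS, SAS, and SYSTAT output. Purely as an illustration of the first technique chapter (multiple regression), here is a minimal NumPy sketch with made-up data; it is not taken from the book and only shows the basic least-squares computation and the squared multiple correlation.

```python
# Minimal ordinary-least-squares multiple regression with NumPy.
# Illustrative only; the book carries out its examples in SPSS, SAS, and SYSTAT.
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: 100 cases, 3 IVs, one DV built from a known linear model plus noise.
n = 100
X = rng.normal(size=(n, 3))
y = 2.0 + 1.5 * X[:, 0] - 0.7 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)

# Add an intercept column and solve the least-squares problem X_design @ b ~ y.
X_design = np.column_stack([np.ones(n), X])
b, *_ = np.linalg.lstsq(X_design, y, rcond=None)

# Squared multiple correlation (R^2): proportion of DV variance explained by the IVs.
y_hat = X_design @ b
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

print("Intercept and regression coefficients:", np.round(b, 3))
print("R^2:", round(r_squared, 3))
```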



Table of contents

1. Introduction.  1.1 Multivariate Statistics: Why?  1.1.1 The Domain of Multivariate Statistics: Numbers of IVs and DVs.  1.1.2 Experimental and Nonexperimental Research.  1.1.3 Computers and Multivariate Statistics.  1.1.4 Words of Caution.  1.2 Some Useful Definitions.  1.2.1 Continuous, Discrete, and Dichotomous Data.  1.2.2 Samples and Populations.  1.2.3 Descriptive and Inferential Statistics.  1.2.4 Orthogonality: Standard and Sequential Analyses.  1.3 Linear Combinations of Variables.  1.4 Number and Nature of Variables to Include.  1.5 Statistical Power.  1.6 Data Appropriate for Multivariate Statistics.  1.6.1 The Data Matrix.  1.6.2 The Correlation Matrix.  1.6.3 The Variance-Covariance Matrix.  1.6.4 The Sum-of-Squares and Cross-Products Matrix.  1.6.5 Residuals.  1.7 Organization of the Book.

2. A Guide to Statistical Techniques: Using the Book.  2.1 Research Questions and Associated Techniques.  2.1.1 Degree of Relationship among Variables.  2.1.1.1 Bivariate r.  2.1.1.2 Multiple R.  2.1.1.3 Sequential R.  2.1.1.4 Canonical R.  2.1.1.5 Multiway Frequency Analysis.  2.1.1.6 Multilevel Modeling.  2.1.2 Significance of Group Differences.  2.1.2.1 One-Way ANOVA and t Test.  2.1.2.2 One-Way ANCOVA.  2.1.2.3 Factorial ANOVA.  2.1.2.4 Factorial ANCOVA.  2.1.2.5 Hotelling's T².  2.1.2.6 One-Way MANOVA.  2.1.2.7 One-Way MANCOVA.  2.1.2.8 Factorial MANOVA.  2.1.2.9 Factorial MANCOVA.  2.1.2.10 Profile Analysis of Repeated Measures.  2.1.3 Prediction of Group Membership.  2.1.3.1 One-Way Discriminant.  2.1.3.2 Sequential One-Way Discriminant.  2.1.3.3 Multiway Frequency Analysis (Logit).  2.1.3.4 Logistic Regression.  2.1.3.5 Sequential Logistic Regression.  2.1.3.6 Factorial Discriminant Analysis.  2.1.3.7 Sequential Factorial Discriminant Analysis.  2.1.4 Structure.  2.1.4.1 Principal Components.  2.1.4.2 Factor Analysis.  2.1.4.3 Structural Equation Modeling.  2.1.5 Time Course of Events.  2.1.5.1 Survival/Failure Analysis.  2.1.5.2 Time-Series Analysis.  2.2 Some Further Comparisons.  2.3 A Decision Tree.  2.4 Technique Chapters.  2.5 Preliminary Check of the Data.

3. Review of Univariate and Bivariate Statistics.  3.1 Hypothesis Testing.  3.1.1 One-Sample z Test as Prototype.  3.1.2 Power.  3.1.3 Extensions of the Model.  3.1.4 Controversy Surrounding Significance Testing.  3.2 Analysis of Variance.  3.2.1 One-Way Between-Subjects ANOVA.  3.2.2 Factorial Between-Subjects ANOVA.  3.2.3 Within-Subjects ANOVA.  3.2.4 Mixed Between-Within-Subjects ANOVA.  3.2.5 Design Complexity.  3.2.5.1 Nesting.  3.2.5.2 Latin-Square Designs.  3.2.5.3 Unequal n and Nonorthogonality.  3.2.5.4 Fixed and Random Effects.  3.2.6 Specific Comparisons.  3.2.6.1 Weighting Coefficients for Comparisons.  3.2.6.2 Orthogonality of Weighting Coefficients.  3.2.6.3 Obtained F for Comparisons.  3.2.6.4 Critical F for Planned Comparisons.  3.2.6.5 Critical F for Post Hoc Comparisons.  3.3 Parameter Estimation.  3.4 Effect Size.  3.5 Bivariate Statistics: Correlation and Regression.  3.5.1 Correlation.  3.5.2 Regression.  3.6 Chi-Square Analysis.

4. Cleaning Up Your Act: Screening Data Prior to Analysis.  4.1 Important Issues in Data Screening.  4.1.1 Accuracy of Data File.  4.1.2 Honest Correlations.  4.1.2.1 Inflated Correlation.  4.1.2.2 Deflated Correlation.  4.1.3 Missing Data.  4.1.3.1 Deleting Cases or Variables.  4.1.3.2 Estimating Missing Data.  4.1.3.3 Using a Missing Data Correlation Matrix.  4.1.3.4 Treating Missing Data as Data.  4.1.3.5 Repeating Analyses with and Without Missing Data.  4.1.3.6 Choosing among Methods for Dealing with Missing Data.  4.1.4 Outliers.  4.1.4.1 Detecting Univariate and Multivariate Outliers.  4.1.4.2 Describing Outliers.  4.1.4.3 Reducing the Influence of Outliers.  4.1.4.4 Outliers in a Solution.  4.1.5 Normality, Linearity, and Homoscedasticity.  4.1.5.1 Normality.  4.1.5.2 Linearity.  4.1.5.3 Homoscedasticity, Homogeneity of Variance, and Homogeneity of Variance-Covariance Matrices.  4.1.6 Common Data Transformations.  4.1.7 Multicollinearity and Singularity.  4.1.8 A Checklist and Some Practical Recommendations.  4.2 Complete Examples of Data Screening.  4.2.1 Screening Ungrouped Data.  4.2.1.1 Accuracy of Input, Missing Data, Distributions, and Univariate Outliers.  4.2.1.2 Linearity and Homoscedasticity.  4.2.1.3 Transformation.  4.2.1.4 Detecting Multivariate Outliers.  4.2.1.5 Variables Causing Cases to Be Outliers.  4.2.1.6 Multicollinearity.  4.2.2 Screening Grouped Data.  4.2.2.1 Accuracy of Input, Missing Data, Distributions, Homogeneity of Variance, and Univariate Outliers.  4.2.2.2 Linearity.  4.2.2.3 Multivariate Outliers.  4.2.2.4 Variables Causing Cases to Be Outliers.  4.2.2.5 Multicollinearity.

5. Multiple Regression.  5.1 General Purpose and Description.  5.2 Kinds of Research Questions.  5.2.1 Degree of Relationship.  5.2.2 Importance of IVs.  5.2.3 Adding IVs.  5.2.4 Changing IVs.  5.2.5 Contingencies among IVs.  5.2.6 Comparing Sets of IVs.  5.2.7 Predicting DV Scores for Members of a New Sample.  5.2.8 Parameter Estimates.  5.3 Limitations to Regression Analyses.  5.3.1 Theoretical Issues.  5.3.2 Practical Issues.  5.3.2.1 Ratio of Cases to IVs.  5.3.2.2 Absence of Outliers among the IVs and on the DV.  5.3.2.3 Absence of Multicollinearity and Singularity.  5.3.2.4 Normality, Linearity, Homoscedasticity of Residuals.  5.3.2.5 Independence of Errors.  5.3.2.6 Outliers in the Solution.  5.4 Fundamental Equations for Multiple Regression.  5.4.1 General Linear Equations.  5.4.2 Matrix Equations.  5.4.3 Computer Analyses of Small-Sample Example.  5.5 Major Types of Multiple Regression.  5.5.1 Standard Multiple Regression.  5.5.2 Sequential Multiple Regression.  5.5.3 Statistical (Stepwise) Regression.  5.5.4 Choosing among Regression Strategies.  5.6 Some Important Issues.  5.6.1 Importance of IVs.  5.6.1.1 Standard Multiple Regression.  5.6.1.2 Sequential or Statistical Regression.  5.6.2 Statistical Inference.  5.6.2.1 Test for Multiple R.  5.6.2.2 Test of Regression Components.  5.6.2.3 Test of Added Subset of IVs.  5.6.2.4 Confidence Limits around B.  5.6.2.5 Comparing Two Sets of Predictors.  5.6.3 Adjustment of R².  5.6.4 Suppressor Variables.  5.6.5 Regression Approach to ANOVA.  5.6.6 Centering When Interactions and Powers of IVs Are Included.  5.6.7 Mediation in Causal Sequences.  5.7 Complete Examples of Regression Analysis.  5.7.1 Evaluation of Assumptions.  5.7.1.1 Ratio of Cases to IVs.  5.7.1.2 Normality, Linearity, Homoscedasticity, and Independence of Residuals.  5.7.1.3 Outliers.  5.7.1.4 Multicollinearity and Singularity.  5.7.2 Standard Multiple Regression.  5.7.3 Sequential Regression.  5.8 Comparison of Programs.  5.8.1 SPSS Package.  5.8.2 SAS System.  5.8.3 SYSTAT System.

6. Analysis of Covariance.  6.1 General Purpose and Description.  6.2 Kinds of Research Questions.  6.2.1 Main Effects of IVs.  6.2.2 Interactions among IVs.  6.2.3 Specific Comparisons and Trend Analysis.  6.2.4 Effects of Covariates.  6.2.5 Effect Size.  6.2.6 Parameter Estimates.  6.3 Limitations to Analysis of Covariance.  6.3.1 Theoretical Issues.  6.3.2 Practical Issues.  6.3.2.1 Unequal Sample Sizes, Missing Data, and Ratio of Cases to IVs.  6.3.2.2 Absence of Outliers.  6.3.2.3 Absence of Multicollinearity and Singularity.  6.3.2.4 Normality of Sampling Distributions.  6.3.2.5 Homogeneity of Variance.  6.3.2.6 Linearity.  6.3.2.7 Homogeneity of Regression.  6.3.2.8 Reliability of Covariates.  6.4 Fundamental Equations for Analysis of Covariance.  6.4.1 Sums of Squares and Cross Products.  6.4.2 Significance Test and Effect Size.  6.4.3 Computer Analyses of Small-Sample Example.  6.5 Some Important Issues.  6.5.1 Choosing Covariates.  6.5.2 Evaluation of Covariates.  6.5.3 Test for Homogeneity of Regression.  6.5.4 Design Complexity.  6.5.4.1 Within-Subjects and Mixed Within-Between Designs.  6.5.4.1.1 Same Covariate(s) for All Cells.  6.5.4.1.2 Varying Covariate(s) Over Cells.  6.5.4.2 Unequal Sample Sizes.  6.5.4.3 Specific Comparisons and Trend Analysis.  6.5.4.4 Effect Size.  6.5.5 Alternatives to ANCOVA.  6.6 Complete Example of Analysis of Covariance.  6.6.1 Evaluation of Assumptions.  6.6.1.1 Unequal n and Missing Data.  6.6.1.2 Normality.  6.6.1.3 Linearity.  6.6.1.4 Outliers.  6.6.1.5 Multicollinearity and Singularity.  6.6.1.6 Homogeneity of Variance.  6.6.1.7 Homogeneity of Regression.  6.6.1.8 Reliability of Covariates.  6.6.2 Analysis of Covariance.  6.6.2.1 Main Analysis.  6.6.2.2 Evaluation of Covariates.  6.6.2.3 Homogeneity of Regression Run.  6.7 Comparison of Programs.  6.7.1 SPSS Package.  6.7.2 SAS System.  6.7.3 SYSTAT System.

7. Multivariate Analysis of Variance and Covariance.  7.1 General Purpose and Description.  7.2 Kinds of Research Questions.  7.2.1 Main Effects of IVs.  7.2.2 Interactions among IVs.  7.2.3 Importance of DVs.  7.2.4 Parameter Estimates.  7.2.5 Specific Comparisons and Trend Analysis.  7.2.6 Effect Size.  7.2.7 Effects of Covariates.  7.3 Limitations to Multivariate Analysis of Variance and Covariance.  7.3.1 Theoretical Issues.  7.3.2 Practical Issues.  7.3.2.1 Unequal Sample Sizes, Missing Data, and Power.  7.3.2.2 Multivariate Normality.  7.3.2.3 Absence of Outliers.  7.3.2.4 Homogeneity of Variance-Covariance Matrices.  7.3.2.5 Linearity.  7.3.2.6 Homogeneity of Regression.  7.3.2.7 Reliability of Covariates.  7.3.2.8 Absence of Multicollinearity and Singularity.  7.4 Fundamental Equations for Multivariate Analysis of Variance and Covariance.  7.4.1 Multivariate Analysis of Variance.  7.4.2 Computer Analyses of Small-Sample Example.  7.4.3 Multivariate Analysis of Covariance.  7.5 Some Important Issues.  7.5.1 MANOVA vs. ANOVAs.  7.5.2 Criteria for Statistical Inference.  7.5.3 Assessing DVs.  7.5.3.1 Univariate F.  7.5.3.2 Roy-Bargmann Stepdown Analysis.  7.5.3.3 Using Discriminant Function Analysis.  7.5.3.4 Choosing among Strategies for Assessing DVs.  7.5.4 Specific Comparisons and Trend Analysis.  7.5.5 Design Complexity.  7.5.5.1 Within-Subjects and Between-Within Designs.  7.5.5.2 Unequal Sample Sizes.  7.6 Complete Examples of Multivariate Analysis of Variance and Covariance.  7.6.1 Evaluation of Assumptions.  7.6.1.1 Unequal Sample Sizes and Missing Data.  7.6.1.2 Multivariate Normality.  7.6.1.3 Linearity.  7.6.1.4 Outliers.  7.6.1.5 Homogeneity of Variance-Covariance Matrices.  7.6.1.6 Homogeneity of Regression.  7.6.1.7 Reliability of Covariates.  7.6.1.8 Multicollinearity and Singularity.  7.6.2 Multivariate Analysis of Variance.  7.6.3 Multivariate Analysis of Covariance.  7.7 Comparison of Programs.  7.7.1 SPSS Package.  7.7.2 SAS System.  7.7.3 SYSTAT System.

8. Profile Analysis: The Multivariate Approach to Repeated Measures.  8.1 General Purpose and Description.  8.2 Kinds of Research Questions.  8.2.1 Parallelism of Profiles.  8.2.2 Overall Difference among Groups.  8.2.3 Flatness of Profiles.  8.2.4 Contrasts Following Profile Analysis.  8.2.5 Parameter Estimates.  8.2.6 Effect Size.  8.3 Limitations to Profile Analysis.  8.3.1 Theoretical Issues.  8.3.2 Practical Issues.  8.3.2.1 Sample Size, Missing Data, and Power.  8.3.2.2 Multivariate Normality.  8.3.2.3 Absence of Outliers.  8.3.2.4 Homogeneity of Variance-Covariance Matrices.  8.3.2.5 Linearity.  8.3.2.6 Absence of Multicollinearity and Singularity.  8.4 Fundamental Equations for Profile Analysis.  8.4.1 Parallelism.  8.4.2 Flatness.  8.4.3 Computer Analyses of Small-Sample Example.  8.5 Some Important Issues.  8.5.1 Univariate vs. Multivariate Approach to Repeated Measures.  8.5.2 Contrasts in Profile Analysis.  8.5.2.1 Parallelism and Flatness Significant, Levels Not Significant (Simple-Effects Analysis).  8.5.2.2 Parallelism and Levels Significant, Flatness Not Significant (Simple-Effects Analysis).  8.5.2.3 Parallelism, Levels, and Flatness Significant (Interaction Contrasts).  8.5.2.4 Only Parallelism Significant.  8.5.3 Doubly-Multivariate Designs.  8.5.4 Classifying Profiles.  8.5.5 Imputation of Missing Values.  8.6 Complete Examples of Profile Analysis.  8.6.1 Profile Analysis of Subscales of the WISC.  8.6.1.1 Evaluation of Assumptions.  8.6.1.1.1 Unequal Sample Sizes and Missing Data.  8.6.1.1.2 Multivariate Normality.  8.6.1.1.3 Linearity.  8.6.1.1.4 Outliers.  8.6.1.1.5 Homogeneity of Variance-Covariance Matrices.  8.6.1.1.6 Multicollinearity and Singularity.  8.6.1.2 Profile Analysis.  8.6.2 Doubly-Multivariate Analysis of Reaction Time.  8.6.2.1 Evaluation of Assumptions.  8.6.2.1.1 Unequal Sample Sizes, Missing Data, Multivariate Normality, and Linearity.  8.6.2.1.2 Outliers.  8.6.2.1.3 Homogeneity of Variance-Covariance Matrices.  8.6.2.1.4 Homogeneity of Regression.  8.6.2.1.5 Reliability of DVs.  8.6.2.1.6 Multicollinearity and Singularity.  8.6.2.2 Doubly-Multivariate Analysis of Slope and Intercept.  8.7 Comparison of Programs.  8.7.1 SPSS Package.  8.7.2 SAS System.  8.7.3 SYSTAT System.

9. Discriminant Analysis.  9.1 General Purpose and Description.  9.2 Kinds of Research Questions.  9.2.1 Significance of Prediction.  9.2.2 Number of Significant Discriminant Functions.  9.2.3 Dimensions of Discrimination.  9.2.4 Classification Functions.  9.2.5 Adequacy of Classification.  9.2.6 Effect Size.  9.2.7 Importance of Predictor Variables.  9.2.8 Significance of Prediction with Covariates.  9.2.9 Estimation of Group Means.  9.3 Limitations to Discriminant Analysis.  9.3.1 Theoretical Issues.  9.3.2 Practical Issues.  9.3.2.1 Unequal Sample Sizes, Missing Data, and Power.  9.3.2.2 Multivariate Normality.  9.3.2.3 Absence of Outliers.  9.3.2.4 Homogeneity of Variance-Covariance Matrices.  9.3.2.5 Linearity.  9.3.2.6 Absence of Multicollinearity and Singularity.  9.4 Fundamental Equations for Discriminant Analysis.  9.4.1 Derivation and Test of Discriminant Functions.  9.4.2 Classification.  9.4.3 Computer Analyses of Small-Sample Example.  9.5 Types of Discriminant Analysis.  9.5.1 Direct Discriminant Analysis.  9.5.2 Sequential Discriminant Analysis.  9.5.3 Stepwise (Statistical) Discriminant Analysis.  9.6 Some Important Issues.  9.6.1 Statistical Inference.  9.6.1.1 Criteria for Overall Statistical Significance.  9.6.1.2 Stepping Methods.  9.6.2 Number of Discriminant Functions.  9.6.3 Interpreting Discriminant Functions.  9.6.3.1 Discriminant Function Plots.  9.6.3.2 Loading Matrices.  9.6.4 Evaluating Predictor Variables.  9.6.5 Effect Size.  9.6.6 Design Complexity: Factorial Designs.  9.6.7 Use of Classification Procedures.  9.6.7.1 Cross-Validation and New Cases.  9.6.7.2 Jackknifed Classification.  9.6.7.3 Evaluating Improvement in Classification.  9.7 Comparison of Programs.  9.7.1 SPSS Package.  9.7.2 SAS System.  9.7.3 SYSTAT System.

10. Logistic Regression.  10.1 General Purpose and Description.  10.2 Kinds of Research Questions.  10.2.1 Prediction of Group Membership or Outcome.  10.2.2 Importance of Predictors.  10.2.3 Interactions among Predictors.  10.2.4 Parameter Estimates.  10.2.5 Classification of Cases.  10.2.6 Significance of Prediction with Covariates.  10.2.7 Effect Size.  10.3 Limitations to Logistic Regression Analysis.  10.3.1 Theoretical Issues.  10.3.2 Practical Issues.  10.3.2.1 Ratio of Cases to Variables.  10.3.2.2 Adequacy of Expected Frequencies and Power.  10.3.2.3 Linearity in the Logit.  10.3.2.4 Absence of Multicollinearity.  10.3.2.5 Absence of Outliers in the Solution.  10.4 Fundamental Equations for Logistic Regression.  10.4.1 Testing and Interpreting Coefficients.  10.4.2 Goodness-of-Fit.  10.4.3 Comparing Models.  10.4.4 Interpretation and Analysis of Residuals.  10.4.5 Computer Analyses of Small-Sample Example.  10.5 Types of Logistic Regression.  10.5.1 Direct Logistic Regression.  10.5.2 Sequential Logistic Regression.  10.5.3 Statistical (Stepwise) Logistic Regression.  10.5.4 Probit and Other Analyses.  10.6 Some Important Issues.  10.6.1 Statistical Inference.  10.6.1.1 Assessing Goodness-of-Fit of Models.  10.6.1.1.1 Constant-Only vs. Full Model.  10.6.1.1.2 Comparison with a Perfect (Hypothetical) Model.  10.6.1.1.3 Deciles of Risk.  10.6.1.2 Tests of Individual Variables.  10.6.2 Effect Size for a Model.  10.6.3 Interpretation of Coefficients Using Odds.  10.6.4 Coding Outcome and Predictor Categories.  10.6.5 Number and Type of Outcome Categories.  10.6.6 Classification of Cases.  10.6.7 Hierarchical and Nonhierarchical Analysis.  10.6.8 Importance of Predictors.  10.6.9 Logistic Regression for Matched Groups.  10.7 Complete Examples of Logistic Regression.  10.7.1 Evaluation of Limitations.  10.7.1.1 Ratio of Cases to Variables and Missing Data.  10.7.1.2 Multicollinearity.  10.7.1.3 Outliers in the Solution.  10.7.2 Direct Logistic Regression with Two-Category Outcome and Continuous Predictors.  10.7.3 Sequential Logistic Regression with Three Categories of Outcome.  10.7.3.1 Limitations of Multinomial Logistic Regression.  10.7.3.1.1 Adequacy of Expected Frequencies.  10.7.3.1.2 Linearity in the Logit.  10.7.3.2 Sequential Multinomial Logistic Regression.  10.8 Comparison of Programs.  10.8.1 SPSS Package.  10.8.2 SAS System.  10.8.3 SYSTAT System.

11. Survival/Failure Analysis.  11.1 General Purpose and Description.  11.2 Kinds of Research Questions.  11.2.1 Proportions Surviving at Various Times.  11.2.2 Group Differences in Survival.  11.2.3 Survival Time with Covariates.  11.2.3.1 Treatment Effects.  11.2.3.2 Importance of Covariates.  11.2.3.3 Parameter Estimates.  11.2.3.4 Contingencies among Covariates.  11.2.3.5 Effect Size and Power.  11.3 Limitations to Survival Analysis.  11.3.1 Theoretical Issues.  11.3.2 Practical Issues.  11.3.2.1 Sample Size and Missing Data.  11.3.2.2 Normality of Sampling Distributions, Linearity, and Homoscedasticity.  11.3.2.3 Absence of Outliers.  11.3.2.4 Differences Between Withdrawn and Remaining Cases.  11.3.2.5 Change in Survival Conditions Over Time.  11.3.2.6 Proportionality of Hazards.  11.3.2.7 Absence of Multicollinearity.  11.4 Fundamental Equations for Survival Analysis.  11.4.1 Life Tables.  11.4.2 Standard Error of Cumulative Proportion Surviving.  11.4.3 Hazard and Density Functions.  11.4.4 Plot of Life Tables.  11.4.5 Test for Group Differences.  11.4.6 Computer Analyses of Small-Sample Example.  11.5 Types of Survival Analysis.  11.5.1 Actuarial and Product-Limit Life Tables and Survivor Functions.  11.5.2 Prediction of Group Survival Times from Covariates.  11.5.2.1 Direct, Sequential, and Statistical Analysis.  11.5.2.2 Cox Proportional-Hazards Model.  11.5.2.3 Accelerated Failure-Time Model.  11.5.2.4 Choosing a Method.  11.6 Some Important Issues.  11.6.1 Proportionality of Hazards.  11.6.2 Censored Data.  11.6.2.1 Right-Censored Data.  11.6.2.2 Other Forms of Censoring.  11.6.3 Effect Size and Power.  11.6.4 Statistical Criteria.  11.6.4.1 Test Statistics for Group Differences in Survival Functions.  11.6.4.2 Test Statistics for Prediction from Covariates.  11.6.5 Predicting Survival Rate.  11.6.5.1 Regression Coefficients (Parameter Estimates).  11.6.5.2 Odds Ratios.  11.6.5.3 Expected Survival Rates.  11.7 Complete Example of Survival Analysis.  11.7.1 Evaluation of Assumptions.  11.7.1.1 Accuracy of Input, Adequacy of Sample Size, Missing Data, and Distributions.  11.7.1.2 Outliers.  11.7.1.3 Differences Between Withdrawn and Remaining Cases.  11.7.1.4 Change in Survival Experience Over Time.  11.7.1.5 Proportionality of Hazards.  11.7.1.6 Multicollinearity.  11.7.2 Cox Regression Survival Analysis.  11.7.2.1 Effect of Drug Treatment.  11.7.2.2 Evaluation of Other Covariates.  11.8 Comparison of Programs.  11.8.1 SAS System.  11.8.2 SPSS Package.  11.8.3 SYSTAT System.

12. Canonical Correlation.  12.1 General Purpose and Description.  12.2 Kinds of Research Questions.  12.2.1 Number of Canonical Variate Pairs.  12.2.2 Interpretation of Canonical Variates.  12.2.3 Importance of Canonical Variates.  12.2.4 Canonical Variate Scores.  12.3 Limitations.  12.3.1 Theoretical Limitations.  12.3.2 Practical Issues.  12.3.2.1 Ratio of Cases to IVs.  12.3.2.2 Normality, Linearity, and Homoscedasticity.  12.3.2.3 Missing Data.  12.3.2.4 Absence of Outliers.  12.3.2.5 Absence of Multicollinearity and Singularity.  12.4 Fundamental Equations for Canonical Correlation.  12.4.1 Eigenvalues and Eigenvectors.  12.4.2 Matrix Equations.  12.4.3 Proportions of Variance Extracted.  12.4.4 Computer Analyses of Small-Sample Example.  12.5 Some Important Issues.  12.5.1 Importance of Canonical Variates.  12.5.2 Interpretation of Canonical Variates.  12.6 Complete Example of Canonical Correlation.  12.6.1 Evaluation of Assumptions.  12.6.1.1 Missing Data.  12.6.1.2 Normality, Linearity, and Homoscedasticity.  12.6.1.3 Outliers.  12.6.1.4 Multicollinearity and Singularity.  12.6.2 Canonical Correlation.  12.7 Comparison of Programs.  12.7.1 SAS System.  12.7.2 SPSS Package.  12.7.3 SYSTAT System.

13. Principal Components and Factor Analysis.  13.1 General Purpose and Description.  13.2 Kinds of Research Questions.  13.2.1 Number of Factors.  13.2.2 Nature of Factors.  13.2.3 Importance of Solutions and Factors.  13.2.4 Testing Theory in FA.  13.2.5 Estimating Scores on Factors.  13.3 Limitations.  13.3.1 Theoretical Issues.  13.3.2 Practical Issues.  13.3.2.1 Sample Size and Missing Data.  13.3.2.2 Normality.  13.3.2.3 Linearity.  13.3.2.4 Absence of Outliers among Cases.  13.3.2.5 Absence of Multicollinearity and Singularity.  13.3.2.6 Factorability of R.  13.3.2.7 Absence of Outliers among Variables.  13.4 Fundamental Equations for Factor Analysis.  13.4.1 Extraction.  13.4.2 Orthogonal Rotation.  13.4.3 Communalities, Variance, and Covariance.  13.4.4 Factor Scores.  13.4.5 Oblique Rotation.  13.4.6 Computer Analyses of Small-Sample Example.  13.5 Major Types of Factor Analysis.  13.5.1 Factor Extraction Techniques.  13.5.1.1 PCA vs. FA.  13.5.1.2 Principal Components.  13.5.1.3 Principal Factors.  13.5.1.4 Image Factor Extraction.  13.5.1.5 Maximum Likelihood Factor Extraction.  13.5.1.6 Unweighted Least Squares Factoring.  13.5.1.7 Generalized (Weighted) Least Squares Factoring.  13.5.1.8 Alpha Factoring.  13.5.2 Rotation.  13.5.2.1 Orthogonal Rotation.  13.5.2.2 Oblique Rotation.  13.5.2.3 Geometric Interpretation.  13.5.3 Some Practical Recommendations.  13.6 Some Important Issues.  13.6.1 Estimates of Communalities.  13.6.2 Adequacy of Extraction and Number of Factors.  13.6.3 Adequacy of Rotation and Simple Structure.  13.6.4 Importance and Internal Consistency of Factors.  13.6.5 Interpretation of Factors.  13.6.6 Factor Scores.  13.6.7 Comparisons among Solutions and Groups.  13.7 Complete Example of FA.  13.7.1 Evaluation of Limitations.  13.7.1.1 Sample Size and Missing Data.  13.7.1.2 Normality.  13.7.1.3 Linearity.  13.7.1.4 Multicollinearity and Singularity.  13.7.1.5 Outliers among Variables.  13.7.2 Principal Factors Extraction with Varimax Rotation.  13.8 Comparison of Programs.  13.8.1 SPSS Package.  13.8.2 SAS System.  13.8.3 SYSTAT System.

14. Structural Equation Modeling.  General Purpose and Description.  Kinds of Research Questions.  Adequacy of the Model.  Testing Theory.  Amount of Variance in the Variables Accounted for by the Factors.  Reliability of the Indicators.  Parameter Estimates.  14.2.6 Indirect Effects.  Group Differences.  Longitudinal Differences.  Multilevel Modeling.  Limitations to Structural Equation Modeling.  Theoretical Issues.  Practical Issues.  Fundamental Equations for Structural Equations Modeling.  Covariance Algebra.  Model Hypotheses.  Model Specification.  Model Estimation.  Model Evaluation.  Computer Analysis of Small Sample Example.  Some Important Issues.  Model Identification.  Estimation Techniques.  Assessing the Fit of the Model.  Model Modification.  Reliability and Proportion of Variance.  Discrete and Ordinal Data.  Multiple Group Models.  Mean and Covariance Structure Models.  Complete Examples of Structural Equation Modeling Analysis.  Model Specification for CFA.  Evaluation of Assumptions for CFA.  Model Modification.  SEM Model Specification.  SEM Model Estimation and Preliminary Evaluation.  Model Modification.  Comparison of Programs.  EQS.  LISREL.  SAS System.  AMOS.

15. Multilevel Linear Modeling.  15.1 General Purpose and Description.  15.2 Kinds of Research Questions.  15.2.1 Group Differences in Means.  15.2.2 Group Differences in Slopes.  15.2.3 Cross-Level Interactions.  15.2.4 Meta-Analysis.  15.2.5 Relative Strength of Predictors at Various Levels.  15.2.6 Individual and Group Structure.  15.2.7 Path Analysis at Individual and Group Levels.  15.2.8 Analysis of Longitudinal Data.  15.2.9 Multilevel Logistic Regression.  15.2.10 Multiple Response Analysis.  15.3 Limitations to Multilevel Linear Modeling.  15.3.1 Theoretical Issues.  15.3.2 Practical Issues.  15.3.2.1 Sample Size, Unequal-N, and Missing Data.  15.3.2.2 Independence of Errors.  15.3.2.3 Absence of Multicollinearity and Singularity.  15.4 Fundamental Equations.  15.4.1 Intercepts-Only Model.  15.4.1.1 The Intercepts-Only Model: Level-1 Equation.  15.4.1.2 The Intercepts-Only Model: Level-2 Equation.  15.4.1.3 Computer Analyses of Intercepts-Only Model.  15.4.2 Model with a First-Level Predictor.  15.4.2.1 Level-1 Equation for a Model with Level-1 Predictor.  15.4.2.2 Level-2 Equations for a Model with a Level-1 Predictor.  15.4.2.3 Computer Analyses of Model with a Level-1 Predictor.  15.4.3 Model with Predictors at First and Second Levels.  15.4.3.1 Level-1 Equation for Model with Predictors at Both Levels.  15.4.3.2 Level-2 Equations for Model with Predictors at Both Levels.  15.4.3.3 Computer Analyses of Model with Predictors at First and Second Levels.  15.5 Types of MLM.  15.5.1 Repeated Measures.  15.5.2 Higher-Order MLM.  15.5.3 Latent Variables.  15.5.4 Nonnormal Outcome Variables.  15.5.5 Multiple Response Models.  15.6 Some Important Issues.  15.6.1 Intraclass Correlation.  15.6.2 Centering Predictors and Changes in Their Interpretation.  15.6.3 Interactions.  15.6.4 Random and Fixed Intercepts and Slopes.  15.6.5 Statistical Inference.  15.6.5.1 Assessing Models.  15.6.5.2 Tests of Individual Effects.  15.6.6 Effect Size.  15.6.7 Estimation Techniques and Convergence Problems.  15.6.8 Exploratory Model Building.  15.7 Complete Example of MLM.  15.7.1 Evaluation of Assumptions.  15.7.1.1 Sample Sizes, Missing Data, and Distributions.  15.7.1.2 Outliers.  15.7.1.3 Multicollinearity and Singularity.  15.7.1.4 Independence of Errors: Intraclass Correlations.  15.7.2 Multilevel Modeling.  15.8 Comparison of Programs.  15.8.1 SAS System.  15.8.2 SPSS Package.  15.8.3 HLM Program.  15.8.4 MLwiN Program.  15.8.5 SYSTAT System.

16. Multiway Frequency Analysis.  16.1 General Purpose and Description.  16.2 Kinds of Research Questions.  16.2.1 Associations among Variables.  16.2.2 Effect on a Dependent Variable.  16.2.3 Parameter Estimates.  16.2.4 Importance of Effects.  16.2.5 Effect Size.  16.2.6 Specific Comparisons and Trend Analysis.  16.3 Limitations to Multiway Frequency Analysis.  16.3.1 Theoretical Issues.  16.3.2 Practical Issues.  16.3.2.1 Independence.  16.3.2.2 Ratio of Cases to Variables.  16.3.2.3 Adequacy of Expected Frequencies.  16.3.2.4 Absence of Outliers in the Solution.  16.4 Fundamental Equations for Multiway Frequency Analysis.  16.4.1 Screening for Effects.  16.4.1.1 Total Effect.  16.4.1.2 First-Order Effects.  16.4.1.3 Second-Order Effects.  16.4.1.4 Third-Order Effect.  16.4.2 Modeling.  16.4.3 Evaluation and Interpretation.  16.4.3.1 Residuals.  16.4.3.2 Parameter Estimates.  16.4.4 Computer Analyses of Small-Sample Example.  16.5 Some Important Issues.  16.5.1 Hierarchical and Nonhierarchical Models.  16.5.2 Statistical Criteria.  16.5.2.1 Tests of Models.  16.5.2.2 Tests of Individual Effects.  16.5.3 Strategies for Choosing a Model.  16.5.3.1 SPSS HILOGLINEAR (Hierarchical).  16.5.3.2 SPSS GENLOG (General Log-Linear).  16.5.3.3 SAS CATMOD and SPSS LOGLINEAR (General Log-Linear).  16.6 Complete Example of Multiway Frequency Analysis.  16.6.1 Evaluation of Assumptions: Adequacy of Expected Frequencies.  16.6.2 Hierarchical Log-Linear Analysis.  16.6.2.1 Preliminary Model Screening.  16.6.2.2 Stepwise Model Selection.  16.6.2.3 Adequacy of Fit.  16.6.2.4 Interpretation of the Selected Model.  16.7 Comparison of Programs.  16.7.1 SPSS Package.  16.7.2 SAS System.  16.7.3 SYSTAT System.

Appendix A: A Skimpy Introduction to Matrix Algebra.  A.1 The Trace of a Matrix.  A.2 Addition or Subtraction of a Constant to a Matrix.  A.3 Multiplication or Division of a Matrix by a Constant.  A.4 Addition and Subtraction of Two Matrices.  A.5 Multiplication, Transposes, and Square Roots of Matrices.  A.6 Matrix "Division" (Inverses and Determinants).  A.7 Eigenvalues and Eigenvectors: Procedures for Consolidating Variance from a Matrix.

Appendix B: Research Designs for Complete Examples.  B.1 Women's Health and Drug Study.  B.2 Sexual Attraction Study.  B.3 Learning Disabilities Data Bank.  B.4 Reaction Time to Identify Figures.  B.5 Clinical Trial for Primary Biliary Cirrhosis.

Appendix C: Statistical Tables.  C.1 Normal Curve Areas.  C.2 Critical Values of the t Distribution for α = .05 and .01, Two-Tailed Test.  C.3 Critical Values of the F Distribution.  C.4 Critical Values of Chi Square (χ²).  C.5 Critical Values for Squared Multiple Correlation (R²) in Forward Stepwise Selection.  C.6 Critical Values for Fmax (S²max/S²min) Distribution for α = .05 and .01.
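As a small illustration of the kind of computation Chapter 13 and Appendix A.7 describe (consolidating variance from a correlation matrix through its eigenvalues and eigenvectors), the following minimal NumPy sketch uses made-up data; it is not taken from the book, which carries out these analyses with SPSS, SAS, and SYSTAT.

```python
# Minimal principal-components extraction from a correlation matrix with NumPy.
# Illustrative only; data and variable names are invented for this sketch.
import numpy as np

rng = np.random.default_rng(1)

# Made-up data: 200 cases on 5 correlated variables driven by 2 latent factors.
n, p = 200, 5
latent = rng.normal(size=(n, 2))
loadings = rng.normal(size=(2, p))
data = latent @ loadings + rng.normal(scale=0.7, size=(n, p))

# Correlation matrix R, then its eigenvalues and eigenvectors (symmetric matrix).
R = np.corrcoef(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(R)
order = np.argsort(eigenvalues)[::-1]          # largest variance first
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# Each eigenvalue is the variance consolidated by one component; since the
# eigenvalues of a correlation matrix sum to p, dividing by p gives the
# proportion of total variance each component accounts for.
proportion = eigenvalues / p
print("Eigenvalues:", np.round(eigenvalues, 3))
print("Proportion of variance:", np.round(proportion, 3))
```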



