Psychology 202b

Advanced Psychological Statistics II

Spring Semester, 2011


Syllabus for Spring, 2011

Instructor:

Jack L. Vevea (jvevea@ucmerced.edu)
Classroom Building 360
Office hours will follow an irregular schedule this semester. Except as noted, they will run from 9:00 to 10:30. I will hold office hours on the following dates: Wednesday 1/26, 2/2 (from 8:30 to 9:45), 2/9; Thursday 2/17 (from 8:30 to 9:45); Wednesday 3/2; Thursday 3/10; Wednesday 3/16 and 3/23; Thursday 3/31; Wednesday 4/6 (from 10:30 to 12:00); Tuesday 4/12 (from 10:30 to 12:00); Wednesday 4/20 (from 10:30 to 12:00); Thursday 4/28; Wednesday 5/4; Wednesday 5/11 (from 12:00 to 1:30); Thursday 5/12.

Telephone: (209) 228-4589

Text:

Required:
Keith, Timothy Z. (2006).
Multiple Regression and Beyond.
Boston: Pearson.

Meeting times:

We will meet Tuesdays and Thursdays from 3:00 to 4:15 P.M. in room 272 of the Classroom Building.

Course description:

Psychology 202b will focus on multiple regression and on more complex models that subsume multiple regression: structural equation modeling and hierarchical linear modeling.

Course learning goals:

In the class, you will:
  • learn basic linear algebra;
  • learn how to simulate data to illustrate statistical issues;
  • review simple and multiple linear regression;
  • learn about regression diagnostics and data transformations;
  • learn how to model and interpret interactions involving both categorical and continuous predictors;
  • investigate traditional approaches to sequential regression, including moderation and mediation;
  • learn about traditional and modern approaches to path analysis;
  • be introduced to exploratory and confirmatory factor analysis;
  • be introduced to structural equation modeling with latent variables;
  • be introduced to regression analysis with nested data structures;
  • learn basic approaches to analyzing longitudinal data.

Course learning outcomes:

By the end of the class, you will be able to:
  • perform basic matrix manipulations (assessed on the first homework assignment and the midterm exam);
  • simulate regression data (assessed on the first homework assignment);
  • perform and interpret basic multiple regressions (assessed on the second homework assignment and the midterm exam);
  • execute and interpret hypothesis tests about single regression parameters and groups of regression parameters (assessed on the second homework assignment and the midterm exam);
  • execute and interpret regression diagnostic procedures (assessed on the second homework assignment and the midterm exam);
  • calculate and interpret confidence intervals for regression parameters, conditional means, and individual outcomes (assessed on the third homework assignment);
  • identify appropriate transformations for regression data (assessed on the third homework assignment);
  • execute and interpret sequential regression models (assessed on the fourth homework assignment and the midterm exam);
  • estimate and interpret interactions involving both categorical and continuous variables (assessed on the fifth homework assignment and the midterm exam);
  • investigate mediation and moderation relationships in regression (assessed on the fifth homework assignment and the final exam);
  • perform and interpret traditional path analyses (assessed on the sixth homework assignment and the final exam);
  • interpret path analyses in the context of structural equation models (assessed on the sixth homework assignment and the final exam);
  • estimate and interpret exploratory and confirmatory factor analyses (assessed on the seventh homework assignment and the final exam);
  • interpret structural equation models with latent variables (assessed on the eighth homework assignment and the final exam);
  • execute and interpret simple regression models that involve nested data structures (assessed on the ninth homework assignment and the final exam);
  • execute and interpret basic analyses of longitudinal data (assessed on the ninth homework assignment and the final exam).

General comments on the purpose of the class:

This class continues Psychology 202a (Advanced Psychological Statistics I). The class focuses on the skill of thinking statistically, with emphasis on multiple linear regression and on extensions of regression that add the capacity to consider latent variables and to handle dependencies arising from nested and longitudinal data structures.

Prerequisites:

Graduate status in Psychology or consent of the instructor. The class assumes that students have had prior exposure to statistics equivalent to an undergraduate introductory course and Psychology 202a (Advanced Psychological Statistics I).

We will use the free, open-source statistical software R as well as the commercial product SAS. If you lack prior experience with these programs, you should talk to your instructor about a crash course to catch up with the class. (We will also use free student versions of the MPLUS and HLM software later in the class.)

Evaluation:

Grading will be based on a combination of nine written homework assignments, a midterm exam, and a comprehensive final exam. Homework will count for 40% of your final grade, and each exam will count for 30%.

These components combine into the final grade as follows. First, each component (homework, midterm exam, final exam) receives a grade point value: A+ = 4.3, A = 4.0, A- = 3.7, B+ = 3.3, B = 3.0, B- = 2.7, and so on. The weighted average of the grade points from the three components then determines your final grade. The following table shows the mapping of grade point averages to letter grades:

Grade Point Range      Letter Grade
GPA > 4.25             A+
3.75 < GPA < 4.25      A
3.50 < GPA < 3.75      A-
3.25 < GPA < 3.50      B+
2.75 < GPA < 3.25      B
2.50 < GPA < 2.75      B-
2.25 < GPA < 2.50      C+
1.75 < GPA < 2.25      C
1.50 < GPA < 1.75      C-
0.75 < GPA < 1.50      D
GPA < 0.75             F

In the rare case where a student is precisely on the cusp between two letter grades, classroom participation determines whether the student receives the higher or lower grade.
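For concreteness, here is a minimal R sketch of that computation (R is the software we use in class). The component letter grades in the example are hypothetical, and the grade point values below B- simply continue the "and so on" pattern given above.

    # Hypothetical example (not part of the official policy): compute a
    # weighted grade point average from component letter grades.
    grade.points <- c("A+" = 4.3, "A" = 4.0, "A-" = 3.7,
                      "B+" = 3.3, "B" = 3.0, "B-" = 2.7,
                      "C+" = 2.3, "C" = 2.0, "C-" = 1.7,
                      "D" = 1.0, "F" = 0.0)
    components <- c(homework = "A-", midterm = "B+", final = "A")   # hypothetical grades
    weights    <- c(homework = 0.40, midterm = 0.30, final = 0.30)  # weights from this syllabus
    gpa <- sum(weights * grade.points[components])
    gpa  # 0.40*3.7 + 0.30*3.3 + 0.30*4.0 = 3.67, an A- in the table above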

Academic Integrity:

Students should be familiar with University policies on academic honesty. You will find relevant information on the Student Judicial Affairs web page. In the overall context of that policy, the following information is specific to this class:

  • Cooperative work on the computational aspects of homework assignments is strongly encouraged, but you are expected to work independently on discussion and interpretation. The words you submit in your written assignments should be entirely your own.
  • You are expected to work entirely independently on exams. You will be allowed to use your books and notes during the exams. However, that policy exists to avoid the need for tedious memorization; you should not view access to your notes as a substitute for basic understanding of the material.

Course Outline:

    January 18
    Initial class meeting: introduction, using the class web page, scheduling issues.

    Obtaining and using R.

    Introduction to matrix algebra. What is a matrix? Matrix multiplication. Diagonal and triangular matrices.

    January 20
    Matrices, continued. The identity matrix. The inverse matrix. Singularity, positive definite matrices. Matrices in R.
    January 25
    Review of simple linear regression; basic concepts, estimation principles. Simulating data for regression problems.
    Reading: Keith, Chapter One.
    January 27
    Introduction to multiple regression.
    Reading: Keith, Chapter Two.
    Homework assignment one is available; due February 3.
    February 1
    Multiple regression, continued. Multiple regression in matrix form. The issue of collinearity.
    Reading: Keith, Chapter Three.
    February 3
    Regression diagnostics. Influence, leverage, and outliers.
    Reading: Keith, pages 56-66.
    Homework assignment two is available; due February 15.
    February 8
    Confidence intervals for regression parameters. Confidence intervals for the conditional mean.
    Reading: Keith, pages 66-72.
    February 10
    Confidence intervals for individual predictions. Transformations.
    Reading: no new reading.
    February 15
    Transformations. The Box-Cox procedure.
    Reading: no new reading.
    Homework assignment three is available; due March 1.
    February 17
    Sequential regression.
    Reading: Keith, pages 74-90.
    February 22
    The evils of stepwise and all-possible-subsets regression. The legitimate uses of stepwise procedures.
    Reading: Keith, pages 92-102.
    February 24
    No class meeting. (Instructor out of town.)
    March 1
    Regression with categorical predictors.
    Reading: Keith, Chapter Six.
    March 3
    Mixing categorical and continuous predictors. Power analysis for regression.
    Reading: Keith, Chapter Seven.
    Homework assignment four is available; due March 10.
    March 8
    Interactions. Catching up.
    Reading: Keith, pages 161-167.
    March 10
    Review for the midterm exam.
    March 15
    Midterm exam.
    March 17
    Review of midterm exam. Moderation and mediation.
    Reading: Keith, pages 168-178.
    March 22
    No class: Spring recess.
    March 24
    No class: Spring recess.
    March 29
    Moderation and mediation.
    March 31
    Introduction to path analysis.
    Reading: Keith, Chapters Nine, Ten.
    April 5
    Introducing structural equation modeling. Path analysis with SEM.
    Reading: Keith, Chapters Eleven, Twelve.
    Homework assignment five is available; due April 12.
    April 7
    Exploratory factor analysis. Confirmatory factor analysis.
    Reading: Keith, Chapter Thirteen.
    Homework assignment six is available; due April 19.
    April 12
    CFA, continued.
    Reading: Keith, Chapter Fourteen.
    April 14
    No class meeting.
    April 19
    SEM with latent variables.
    Reading: Keith, Chapter Fifteen.
    April 21
    More complex latent variable models.
    Reading: Keith, Chapter Sixteen.
    April 26
    SEM, continuation and summary.
    Reading: Keith, Chapter Seventeen.
    April 28
    Regression analysis with nested data structures. Using HLM software.
    Reading: Supplemental, to be distributed.
    May 3
    A special case of nested data structures: longitudinal analysis with HLM.
    Reading: no new reading.
    Homework assignment seven is available; due May 13.
    May 5
    Summary and review for final exam.
    Reading: no new reading.
    May 13
    Final exam, 3:00-6:00.