Hierarchical Regression Analysis (2007-12-08)

In a hierarchical multiple regression, the researcher decides not only how many predictors to enter but also the order in which they enter the equation. Usually, the order of entry is based on logical or theoretical considerations. There are three predictor variables and one criterion variable in the following data set. A researcher decided the order of entry is X1, X2, and X3.

SPSS for Windows

1. Enter the data.
2. Choose Analyze / Regression / Linear.
Dependent: Select y and move it to the Dependent variable list (click the variable y, then click the right arrow).
Block 1 of 1, Independent(s): Move the first predictor variable, x1, to the Independent(s) box, then click the Next button.
Block 2 of 2: Move the predictor variable x2 to the Independent(s) box, then click the Next button.
Block 3 of 3: Move the predictor variable x3 to the Independent(s) box.
3. Click the Statistics button, check R squared change, then click Continue and OK.

SPSS Output

1. R Square and R Square Change

Order of entry:
Model 1 (enter X1 first): R square = .25. The predictor X1 alone accounts for 25% of the variance in Y. R2 = .25.
Model 2 (enter X2 next): R square = .582. The increase in R square is .582 - .25 = .332: X2 accounts for an additional 33% of the variance in Y after controlling for X1. R2 = .25 + .332 = .582.
Model 3 (enter X3 third): R square = .835. The increase in R square is .835 - .582 = .253: X3 accounts for an additional 25% of the variance in Y after X1 and X2 are partialed out. R2 = .25 + .332 + .253 = .835.

About 84% of the variance in the criterion variable is explained by the first (25%), second (33%), and third (25%) predictor variables.

2. Adjusted R Square

In this example there are only five subjects but three predictors. Recall that R square may be overestimated when a data set has few cases (n) relative to the number of predictors (k). Data sets with a small sample size and a large number of predictors show a greater difference between the obtained and adjusted R square (.25 vs. .000, .582 vs. .165, and .835 vs. .338).

3. F Change and Sig. F Change

If the R square change associated with a predictor variable is large, that predictor variable is a good predictor of the criterion variable.
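The R square change bookkeeping and the adjusted R square values above can be reproduced from the reported figures alone. Below is a minimal sketch: the article's raw five-case data set is not shown, so it works from the rounded R square values in the output, and small discrepancies from the SPSS table (e.g. .164 vs. .165) are rounding artifacts.

```python
# Reproduce the R square change and adjusted R square values reported
# above. N = 5 cases; models 1-3 contain k = 1, 2, 3 predictors.
# Only the rounded R square figures from the article are available,
# so results can differ from the SPSS table in the third decimal.

def adjusted_r2(r2, n, k):
    """Standard adjustment: 1 - (1 - R^2)(n - 1)/(n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

N = 5
r2_values = {1: 0.25, 2: 0.582, 3: 0.835}

previous = 0.0
for k in (1, 2, 3):
    r2 = r2_values[k]
    change = r2 - previous          # R square change for this block
    adj = adjusted_r2(r2, N, k)     # shrinks sharply when N is small
    print(f"Model {k}: R2 = {r2:.3f}, change = {change:.3f}, adjusted = {adj:.3f}")
    previous = r2
```

Note how the adjustment collapses the apparent fit: with five cases and three predictors, an obtained R square of .835 shrinks to roughly .34.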
In the first step, the predictor variable x1 was entered. This resulted in an R square of .25, which was not statistically significant (F Change = 1.00, p > .05). In the second step, x2 was added; this increased R square by .332 (33%), which was not statistically significant (F Change = 1.592, p > .05). In the third step, x3 was added; this increased R square by an additional .253 (25%), which was also not statistically significant (F Change = 1.592, p > .05).

4. ANOVA Table

Model 1: About 25% (2.5/10 = .25) of the variance in the criterion variable (Y) can be accounted for by X1.
The first model, which includes one predictor variable (X1), resulted in an F ratio of 1.000 with p > .05.

Model 2: About 58% (5.82/10 = .58) of the variance in the criterion variable (Y) can be accounted for by X1 and X2. The second model, which includes two predictors (X1 and X2), resulted in an F ratio of 1.395 with p > .05.

Model 3: About 84% (8.346/10 = .84) of the variance in the criterion variable (Y) can be accounted for by all three predictors (X1, X2, and X3). The third model, which includes all three predictors, resulted in an F ratio of 1.681 with p > .05.

Adjusted R square = 1 - (1 - R square)(N - 1)/(N - k - 1), where k is the number of predictor variables and N is the sample size.
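The F Change values in section 3 and the overall F ratios in the ANOVA table follow directly from the R square values and the degrees of freedom. The sketch below starts again from the rounded R square figures, so steps 2 and 3 only approximate the F values quoted in the SPSS output.

```python
# Compute F Change (per entry step) and the overall ANOVA F ratio from
# R square values, with N = 5 cases and one predictor added per block.
# Because the R square inputs are rounded to three decimals, steps 2
# and 3 only approximate the F values quoted in the article.

def f_change(r2_full, r2_reduced, n, k_full, m=1):
    """F test for the increase in R^2 when m predictors are added."""
    return ((r2_full - r2_reduced) / m) / ((1 - r2_full) / (n - k_full - 1))

def f_overall(r2, n, k):
    """Overall ANOVA F ratio for a model with k predictors."""
    return (r2 / k) / ((1 - r2) / (n - k - 1))

N = 5
r2 = {1: 0.25, 2: 0.582, 3: 0.835}

print(f"Step 1: F Change = {f_change(r2[1], 0.0, N, 1):.3f}")  # 1.000, as reported
print(f"Step 2: F Change = {f_change(r2[2], r2[1], N, 2):.3f}")
print(f"Step 3: F Change = {f_change(r2[3], r2[2], N, 3):.3f}")
for k in (1, 2, 3):
    print(f"Model {k}: overall F = {f_overall(r2[k], N, k):.3f}")
```

With only one denominator degree of freedom left by model 3, none of these F ratios comes close to significance, consistent with the p > .05 results reported above.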