Discovering statistics using IBM SPSS statistics by Andy Field, fifth edition – Summary chapter 14

Factorial designs are used when there is more than one independent variable. There are several types of factorial design:

  1. Independent factorial design (between groups)
    There are several independent variables, each measured using different entities.
  2. Repeated-measures (related) factorial design
    There are several independent variables, and the same entities are used in all conditions.
  3. Mixed design
    There are several independent variables. Some conditions use the same entities and some conditions use different entities.

INDEPENDENT FACTORIAL DESIGNS AND THE LINEAR MODEL
The calculation for factorial designs is similar to that for one-way ANOVA, but the explained (between-groups) variance is attributable to more than one independent variable. The model sum of squares (between-groups variance) consists of the variance due to the first variable, the variance due to the second variable, and the variance due to the interaction between the first and second variables.
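As an illustration of this partition, the sketch below fits a two-way independent design as a linear model in Python and prints the resulting ANOVA table, with one row per source of variance. The book itself works in SPSS, so this is only a sketch of the same idea; the column names (drink, gender, score) and all the numbers are invented.

    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    # Hypothetical 2 x 2 independent factorial design, two scores per cell.
    df = pd.DataFrame({
        "drink":  ["beer"] * 4 + ["water"] * 4,
        "gender": ["male", "male", "female", "female"] * 2,
        "score":  [65, 70, 60, 62, 50, 52, 48, 45],
    })

    # Fit the linear model with both main effects and their interaction.
    model = smf.ols("score ~ C(drink) * C(gender)", data=df).fit()

    # The ANOVA table has one row per source of variance: drink, gender,
    # their interaction, and the residual (unexplained) variance.
    print(anova_lm(model, typ=2))

Note that SPSS's default Type III sums of squares can differ from the Type II table printed here when the design is unbalanced; with equal group sizes the two agree.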

The model sum of squares uses the following formula:

SS_M = Σ n_g (x̄_g − x̄_grand)², with df_M = k − 1 (k = number of groups)

This is the model sum of squares and shows how much variance the independent variables explain together. It can also be useful to see how much of the total variance each independent variable explains on its own. This is done with the same formula, but applied to one independent variable at a time: the scores are collapsed across the levels of the other variable, so that all entities at a given level of the variable of interest form one group (this normally increases n, because multiple cells are combined into one larger group). The interaction sum of squares is then obtained by subtraction: SS_A×B = SS_M − SS_A − SS_B.

The residual sum of squares (SS_R), the error variance, shows how much variance cannot be explained by the independent variables. It uses the following formula:

SS_R = Σ s_k²(n_k − 1), with df_R = Σ(n_k − 1) = N − k

That is, for each group the variance is multiplied by the number of participants in that group minus one, and these values are added together; the degrees of freedom are added up in the same way. In a two-way design, an F-statistic is computed for each of the two main effects and for the interaction, by dividing that effect's mean square (its SS divided by its df) by the residual mean square (SS_R/df_R).
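To make the formulas concrete, here is a minimal sketch in Python (pandas only): it computes the model sum of squares from the cell means, the main-effect sums of squares by collapsing groups, the interaction by subtraction, the residual sum of squares from the per-group variances, and the three F-statistics. The factor names A and B and all scores are invented; this is the textbook arithmetic written out, not code from the book.

    import pandas as pd

    # Hypothetical 2 x 2 between-groups design with three scores per cell.
    data = pd.DataFrame({
        "A":     ["a1"] * 6 + ["a2"] * 6,
        "B":     (["b1"] * 3 + ["b2"] * 3) * 2,
        "score": [6, 7, 5, 4, 5, 3, 9, 8, 10, 3, 2, 4],
    })
    grand_mean = data["score"].mean()

    def effect_ss(groups):
        """Sum over groups of n_g * (group mean - grand mean)^2."""
        return sum(len(g) * (g.mean() - grand_mean) ** 2 for _, g in groups)

    cells = data.groupby(["A", "B"])["score"]

    # Model SS treats every cell (each A x B combination) as a group.
    ss_m = effect_ss(cells)

    # Main-effect SS collapse across the other variable, so n per group grows.
    ss_a = effect_ss(data.groupby("A")["score"])
    ss_b = effect_ss(data.groupby("B")["score"])

    # Interaction SS is what remains of the model SS after the main effects.
    ss_axb = ss_m - ss_a - ss_b

    # Residual SS: per-cell variance times (n - 1), summed; df_R sums (n - 1).
    ss_r = sum(g.var(ddof=1) * (len(g) - 1) for _, g in cells)
    df_r = sum(len(g) - 1 for _, g in cells)

    # Each effect's F is its mean square divided by the residual mean square.
    df_a, df_b = 2 - 1, 2 - 1          # two levels per factor in this example
    df_axb = df_a * df_b
    ms_r = ss_r / df_r
    print("F_A   =", (ss_a / df_a) / ms_r)
    print("F_B   =", (ss_b / df_b) / ms_r)
    print("F_AxB =", (ss_axb / df_axb) / ms_r)

Built-in routines (SPSS's GLM, or anova_lm as in the earlier sketch) do the same arithmetic; writing it out only makes the formulas above concrete.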

OUTPUT FROM FACTORIAL DESIGNS
A main effect should not be interpreted in the presence of a significant interaction involving that main effect: when the interaction is significant, the effect of one variable depends on the level of the other, so the main effect on its own is misleading.

Simple effects analysis looks at the effect of one independent variable at individual levels of the other independent variable. When judging interaction graphs, there are two general rules (a sketch of an interaction graph and a simple effects analysis follows the list):

  1. Non-parallel lines on an interaction graph indicate some degree of interaction, but how strong and whether the interaction is significant depends on how non-parallel the lines are.
  2. Lines on an interaction graph that cross are very non-parallel, which hints at a possible interaction, but crossing lines do not guarantee that the interaction is significant.
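The sketch below illustrates both ideas on the same kind of hypothetical data: it draws an interaction graph with matplotlib (roughly parallel lines suggest little interaction, crossing lines suggest one) and then runs a quick simple effects analysis with scipy's one-way ANOVA, testing the effect of one variable separately at each level of the other. All data and names are invented.

    import matplotlib.pyplot as plt
    import pandas as pd
    from scipy import stats

    # Hypothetical raw scores for a 2 x 2 between-groups design.
    data = pd.DataFrame({
        "A":     ["a1"] * 6 + ["a2"] * 6,
        "B":     (["b1"] * 3 + ["b2"] * 3) * 2,
        "score": [6, 7, 5, 4, 5, 3, 9, 8, 10, 3, 2, 4],
    })

    # Interaction graph: one line per level of B, cell means of A on the x-axis.
    cell_means = data.groupby(["B", "A"])["score"].mean().unstack()
    for level, means in cell_means.iterrows():
        plt.plot(means.index, means.values, marker="o", label=f"B = {level}")
    plt.xlabel("A")
    plt.ylabel("mean score")
    plt.legend()
    plt.show()   # non-parallel or crossing lines hint at an interaction

    # Simple effects: test the effect of A separately at each level of B.
    for level, subset in data.groupby("B"):
        groups = [g["score"].values for _, g in subset.groupby("A")]
        f, p = stats.f_oneway(*groups)
        print(f"Effect of A at B = {level}: F = {f:.2f}, p = {p:.3f}")

Note that this quick version tests each simple effect against its own error term; the simple effects procedure described in the book uses the error term from the full model, so the p-values can differ.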
