At least since Campbell and Fiske’s (1959) classic paper on measurement, researchers have been concerned with item format as a method variance source. The assumption is that if you use psychological scales with the same formats, correlations among those scales will be inflated. There have been few if any direct tests of item format method variance, so Ashley Nixon and I decided to conduct some experiments. Our recent paper in Occupational Health Science summarizes our results.
Item Format Method Variance
Campbell and Fiske’s argument was that each measure is driven by the underlying variable you are assessing plus the method you used to assess it. In more technical terms, the variability among people (or other entities) measured in your sample is due in part to method. If the same method is used to assess two variables, there will be shared or “common method variance” in both of your measures, and shared variance inflates correlations. One method feature of psychological scales, as typically used in organizational (and other) research, is item format: for example, are you asking people to agree or disagree with statements (agreement), or are you asking how often something occurs (frequency)? If item format is a source of method variance, we would expect correlations between two assessments to be higher when they use the same item format.
We conducted three experiments to compare agreement and frequency item formats using some common measures of work stressors and strains. Employed individuals completed surveys in which we varied the item formats.
- Experiment 1. Everyone received the stressor measures using both formats, first one and then the other, to see if correlations with strain measures were different between the agreement and frequency versions.
- Experiment 2. We randomly assigned people to get either the agreement or the frequency formats of the stressor measures to see if correlations were different with strain measures.
- Experiment 3. We randomly assigned people to one of four conditions in which the item formats matched (either agreement or frequency for stressor and strain measures), or did not match (stressors used agreement and strains used frequency format or the opposite).
Item Format Method Variance Doesn’t Exist
The results across all three experiments consistently showed few differences between formats. If anything, the matched formats produced smaller rather than larger correlations between stressors and strains. These results call into question the assumption that item format makes a difference. We wonder how many other assumed sources of method variance might turn out to have little effect on research results. Future research will be needed to check each one.