Online survey researchers should be cautious with trick questions

New studies suggest that researchers who set “trap” questions for respondents should do so with care, as these checks can affect the accuracy of their results.

Researchers use trap questions (also known as attention checks and instructional manipulation checks) to assess whether participants are paying attention to the instructions. However, most participants recognize these trick questions and become cautious about their answers, potentially altering a study’s results.

A pair of University of Michigan studies show that these instructional manipulation checks, or IMCs, which are popular measures used by researchers for online surveys, have unforeseen effects.

David Hauser, a U-M doctoral candidate, and colleague Norbert Schwarz of the University of Southern California found that answering a trap question changes the way people respond to later questions.

“IMCs cause participants to think harder when answering survey questions than they normally would in order to avoid potentially being tricked again,” Hauser said.

IMCs in surveys look like normal questions. However, hidden in a large block of instructions is a specific command that tells participants to ignore the question itself and submit a non-intuitive response instead. Participants who miss those instructions and answer the question normally are considered inattentive.

For instance, an IMC might involve this scenario: Under the topic “sports participation,” respondents are asked, “Which of these activities do you engage in regularly?” followed by a list of sports to select from.

However, above the question, embedded within a block of instructions, a command indicates that, in order to demonstrate attention, respondents should click the “other” option and enter “I read the instructions” in the corresponding text box. Following these instructions is scored as passing the trap.
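As a rough illustration, scoring such a check is simple: a response “passes” only if the respondent both selected “other” and typed the requested phrase. The minimal Python sketch below assumes hypothetical field names (“activity,” “other_text”) and is not taken from the study’s actual materials.

```python
# Minimal sketch of scoring the IMC described above.
# Field names and the pass phrase are illustrative assumptions.

def passed_imc(response: dict) -> bool:
    """Return True if the respondent followed the hidden instructions."""
    chose_other = response.get("activity") == "other"
    typed_phrase = response.get("other_text", "").strip().lower() == "i read the instructions"
    return chose_other and typed_phrase

responses = [
    {"activity": "soccer"},                                           # answered intuitively: fails
    {"activity": "other", "other_text": "I read the instructions"},   # followed instructions: passes
]

for r in responses:
    print(passed_imc(r))
```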

In the first U-M study, participants received a trap question and the Cognitive Reflection Test, a math test assessing the analytical abilities of participants. Half of the participants completed the trap question before the math test, whereas the other half completed the math test first.

Hauser and Schwarz found that completing a trap question first increased participants’ analytical thinking scores on the math test.

For the second study, participants received a trap question and a reasoning task assessing biased thinking. Again, half of the participants completed the trap question before the reasoning task, whereas the other half completed the reasoning task first.

The researchers found that completing the trap question first decreased biased thinking and led to more correct answers. Thus, completing a trap question made participants think more systematically about later questions.

Hauser believes many social scientists have used trick questions in their work, which may have affected their results. He says that while deeper thinking can sometimes be desirable for ensuring that participants are paying attention, it could also produce study results that would not occur when participants think as they normally do in everyday life.
