Newswise — WASHINGTON, D.C., May 15, 2014 — “Mischievous responders” make a game of intentionally providing inaccurate answers on anonymous surveys, a widespread problem that can distort research findings. However, new data analysis procedures may help minimize the impact of these “jokester youths,” according to research published online today in Educational Researcher, a peer-reviewed journal of the American Educational Research Association (AERA).


The study, “Inaccurate Estimation of Disparities Due to Mischievous Responders: Several Suggestions to Assess Conclusions,” by Joseph P. Robinson-Cimpian of the University of Illinois at Urbana-Champaign, suggests that mischievous responders are present in commonly used data sets and that their presence may severely alter impressions of relative risk between groups of adolescents. The article introduces novel sensitivity-analysis procedures for investigating and reducing the bias that mischievous responders often introduce in research on adolescents.

Mischievous responders are youths who provide extreme and potentially untruthful responses to multiple questions on self-administered questionnaires (SAQs). By providing misleading responses that they think are funny, these responders, even in small numbers, can lead researchers to wildly incorrect conclusions. In turn, the conclusions can lead to ineffective policymaking and may perpetuate negative stereotypes about marginalized groups.

In his article, Robinson-Cimpian demonstrates that even a very small proportion of mischievous responders can lead to exaggerated risk estimates for lesbian, gay, bisexual, and questioning (LGBQ), transgender, and disabled youths, with regard to drug usage, suicidal thoughts, and school disengagement. For example, potentially mischievous responders caused LGBQ–heterosexual disparities in recent cocaine/crack use to inflate from 4 percentage points to 12 percentage points.

To identify mischievous responders, Robinson-Cimpian’s method relies on detecting patterns of unusual answers provided by survey respondents.

“If we find that youths reporting to be gay are more likely than those reporting to be straight to say that they are blind and deaf and extremely tall and parenting multiple children all at the same time, then we might question whether the data are valid,” said Robinson-Cimpian. “Just like these jokester youths think it’s funny to say that they are gay and blind, they also think it’s funny to say that they are suicidal, engage in sexually risky behavior, and take drugs. And this can dramatically affect our estimates of risk.”
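To illustrate the identification idea, here is a minimal sketch, assuming a pandas DataFrame of 0/1 answers to a handful of individually rare “screener” items. The column names and the cutoff of three or more rare answers are hypothetical choices for illustration, not the article’s actual procedure.

```python
import pandas as pd

# Hypothetical example data: one row per respondent, 0/1 columns for "screener"
# items that are individually rare and jointly implausible.
df = pd.DataFrame({
    "is_blind":       [0, 0, 1, 0, 0],
    "is_deaf":        [0, 0, 1, 0, 0],
    "over_7_feet":    [0, 0, 1, 0, 0],
    "parents_4_kids": [0, 0, 1, 0, 0],
})

screener_items = ["is_blind", "is_deaf", "over_7_feet", "parents_4_kids"]

# Count how many low-frequency ("rare") answers each respondent gives.
rare_count = df[screener_items].sum(axis=1)

# Flag respondents whose pattern of rare answers is implausible; the cutoff of
# three or more rare answers is an illustrative assumption, not the article's rule.
df["flagged_mischievous"] = rare_count >= 3

print(df)
```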

SAQs are often used to estimate the psychological and health disparities between groups, including adoptees and non-adoptees, sexual minorities and non-minorities, racial/ethnic minorities and non-minorities, and individuals with and without disabilities, among others. The anonymity often provided by SAQs makes them an important tool in adolescent research, but Robinson-Cimpian cautions that researchers must take steps to help ensure the data are valid.

“Within the past decade, much research has been criticized for failure to replicate and for exaggerated results,” wrote Robinson-Cimpian in his article. He added, “If we want sound research and policy, we need to have sound data. The procedures introduced here have broad relevance to research and can be widely, and easily, implemented.”

The proposed approach does not require additional data collection, which makes it more practical, cost-effective, and applicable to already collected data. Robinson-Cimpian’s technique is unique in that it can be used in situations where it is impossible or unethical to verify information by comparing adolescents’ responses with those of their parents or by conducting in-person follow-up interviews with the respondents.

The four-step sensitivity analysis proposed by Robinson-Cimpian requires first identifying youths who provide high numbers of low-frequency responses and then comparing estimated disparities when including and excluding these youths.
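As a rough sketch of that comparison step, and assuming the hypothetical flag from the previous example plus made-up group and outcome columns, one could re-estimate a simple difference in outcome rates with and without the flagged respondents and see how much the estimate moves. The article’s full procedure is more involved than this difference-in-proportions illustration.

```python
import pandas as pd

# Hypothetical data: a binary group indicator, a binary outcome, and the flag
# produced in the previous sketch (all names are illustrative).
df = pd.DataFrame({
    "lgbq":                [1, 1, 1, 0, 0, 0, 0, 0],
    "used_cocaine":        [0, 0, 1, 0, 0, 0, 0, 0],
    "flagged_mischievous": [0, 0, 1, 0, 0, 0, 0, 0],
})

def disparity(data, group_col, outcome_col):
    """Simple difference in outcome rates between groups, in percentage points."""
    rates = data.groupby(group_col)[outcome_col].mean()
    return 100 * (rates.loc[1] - rates.loc[0])

# The identification step happened in the previous sketch; here we compare the
# estimated disparity when flagged respondents are included vs. excluded.
full_sample = disparity(df, "lgbq", "used_cocaine")
screened = disparity(df[df["flagged_mischievous"] == 0], "lgbq", "used_cocaine")

print(f"Disparity, full sample:        {full_sample:.1f} percentage points")
print(f"Disparity, flagged excluded:   {screened:.1f} percentage points")
```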

While some respondents may be mistakenly identified as mischievous, and some mischievous respondents may not provide enough low-frequency responses to be caught, the greater scientific and practical danger, according to Robinson-Cimpian, comes from performing no analysis at all to assess data validity and ensure robust results. That damage includes perpetuated stigma against minority groups.

For instance, this type of sensitivity analysis may have prevented well-intentioned research from being used by groups that claim LGBTQ identification leads to substance abuse and other risky behaviors, and thus advocate against using school resources to help LGBTQ youths, wrote Robinson-Cimpian in his article.

Funding Note
Robinson-Cimpian’s research was partially funded through the National Academy of Education/Spencer Postdoctoral Fellowship Program.

About the Author
Joseph P. Robinson-Cimpian is assistant professor of quantitative and evaluative research methodologies at the University of Illinois at Urbana-Champaign.

About AERA
The American Educational Research Association (AERA) is the largest national professional organization devoted to the scientific study of education. Founded in 1916, AERA advances knowledge about education, encourages scholarly inquiry related to education, and promotes the use of research to improve education and serve the public good. Find AERA on Facebook and Twitter.

This news release is available online.

###