Data Quality in Survey Research

Special Issue of Journal of Marketing Theory and Practice; Proposal deadline 1 Oct 2023

INTEREST CATEGORY: MARKETING RESEARCH
POSTING TYPE: Calls: Journals

Author: Elizabeth Wilson


Data Quality in Survey Research: A Proposal for a Special Issue of Journal of Marketing Theory and Practice

Edited by Dr. Elizabeth J. Wilson and Dr. Robert S. Smith (both at Suffolk University, USA)

The focus of this special issue is data quality in survey research. The use of survey methodology is ubiquitous in social science research (de Leeuw et al. 2008; Gideon 2012; Groves et al. 2009). Information (data) from surveys may have profound effects on decision-making in business, politics/government, and healthcare, to name a few broad areas.

Threats to the quality of survey data are many and varied, given advances in electronic technology for data collection. For example, respondents may be humans acting unethically with the goal of capturing an incentive rather than offering true, meaningful information; such behavior constitutes fraud. Baumgartner and Weijters (2022) explore careless responding, while Pasternak (2019) describes “clickfarming” – a scheme in which a respondent sets up multiple networked accounts to complete surveys fraudulently simply to capture the incentive. In some instances, respondents may be non-human (i.e., bots) when humans are expected. In short, fraud is a serious threat to data quality.

Other threats to data quality stem from problematic research design. DeSimone and Harms (2018) describe how survey data can be compromised without proper respondent screening. Roster et al. (2017) found that topic sensitivity may affect data quality. Albaum et al. (2010) maintain that omitting a “prefer not to answer” option in a forced-choice scale may decrease data quality. On a brighter note, Ladik et al. (2007) suggest that university sponsorship of a survey can be a positive signal that increases response rates among legitimate (as opposed to fraudulent) respondents.

The Insights Association, a leading non-profit organization representing the marketing research industry, recently highlighted the importance of data quality.

In fact, insights are increasingly performed by a brand’s internal non-research resources, who opt for user-driven tools and “roughly right” speed to insights over expert researchers and traditional, slower and more accurate research approaches. As a result, ensuring both providers and users of sample and data services are well versed on data integrity is of the utmost importance to preserving sound decision-making and the reputation and value of the marketing research industry. (Insights Association, Data Integrity Initiative 2022)

In this special issue of Journal of Marketing Theory and Practice (JMTP), we will publish papers that specifically focus on data quality.

Possible Topics:

  • Survey fraud – how to recognize and combat fraud
  • Data collection strategies – via social media, email, panels, and other sources
  • Survey design (including but not limited to)
    • Evaluation and use of online panels (i.e., sample sourcing)
    • Metrics to evaluate data quality (i.e., in-survey checks)
  • Reporting data quality to clients and/or research stakeholders

The editors seek to publish leading-edge papers on any aspect of data quality in survey research in the wake of new technology, artificial intelligence (e.g., bots), and other novel developments. We encourage submissions with a strong practitioner orientation. The JMTP editors have no preferred methodologies; we are open to qualitative, quantitative, theoretical, and historical/case-study approaches.

Submission Guidelines and Deadlines

Send your proposal/abstract to the special issue editors by October 1, 2023.

The target date for full submissions is January 31, 2024. Papers targeting the special issue should be submitted through the JMTP submission system (https://mc.manuscriptcentral.com/jmtp) and will undergo the same review process as regularly submitted papers.

Special Issue Editors

Questions pertaining to the special issue should be submitted to the JMTP Editorial Office or directed to one of the special issue editors.

Dr. Elizabeth J. Wilson (ewilson@suffolk.edu) is Professor of Marketing in the Sawyer Business School at Suffolk University.

Dr. Robert S. Smith (rssmith@suffolk.edu) is Associate Professor of Marketing in the Sawyer Business School at Suffolk University.

References

Albaum, G., Roster, C. A., Wiley, J., Rossiter, J., & Smith, S. M. (2010). Designing web surveys in marketing research: Does use of forced answering affect completion rates? Journal of Marketing Theory and Practice, 18(3), 285-294.

Baumgartner, H., & Weijters, B. (2022). How to identify careless responders in surveys. In Measurement in Marketing (Vol. 19, pp. 121-141). Emerald Publishing Limited.

de Leeuw, E. D., Hox, J. J., & Dillman, D. A. (Eds.). (2008). International handbook of survey methodology. Taylor & Francis Group/Lawrence Erlbaum Associates.

Gideon, L. (2012). Introduction. In L. Gideon (Ed.), Handbook of survey methodology for the social sciences. Springer.

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey methodology. John Wiley & Sons.

Insights Association. (2022). Data Integrity Initiative. IS22-IAToolkit-v7.pdf.

Ladik, D. M., Carrillat, F. A., & Solomon, P. J. (2007). The effectiveness of university sponsorship in increasing survey response rate. Journal of Marketing Theory and Practice, 15(3), 263-271.

Pasternak, O. (2019). Market research fraud: Distributed survey farms exposed. Greenbook.org.

Roster, C. A., Albaum, G., & Smith, S. M. (2017). Effect of topic sensitivity on online survey panelists’ motivation and data quality. Journal of Marketing Theory and Practice, 25(1), 1-16.