Does question design influence respondents' answers?

Authors

Vidal Díaz de Rada, Universidad Pública de Navarra

DOI:

https://doi.org/10.22325/fes/res.2022.83

Keywords:

item nonresponse, question decomposition, response option order, response effects

Abstract

This paper analyses the extent to which the order in which response options are administered affects item nonresponse and the choice of “undefined” answers. A second objective is to examine whether the strategy of “decomposing” a question into several questions reduces both the number of unanswered questions and the number of “undefined” answers. A telephone survey experiment shows that administering decomposed questions whose response options begin with the unfavourable categories increases “don't know” answers and lengthens response time. Conversely, questions whose response options begin with the unfavourable categories reduce refusals to answer. “Undefined” answers decrease when a question beginning with the favourable options is used.

Author biography

Vidal Díaz de Rada, Universidad Pública de Navarra

Bachelor's degree (1991) and PhD (1994) in Sociology, and degree in Market Research and Techniques, with the national award for Excellence in University Academic Performance. He collaborated in teaching at the Department of Social Research Techniques of the Universidad de Deusto and in 1994 joined the Department of Sociology of the Universidad Pública de Navarra, where he obtained a tenured position (profesor titular) in 2002. Since February 2013 he has been accredited as Full Professor (Catedrático de Universidad).

Published

2021-09-28

How to cite

Díaz de Rada, V. (2021). ¿Influye el diseño de las preguntas en las respuestas de los entrevistados?. Revista Española de Sociología, 31(1), a83. https://doi.org/10.22325/fes/res.2022.83