Validación de una Herramienta de Evaluación Basada en el Modelo Rasch para Medir la Resolución Creativa de Problemas en Estudiantes Mediante el Uso de TIC [Rasch Measurement Validation of an Assessment Tool for Measuring Students’ Creative Problem-Solving through the Use of ICT]

Keywords

Creative problem solving
ICT
education
gender
Rasch measurement
psychometrics

How to cite

Farida, F., Alamsyah, Y. A., Anggoro, B. S., Andari, T., & Lusiana, R. (2024). Validación de una Herramienta de Evaluación Basada en el Modelo Rasch para Medir la Resolución Creativa de Problemas en Estudiantes Mediante el Uso de TIC [Rasch Measurement Validation of an Assessment Tool for Measuring Students’ Creative Problem-Solving through the Use of ICT]. Pixel-Bit. Revista De Medios Y Educación. https://doi.org/10.12795/pixelbit.107973

Abstract

Despite the growing recognition of the importance of creative problem solving (CPS) through the use of ICT in education under an independent curriculum, comprehensive psychometric validation of CPS assessment instruments is lacking. This study aimed to develop and evaluate an assessment instrument for measuring students' CPS through the use of ICT, using the Rasch model. A total of 137 higher-education students participated as respondents. For this purpose, 20 items covering different aspects of CPS were created. Data analysis was carried out with the Winsteps and SPSS software. The Rasch model was used to confirm the validity and reliability of the newly developed measurement instrument. The findings of the Rasch analysis indicated a good fit between the assessment items and individual students. The items fit the Rasch model adequately, differentiated difficulty levels across items, and showed a satisfactory level of reliability. The Wright map analysis revealed interaction patterns between items and persons, discriminating effectively among students' ability levels. Notably, one item showed gender-based differential item functioning (DIF), favoring male students in terms of their response abilities. The study also found that students in the fourth semester exhibited higher average response abilities than students in the sixth and eighth semesters. In addition, significant differences in response abilities were observed between male and female students, as well as between students residing in urban and rural areas. These findings are crucial for educators, emphasizing the need to implement effective differentiation strategies.
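For context, a minimal sketch of the model named in the abstract: in its simplest dichotomous form (the abstract does not state how the 20 items were scored, so dichotomous scoring is an assumption here), the Rasch model gives the probability that person n succeeds on item i as a function of the gap between the person's ability \(\theta_n\) and the item's difficulty \(\delta_i\), both expressed in logits on a single common scale, which is what allows a Wright map to plot persons and items side by side:

\[
P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}
\]

In these terms, the gender-based DIF reported above corresponds to an item whose difficulty estimate is not invariant across subgroups, i.e. \(\delta_i^{(\text{male})} \neq \delta_i^{(\text{female})}\) by more than sampling error would allow, even after accounting for overall ability differences between the groups.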

https://doi.org/10.12795/pixelbit.107973


Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Copyright 2024 Pixel-Bit. Revista de Medios y Educación
