An Assessment of Technological Pedagogical Content Knowledge (TPACK) among Pre-service Teachers: A Rasch Model Measurement

 

 

 

 

Una evaluación del conocimiento del contenido pedagógico tecnológico (TPACK) entre profesores en formación: una medición del modelo de Rasch

 

 

 

 Dr. Komarudin Komarudin. Lecturer and Professor. Universitas Islam Negeri Raden Fatah Palembang. Indonesia

 Dr. Suherman Suherman. Lecturer and Researcher. University of Szeged, Hungary. Universitas Islam Negeri Raden Intan Lampung. Indonesia

 

 

 

 

 

 

 

 

 

 

 

Received: 2024/01/07 Revised: 2024/01/31 Accepted: 2024/06/29 Online First: 2024/07/08 Published: 2024/09/01

 

 

How to cite this article:

Komarudin, K., & Suherman, S. (2024). Evaluación del conocimiento tecnológico pedagógico del contenido (TPACK) entre los profesores en formación: modelo de medición Rasch [An Assessment of Technological Pedagogical Content Knowledge (TPACK) among Pre-service Teachers: A Rasch Model Measurement]. Pixel-Bit. Revista De Medios Y Educación, 71, 59–82. https://doi.org/10.12795/pixelbit.107599

 

 

ABSTRACT

The increasing integration of technology into educational environments has underscored the importance of Technological Pedagogical Content Knowledge (TPACK) for effective teaching in the 21st century. However, many pre-service teachers face challenges in proficiently accessing and utilising new technologies in their teaching practices. The existing literature lacks a thorough examination of the empirical aspects of TPACK instruments and their applicability across various educational settings and levels, particularly in non-Western contexts. This research aimed to evaluate and compare TPACK among pre-service teachers in Indonesia. A diverse sample of 405 Indonesian pre-service teachers from different disciplines participated by completing an online TPACK questionnaire. Confirmatory factor analysis and the Rasch model were used to validate the questionnaire, demonstrating a well-fitting model consistent with its theoretical framework and a satisfactory fit for individuals and items. Evaluation of TPACK among pre-service elementary, preschool and mathematics education teachers revealed superior performance by pre-service elementary school teachers. The instrument's robust psychometric properties make it suitable for exploring TPACK. This research lays the groundwork for further investigation of the empirical dimensions of TPACK in diverse educational contexts.

 

 

 

 

RESUMEN

Currently, the integration of technology in 21st-century education is becoming increasingly important; however, many pre-service teachers face difficulties in accessing and utilising new technologies in their teaching practices. Despite the growing importance of technology, the existing literature still lacks empirical examination of TPACK instruments and their application in various educational contexts, particularly in non-Western countries such as Indonesia. This study involved 405 Indonesian pre-service teachers from various disciplines who completed an online TPACK questionnaire. Confirmatory factor analysis and the Rasch model, used to validate this questionnaire, indicated that the model aligns with the theoretical framework and shows a satisfactory fit for persons and items. The evaluation results showed that pre-service elementary school teachers exhibited superior TPACK performance compared to pre-service early childhood education and mathematics teachers. The strong psychometric properties of this instrument make it suitable for further exploration of TPACK in various educational contexts. This research lays the groundwork for further investigation into the empirical dimensions of TPACK in diverse educational settings.

 

 

 

PALABRAS CLAVES· KEYWORDS

Pre-service teachers; Technological Pedagogical Content Knowledge (TPACK); validation questionnaire; Rasch model; Technological Education

Maestros en formación; Conocimiento tecnológico pedagógico del contenido (TPACK); cuestionario de validación; modelo Rasch; Educación tecnológica

 

 

 

 

 

1. Introduction

The integration of technology into educational settings has become increasingly important. Technological Pedagogical Content Knowledge (TPACK) has emerged as a crucial component of effective teaching in the 21st century. TPACK refers to teachers' ability to integrate technology into their teaching to enhance learning outcomes (Roussinos & Jimoyiannis, 2019). The TPACK framework has been widely adopted as a guide to understanding and developing technological and pedagogical knowledge among teachers. Recent digital teaching competence frameworks, such as DigCompEdu, further reinforce the importance of digital competence in education (Haşlaman et al., 2024). These frameworks provide detailed guidelines and standards for educators to effectively use digital tools and resources, ensuring that technology integration is pedagogically sound and contextually relevant. The synergy between the TPACK model and frameworks such as DigCompEdu highlights the need for a comprehensive approach to teacher training, focussing not only on the use of technology, but also on its pedagogical application to foster enhanced learning experiences (Redecker & Punie, 2017). In other words, integrating digital tools into learning activities makes them much faster and more accessible (Guillén-Gámez et al., 2024; Komarudin et al., 2024).

The first two decades of the 21st century have seen significant changes in preservice teacher education, particularly with the increasing availability of technology in classrooms (Almazroa & Alotaibi, 2023). However, in Indonesia, many pre-service teachers face obstacles to accessing and effectively using new technologies in their teaching. These challenges include limited access to technological resources, inadequate training in technology integration, and a lack of institutional support for technological initiatives. Many teachers struggle to effectively incorporate technology into their classrooms (Abedi et al., 2024; Bolyard et al., 2024; Bray & Tangney, 2017; Park & Scanlon, 2024). Studies have identified the lack of technological and pedagogical content knowledge as a significant barrier for teachers to use technology in teaching (Ardiç & Isleyen, 2017; Kind, 2009; Stoilescu, 2015), highlighting the need for tools to assess technological knowledge (Smith & Zelkowski, 2023). Despite participating in technological professional development, teachers often fail to integrate available technology into classroom instruction (Fütterer et al., 2023; Lawless & Pellegrino, 2007).

Recognising this challenge is essential as it emphasises the critical importance of TPACK among pre-service teachers. Equipping future teachers with TPACK enables them to effectively integrate technology into their teaching practices and enhances student learning experiences (Elmaadaway & Abouelenein, 2023). By synthesising technological expertise with pedagogical and content knowledge, pre-service teachers are better prepared to navigate the complexities of modern education, fostering innovation and equipping students with the skills necessary for success in the digital age.

Self-report measures have been developed to assess teachers' confidence levels and perceptions about the effectiveness of technology in educational settings. Previous research has underscored the importance of TPACK in various educational settings, demonstrating its ability to enhance teaching practices and student learning outcomes. Koh (2019) and Baran et al. (2019) highlighted the positive impact of TPACK on teachers' instructional strategies and their confidence in integrating technology into classrooms. Zelkowski et al. (2013) developed and validated an instrument to measure the TPACK of secondary mathematics pre-service teachers in the United States, finding the construct reliable and valid. However, they noted that pre-service teachers struggled to discern self-report domains, such as Pedagogical Content Knowledge (PCK), Technological Content Knowledge (TCK) and Technological Pedagogical Knowledge (TPK). Furthermore, a study by Ong & Annamalai (2024) focused on developing 21st-century TPACK skills to create a staged model of ICT tasks for communication, collaboration, critical thinking, and creative thinking. Their research found that 21st-century TPACK skills were missing from the planned curriculum, while content knowledge and pedagogical content knowledge were emphasised. This research contributes to the larger effort to prepare pre-service mathematics teachers for effective technology integration.

Another study by Smith & Zelkowski (2023) validated a TPACK questionnaire instrument for middle- and high-school mathematics teachers in the United States, originally developed in Australia. The research, which involved a comparable national sample in the US, revealed differences in the factor structure of the Australian instrument within the US context. The finding led to the creation of a new validated instrument, TPACK-M-US, tailored for US pre-service secondary mathematics teachers. The study provided three sources of evidence for the validity of the instrument and discussed its appropriate uses and interpretations, emphasising the importance of validation research in educational settings. However, a limitation was that all data were self-reported, which could lead to an overestimation or an underestimation of TPACK by US participants.

Furthermore, Li et al. (2023) created and validated a TPACK scale for secondary mathematics teachers in China. The results demonstrated strong reliability and validity, making the scale a robust tool to assess TPACK and guide professional development and technology integration policies within Chinese mathematics education. Additionally, Sofyan et al. (2023) validated the TPACK instrument for the evaluation of elementary school teachers in Indonesia. Their research found that the items were valid and reliable to evaluate teacher TPACK and Internet use. However, the study was limited to focussing on the level of TPACK in classroom settings. Furthermore, Martin et al. (2024) developed and validated a self-audit survey for primary school pre-service teachers, which was also found to be valid and reliable. The limitation of their research was that the instrument needed to include items related to technological changes, especially the rapid evolution of artificial intelligence.

However, the existing literature lacks a comprehensive exploration of the empirical attributes of TPACK and its application across a wide range of educational settings and levels, particularly in non-Western contexts. Challenges, such as limited access to technological resources, inadequate training in technology integration, and insufficient institutional support for technology-related initiatives, exacerbate this gap. Most studies have mainly focused on Western countries, leaving a gap in our understanding of how TPACK operates in diverse cultural and educational environments, such as Indonesia. Furthermore, limited research examines how TPACK levels vary between different disciplines, including elementary, pre-school, and mathematics education pre-service teachers. Another significant gap arises from the reliance on self-reported measures to evaluate TPACK, raising concerns about potential response bias and its impact on the precision and consistency of research findings.

Furthermore, we aimed to refine and validate robust assessment tools that can accurately measure the TPACK levels of pre-service teachers. This effort contributes to optimising teacher education curricula and better-preparing teachers for the digital demands of contemporary classrooms. By achieving these objectives, this research aimed to offer valuable information on the effective measurement and enhancement of TPACK, thus supporting the advancement of teacher education and the seamless integration of technology into teaching practices across diverse educational settings.

 

1.1 Technological Pedagogical Content Knowledge

The concept of Technological Pedagogical Content Knowledge or TPACK serves as a framework for understanding and describing the types of knowledge a teacher needs to effectively practice pedagogy and improve conceptual understanding by integrating technology into the learning environment. Fundamentally, TPACK revolves around the relationship between subject matter, technology, and pedagogy (Elas et al., 2019; Irmak & Yilmaz Tüzün, 2019; Nordin et al., 2013; Reyes Jr et al., 2017). The interaction among these three components has the strength and appeal to foster active learning focused on learners (Malik et al., 2019). TPACK, one of the most recognized theoretical frameworks, was developed by Mishra & Koehler (2006) to ensure the integration and representation among technology, pedagogy, and content components. TPACK denotes the understanding that teachers need to effectively incorporate technology into their teaching across various content areas (Luik et al., 2024). This framework highlights that effective technology integration requires a nuanced understanding of the dynamic relationship between pedagogy, content, and technology, ultimately aiming to enhance educational outcomes and foster more engaging learning experiences. Mishra & Koehler (2006) emphasise that TPACK is not a universal skill applicable in the same way for every teacher, but rather a form of knowledge that varies across different curricula and teaching philosophies. They state that "quality teaching requires developing a nuanced understanding of the complex relationships between technology, content, and pedagogy and using this understanding to develop appropriate, context-specific strategies and representations". Thus, the learning paradigm shifts from teacher-centred to learner-centred. Consequently, the basic theory of TPACK empowers teachers to develop the skills needed to make informed decisions about integrating technology into teaching, ensuring that its use supports students' understanding of the subject matter.

However, it is crucial for teachers to integrate technology with their content and pedagogical knowledge. A tangible example of TPACK is when a mathematics teacher employs simulation software to assist students in grasping abstract concepts. Through a combination of strong subject knowledge, sophisticated pedagogical skills, and judicious use of technology, learning becomes not only more engaging but also more effective. Thus, TPACK emerges as key to shaping a generation that is not only technologically skilled, but also capable of critical thinking (Maskur et al., 2022), problem solving (Supriadi et al., 2024), and creative thinking (Suherman & Vidákovich, 2024), and prepared to face the challenges ahead.

 

1.2 The Components of Technological Pedagogical Content Knowledge

In the TPACK framework, there exists an interconnected relationship between its constituent components, namely content knowledge (CK), pedagogical knowledge (PK), and technological knowledge (TK). They overlap and influence each other in the context of learning. A holistic understanding of how these three dimensions relate and interact is crucial to support effective learning processes. The following is a detailed description of the basic theory of TPACK. Furthermore, the TPACK framework is illustrated in Figure 1 (Mishra & Koehler, 2006).

Figure 1

The Dimensions of TPACK

 

Content Knowledge (CK) refers to knowledge of the subject matter to be learned, as outlined in the curriculum. It encompasses concepts, theories, ideas, frameworks, methods, and real-world applications (Flores-Castro et al., 2024).

Pedagogical Knowledge (PK) encompasses in-depth knowledge related to the theory and practice of teaching and learning, covering goals, processes, learning methods, evaluation, strategies, and more. It requires an understanding of the cognitive, affective, and social dimensions of learning, as well as of learning theories and their practical application (Saubern et al., 2020).

Technology Knowledge (TK) includes the technology basics that support learning, such as software, animation programmes, Internet access, molecular models, and virtual laboratories. Teachers must be proficient in processing information and communicating with ICT in learning environments (Malik et al., 2019).

Pedagogical Content Knowledge (PCK) involves the interaction and intersection between pedagogy (P) and subject matter (C). PCK is the ability to transform content or subject matter for teaching purposes, including the learning process related to the subject matter and the assessment system (Saubern et al., 2020). Technology Content Knowledge (TCK) encompasses the relationship between technology and subject matter, understanding how technology can support and influence other components. It involves technological proficiency and subject matter domains (Mishra & Koehler, 2006). Technology Pedagogy Knowledge (TPK) integrates PK and TK, emphasising how technology can be applied effectively in teaching. It requires an understanding of the advantages and disadvantages of technology in the context of subject matter and the learning process (Schmidt et al., 2009).

Technological Pedagogical Content Knowledge (TPACK) integrates PK, CK and TK, encompassing a form of learning in which the ability to use technology in an integrated way is inseparable from its constituent components: content (C), pedagogy (P), and technology (T). TPACK requires multiple interactions and combinations among components: subject matter, pedagogy, and technology (Mishra & Koehler, 2006). Teachers need the ability to effectively integrate technology into their teaching strategies to align with the subject matter and the needs of students.
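As a purely illustrative sketch (not part of the study's materials), the composite constructs described above can be represented as intersections of the three base domains; the Python structure and names below are hypothetical and serve only to make the framework's logic concrete.

```python
# Illustrative representation of the TPACK framework (hypothetical labels).
# Each composite construct combines the base domains described above:
# content (C), pedagogy (P), and technology (T).
BASE_DOMAINS = {
    "C": "Content Knowledge",
    "P": "Pedagogical Knowledge",
    "T": "Technological Knowledge",
}

COMPOSITES = {
    "PCK":   {"P", "C"},        # pedagogy x content
    "TCK":   {"T", "C"},        # technology x content
    "TPK":   {"T", "P"},        # technology x pedagogy
    "TPACK": {"T", "P", "C"},   # full integration of all three domains
}

for name, domains in COMPOSITES.items():
    labels = " + ".join(BASE_DOMAINS[d] for d in sorted(domains))
    print(f"{name}: {labels}")
```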

 

2. Methodology

2.1. Participants

The research enlisted 405 pre-service teachers from various public and private universities. Among these participants, 52.8% identified as female, while 46.7% identified as male, with an average age of 19.58 years (SD = 1.006). Participants were selected from various districts and villages, representing a spectrum of living environments, ages, majors, and university types. Ethical clearance for the study was obtained from the Institutional Review Board of Universitas Islam Negeri Raden Fatah Palembang, Indonesia, ensuring adherence to ethical guidelines. Before participating, all individuals provided their informed consent. Further demographic details of the participants are presented in Table 1.

Table 1

Characteristics of the Participants

Characteristics            n      Frequency (%)
Gender
  Female                   215    52.8
  Male                     190    46.7
Age
  17                       4      1.0
  18                       46     11.3
  19                       146    35.9
  20                       150    36.9
  21                       40     9.8
  22                       19     4.7
Type of universities
  Private                  267    65.6
  Public                   138    33.9
Major
  PGMI                     181    44.5
  PIAUD                    106    26.0
  PSPM                     118    29.0
Living place
  City                     184    45.2
  Suburb                   221    54.3

N = 405; Mage = 19.58; SD = 1.006; PGMI = Elementary teacher programme; PIAUD = Preschool teacher programme; PSPM = Mathematics teacher programme

2.2. Instrument

The TPACK instrument developed by Schmidt et al. (2009) served as the foundation of this research. Adapted to the Indonesian context, the instrument comprised seven dimensions. The first dimension addressed technology knowledge and comprised five items. The second dimension focused on content knowledge, encompassing 12 items. Pedagogical knowledge constituted the third dimension, comprising seven items. The fourth dimension was related to pedagogical content knowledge, with four items. The fifth dimension was related to technological content knowledge, featuring four items. The sixth dimension addressed technological pedagogical knowledge with five items. Lastly, the seventh dimension addressed technological pedagogical content knowledge with eight items. The participants' responses were recorded using a five-point Likert scale, ranging from strongly agree (5) to strongly disagree (1).
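For clarity, the structure of the adapted instrument can be summarised as below. This is an illustrative sketch only: the dimension abbreviations and item counts come from the description above, while the variable names are assumptions and the intermediate Likert labels are not specified in the text.

```python
# Structure of the adapted TPACK questionnaire as described above:
# seven dimensions, 45 items in total, five-point Likert responses.
DIMENSIONS = {
    "TK":    5,   # Technology Knowledge
    "CK":   12,   # Content Knowledge
    "PK":    7,   # Pedagogical Knowledge
    "PCK":   4,   # Pedagogical Content Knowledge
    "TCK":   4,   # Technological Content Knowledge
    "TPK":   5,   # Technological Pedagogical Knowledge
    "TPACK": 8,   # Technological Pedagogical Content Knowledge
}

# Only the scale endpoints are reported in the text.
LIKERT_ENDPOINTS = {1: "strongly disagree", 5: "strongly agree"}

assert sum(DIMENSIONS.values()) == 45  # total number of items
```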

 

2.3. Data Analysis

Participants voluntarily completed the questionnaire, and their identities were kept confidential. To evaluate the validity of the questionnaire, Confirmatory Factor Analysis (CFA) was conducted, with model fit assessed using the Comparative Fit Index (CFI), the Tucker-Lewis Index (TLI), the Root Mean Square Error of Approximation (RMSEA), and the Standardised Root Mean Square Residual (SRMR). The model fit criteria were established as CFI > .90, TLI > .90, RMSEA < .08, and SRMR < .06 (Hu & Bentler, 1999).
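A minimal sketch of how these cut-offs translate into a decision rule is shown below; this is illustrative code, not the authors' analysis script, the function name is hypothetical, and the example values are the fit indices reported in Section 3.1.

```python
def meets_cfa_fit_criteria(cfi: float, tli: float, rmsea: float, srmr: float) -> bool:
    """Check global model fit against the cut-offs used in this study:
    CFI > .90, TLI > .90, RMSEA < .08, SRMR < .06 (Hu & Bentler, 1999)."""
    return cfi > 0.90 and tli > 0.90 and rmsea < 0.08 and srmr < 0.06

# Example with the indices reported in Section 3.1
# (CFI = .92, TLI = .91, RMSEA = .05, SRMR = .04):
print(meets_cfa_fit_criteria(cfi=0.92, tli=0.91, rmsea=0.05, srmr=0.04))  # True
```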

Additionally, Rasch analysis was used to further assess construct validity. This analysis evaluated the fit of individual items using the infit and outfit mean square (MNSQ) statistics, with acceptable values ranging from 0.5 to 1.5 (Boone et al., 2014), together with a positive point-measure correlation (PTMA). Differential Item Functioning (DIF) analysis was also performed to identify potential bias toward specific sample groups.
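The item-level criteria can likewise be expressed as a simple screening rule. The sketch below is illustrative (not the authors' code); the function name is hypothetical and the example values are taken from Table 2.

```python
def flag_item_fit(infit_mnsq: float, outfit_mnsq: float, ptma: float) -> dict:
    """Apply the Rasch item-fit criteria used in this study:
    MNSQ between 0.5 and 1.5 (Boone et al., 2014) and a positive
    point-measure correlation (PTMA)."""
    return {
        "infit_ok": 0.5 <= infit_mnsq <= 1.5,
        "outfit_ok": 0.5 <= outfit_mnsq <= 1.5,
        "ptma_ok": ptma > 0,
    }

# Example: item TK5 from Table 2 (infit 1.60, outfit 1.90, PTMA .53) is flagged
# for elevated MNSQ but retained because of its positive PTMA.
print(flag_item_fit(1.60, 1.90, 0.53))
```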

Furthermore, descriptive and comparative analyses were conducted to profile students' TPACK and discern differences among teacher groups. Ordinal student responses were converted to logit values derived from the Rasch analysis to estimate attitude levels, representing student performance in different aspects of a single trait (Boone et al., 2014). Data were analysed using SPSS version 29, SmartPLS version 4, and Winsteps.
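For readers less familiar with this logit conversion, the rating-scale formulation of the Rasch model, which is commonly applied to Likert-type data, gives the probability that person n selects category k of item i as shown below. The paper does not report its exact parameterisation, so this is offered as background rather than as the authors' specification.

```latex
P(X_{ni}=k) \;=\;
\frac{\exp\!\Big(\sum_{j=0}^{k}\big(\theta_n-\delta_i-\tau_j\big)\Big)}
     {\sum_{m=0}^{M}\exp\!\Big(\sum_{j=0}^{m}\big(\theta_n-\delta_i-\tau_j\big)\Big)},
\qquad \tau_0 \equiv 0
```

Here θ_n is the person's location (in logits), δ_i the item difficulty, τ_j the category thresholds, and M = 4 for a five-point scale. Persons and items are thus placed on a common logit scale, which is what permits the group comparisons reported in Section 3.3.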

 

3. Analysis and results

3.1 The Validity of the Instrument

Validity analysis evaluates the quality of the questionnaire based on the theoretical model and the parameters of individual items (see Table 2). The results of the Confirmatory Factor Analysis (CFA) demonstrated satisfactory results for TPACK with seven latent variables: χ² = 2051.845, χ²/df = 2.262, p < .001, CFI = .92, TLI = .91, RMSEA = .05, and SRMR = .04. The factor loadings derived from the CFA consistently ranged from .45 to .85, indicating that the items align with their intended latent constructs (see Figure 2). It should be noted that all the questionnaire items effectively captured the dimensions of TPACK within each latent variable.

 

Table 2

The Item Validity of TPACK Based on the CFA and Rasch Analysis

Items      CFA Factor Loading   Measure   SE    Infit MNSQ   Outfit MNSQ   PTMA

F1: Technology Knowledge (TK)
TK2        .63                  -.54      .09   1.33         1.39          .54
TK3        .66                  -1.07     .09   1.23         1.24          .56
TK5        .69                  -.17      .09   1.60         1.90          .53
TK6        .74                  .44       .09   1.27         1.29          .62
TK7        .73                  .15       .09   1.31         1.34          .61

F2: Content Knowledge (CK)
CKL1       .73                  .44       .09   .98          .98           .68
CKL2       .73                  .65       .09   1.08         1.08          .68
CKL3       .77                  .46       .09   .87          .86           .72
CKM1       .45                  .57       .09   1.79         1.83          .48
CKM2       .59                  .40       .09   1.31         1.36          .60
CKM3       .63                  .47       .09   1.22         1.24          .63
CKS1       .68                  .29       .09   1.10         1.09          .67
CKS2       .78                  .52       .09   .83          .82           .74
CKS3       .71                  .48       .09   .97          1.01          .67
CKT1       .70                  -.35      .09   .94          .97           .66
CKT2       .70                  -.28      .09   .90          .91           .66
CKT3       .69                  .17       .09   1.07         1.05          .65

F3: Pedagogical Knowledge (PK)
PK1        .80                  -.21      .09   .89          .88           .72
PK2        .77                  -.36      .09   .84          .84           .70
PK3        .79                  -.42      .09   .92          .90           .70
PK4        .75                  -.39      .09   .99          .97           .67
PK5        .75                  -.27      .09   1.07         1.07          .67
PK6        .82                  .04       .09   .81          .80           .74
PK7        .80                  -.08      .09   .82          .81           .73

F4: Pedagogical Content Knowledge (PCK)
PCK1       .79                  .04       .09   .76          .74           .74
PCK2       .81                  .22       .09   .79          .78           .74
PCK3       .72                  .18       .09   .97          .95           .68
PCK4       .81                  .15       .09   .80          .78           .75

F5: Technological Content Knowledge (TCK)
TCK1       .80                  .01       .09   .88          .86           .72
TCK2       .77                  .00       .09   .84          .84           .72
TCK3       .83                  .05       .09   .88          .85           .75
TCK4       .75                  .07       .09   1.00         .98           .70

F6: Technological Pedagogical Knowledge (TPK)
TPK1       .77                  -.20      .09   .88          .85           .72
TPK2       .85                  -.36      .09   .79          .78           .74
TPK3       .71                  -.66      .09   1.32         1.28          .61
TPK4       .76                  -.38      .09   1.03         1.01          .67
TPK5       .85                  -.20      .09   .74          .72           .74

F7: Technological Pedagogical Content Knowledge (TPACK)
TPACK1     .82                  .16       .09   .81          .80           .74
TPACK2     .83                  .07       .09   .77          .76           .76
TPACK3     .80                  -.30      .09   .80          .80           .73
TPACK4     .78                  -.41      .09   .90          .90           .71
TPACK5     .82                  .19       .09   .86          .85           .74
TPACK6     .78                  .15       .09   .93          .93           .72
TPACK7     .82                  .05       .09   .83          .82           .76
TPACK8     .71                  .23       .09   1.09         1.08          .69

Note: Measure, SE, Infit MNSQ, Outfit MNSQ, and PTMA are estimates from the Rasch analysis.

 

Rasch analysis yielded favourable MNSQ values for infit (Minfit = 0.98) and outfit (Moutfit = 1.02), indicating that the questionnaire items effectively assess the TPACK of pre-service teachers. However, for items TK5 and CKM1, the infit and outfit MNSQ values exceeded 1.5 (see Figure 3). These items were nevertheless considered acceptable because of their positive point-measure correlations, and removing them from the questionnaire would have compromised the theoretical coverage of the measurement.

Regarding the parametric properties of the items, the logit measure of the overall items indicated proximity to 0 (Mlogit = -0.03, SD = 0.36), suggesting that the measured items were located at a moderate level (see Figure 3). The questionnaire's most challenging items were CKL2 and CKM1, with students predominantly providing lower scores in their responses (logit measures = 0.65 and 0.57, respectively). In contrast, the least challenging item was TK3 (logit measure = -1.07), where students consistently provided high confidence scores.

Furthermore, the Rasch analysis evaluated dimensionality, revealing that the variance explained by the TPACK measures exceeded the critical value of 35%. This indicates that the questionnaire essentially measures a single underlying TPACK dimension.

Figure 2

CFA Model Fit

 

Figure 3

Wright Map of Items

 

Figure 4

DIF Analysis in Three Different Pre-service Teacher Majors

The DIF analysis, conducted through Rasch analysis, aimed to assess the invariance of the questionnaire between groups, determining whether specific items behaved differently across groups. In this research, we focused on measuring DIF among first-year pre-service teachers in the elementary school, preschool, and mathematics education programmes.

The decision rule for the DIF analysis was based on a significant probability (p < .05) combined with a large size estimate (≥ 0.64) (Boone et al., 2014). A significant result with a large size estimate indicated the presence of DIF in an item. Conversely, a non-significant result with a small size estimate suggested no DIF, while a significant result with a small size estimate suggested negligible bias towards different groups. The DIF analysis across the three pre-service teacher groups produced non-significant results for every TPACK item (p > .05) (see Figure 4).
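The decision rule above can be summarised in a short illustrative helper; this is hypothetical code, not the authors' Winsteps output, and the example values are invented.

```python
def classify_dif(p_value: float, dif_size: float) -> str:
    """Classify Differential Item Functioning (DIF) from its probability and
    size estimate, following the criteria cited from Boone et al. (2014):
    significant p < .05 together with a large size estimate (>= 0.64)."""
    significant = p_value < 0.05
    large = abs(dif_size) >= 0.64
    if significant and large:
        return "DIF present"
    if significant and not large:
        return "negligible bias"
    return "no DIF"

# All TPACK items in this study returned p > .05 across the three majors,
# i.e. "no DIF". Example with hypothetical values:
print(classify_dif(p_value=0.40, dif_size=0.10))  # "no DIF"
```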

Taking into account the results of both the CFA and the Rasch analysis, the TPACK questionnaire demonstrated validity and could accurately measure the knowledge and skills of the pre-service teachers. Furthermore, the DIF analysis indicated that the questionnaire did not exhibit bias toward any specific group of pre-service teachers. Given these robust findings, the questionnaire is a suitable instrument for further assessment and evaluation of pre-service teachers, providing valuable information on their integration of technological, pedagogical, and content knowledge. These results underscore the importance and effectiveness of the TPACK framework in assessing pre-service teachers' readiness to effectively integrate technology into their teaching practices.

 

3.2. Reliability

Reliability analysis was performed to assess the consistency of the participants' responses to the questionnaire. The criterion for an acceptable reliability coefficient was r > 0.7 (Wicaksono & Korom, 2023). The analysis revealed a favourable outcome for the TPACK questionnaire, indicating that the items consistently measured pre-service teachers' TPACK (Table 3).

Table 3

The Reliability of the TPACK Questionnaire

Factors    Cronbach's Alpha   Coefficient ω   Person Reliability   Item Reliability
CK         .97                .96             .92                  .93
PCK        .99                .94             .81                  .85
PK         .99                .96             .90                  .90
TCK        .98                .94             .84                  .86
TK         .92                .95             .81                  .83
TPACK      .99                .97             .91                  .92
TPK        .98                .95             .87                  .88
Total      .97                .98             .97                  .94

 

The reliability measures for CK were notably high, with a Cronbach's alpha of .97, coefficient ω of .96, person reliability of .92, and item reliability of .93. Similarly, PCK exhibited high reliability, with a Cronbach's alpha of .99, coefficient ω of .94, person reliability of .81, and item reliability of .85. PK and TCK also demonstrated robust reliability, highlighting the stability and internal consistency of the questionnaire across these dimensions. Furthermore, the reliability measures for TK, TPACK and TPK consistently showed high values, indicating the reliability of the questionnaire in assessing teachers' technological proficiency and its integration with pedagogy and content knowledge.
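As a minimal sketch of how one of the indices in Table 3 is obtained, the function below computes Cronbach's alpha from a respondents-by-items score matrix. This is illustrative code with hypothetical data; the study's reported values were obtained with its own software.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from 6 respondents to 4 items:
demo = np.array([[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
                 [2, 3, 2, 3], [4, 4, 5, 4], [3, 4, 3, 3]])
print(round(cronbach_alpha(demo), 2))
```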

 

3.3. The Profile of Pre-service Teachers’ Technological Pedagogical Content Knowledge

Figure 5 depicts the TPACK of the pre-service teachers using a violin plot, offering a comprehensive comparison of various data points within the data set. This graphical representation, similar to a box plot, is specifically designed to showcase important statistical features, such as the symmetry of distribution, central tendency, and dispersion of data points (Potter et al., 2010). The violin plot enhances the visualisation of TPACK performance, facilitating a more detailed exploration of the dataset.

Analysis of logit values revealed notable findings regarding differences between groups for various variables in the study. Significant differences were observed in PCK, with F(2, 402) = 5.773, p < .01, indicating variations in logit values between groups. Similarly, PK showed significant differences between the groups, with F(2, 402) = 7.925, p < .001. TCK also demonstrated significance, as seen in F(2, 402) = 3.988, p < .05. Conversely, the total score, TPACK, and TPK did not show significant differences in logit values between the groups. Specifically, F(2, 402) = 0.405, p = 0.667 for Total, F(2, 402) = 1.267, p = 0.283 for TPACK, and F(2, 402) = 0.062, p = 0.940 for TPK. These findings provided insight into nuanced variations in logit values for different aspects of teacher knowledge and competencies across different groups.
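The comparisons reported above are one-way ANOVAs on the Rasch logit values. The sketch below illustrates the procedure with simulated data, using the group sizes from Table 1 and the programme-level means and standard deviations reported in the next paragraph; because the data are simulated, the resulting F value will not reproduce the study's results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated logit values for the three programmes (PSPM, PIAUD, PGMI);
# in the study these are the Rasch person measures.
pspm = rng.normal(2.18, 2.76, size=118)
piaud = rng.normal(2.28, 2.83, size=106)
pgmi = rng.normal(2.41, 2.72, size=181)

f_stat, p_value = stats.f_oneway(pspm, piaud, pgmi)
df2 = len(pspm) + len(piaud) + len(pgmi) - 3
print(f"F(2, {df2}) = {f_stat:.3f}, p = {p_value:.3f}")
```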

In addition, logit value measurements were performed for three different programmes. In the PSPM programme, the Mlogit was 2.18, with an SD of 2.76. Furthermore, the logit values in PIAUD and PGMI were Mlogit = 2.28 (SD = 2.83) and Mlogit = 2.41 (SD = 2.72), respectively.

Figure 5

The Distribution of TPACK Based on the Pre-service Teachers’ Levels

Note: (1 = PSPM; 2 = PIAUD; 3 = PGMI)

3.4. The Correlation between the TPACK Variables

The researchers performed a TPACK correlation analysis to examine the relationships between the TPACK variables (Figure 6). For elementary pre-service teachers, the coefficient of determination revealed that R-squared (R2) for PCK is .637, indicating that the independent variables explain 63.7% of the variance in PCK. Similarly, TCK has an R2 of .679, suggesting that the independent variables account for 67.9% of the variance in TCK. The general TPACK variable shows a higher R2 of .756, indicating that 75.6% of its variance is explained. Lastly, TPK has an R2 of .618, signifying that the independent variables explain 61.8% of the variance in TPK. These R2 values provide insights into the predictive power of the independent variables for each specific aspect of knowledge and skills.

Regarding preschool preservice teachers, PCK, TK, CK, TPK, PK, and TCK explained the TPACK at 78% (R2 = .780). Similarly, PCK was explained by CK and TK, accounting for 52.6% (R2 = .526) of the variance. Furthermore, TCK was explained by CK and PK, approximately 58.6% (R2 = .586). Then, TPK was explained by TK and PK, accounting for 65.9% (R2 = .659) of the variance.

For the pre-service mathematics teachers, five variables (PCK, TK, TPK, PK, and TCK) explained TPACK at 83.2% (R2 = .832). Similarly, PCK was explained by TK and CK, reaching approximately 61.5% (R2 = .615). Furthermore, CK explained TCK at 55.5% (R2 = .555), and TPK was explained by PK and TK, which account for 58.6% (R2 = .586) of the variance.
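For clarity on what these R2 values represent, the short sketch below computes a coefficient of determination by ordinary least squares on hypothetical data. The study's R2 values come from its own structural model analysis, which this simplified regression does not reproduce.

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """Ordinary least squares R2: share of the variance in y explained by X."""
    X1 = np.column_stack([np.ones(len(X)), X])        # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)     # least squares solution
    residuals = y - X1 @ beta
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Hypothetical example: predict TPACK scores from PCK, TCK and TPK scores.
rng = np.random.default_rng(1)
predictors = rng.normal(size=(100, 3))
tpack = predictors @ np.array([0.5, 0.3, 0.4]) + rng.normal(scale=0.5, size=100)
print(round(r_squared(predictors, tpack), 3))
```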

Figure 6

Correlations between the TPACK Variables

Note: (a) PGMI = Elementary School Pre-service Teachers, (b) PIAUD = Preschool Pre-service Teachers, (c) PSPM = Mathematics Pre-service Teachers

4. Discussion

This research aimed to adapt and validate a modified TPACK instrument for Indonesian pre-service teachers. Various statistical procedures, including Confirmatory Factor Analysis (CFA) and Rasch analysis, were employed to enhance the validity of the designed instrument. A 45-item questionnaire was developed to assess pre-service teachers' TPACK levels, with exploratory factor analysis revealing three variables. The high Kaiser-Meyer-Olkin (KMO) value of .97 indicated the instrument's ability to effectively distinguish the three latent factors. Rasch analysis further affirmed the effectiveness of the questionnaire. However, some items (TK5 and CKM1) showed slightly elevated infit and outfit values, which were deemed acceptable due to their positive correlation with the overall construct. Given the alignment with the TPACK framework, these items were retained as they pertained to essential knowledge, concepts, theories, and practical applications for pre-service teachers in their everyday contexts. Furthermore, this perspective recognizes the TPACK theoretical framework (Mishra & Koehler, 2006) and underscores the necessity for teachers to effectively integrate technology into their teaching across various content areas (Luik et al., 2024). In addition, the conceptual focus of these items emphasised the use of software, Internet access, and virtual laboratories, underscoring the importance of teachers' proficiency in information processing and effective communication through ICT in their instructional practices and STEM education (Malik et al., 2019; Suherman, 2018).

This research revealed the absence of Differential Item Functioning (DIF), which is crucial to ensure unbiased measurements between different groups. Previous research has highlighted the importance of evaluating DIF to maintain fairness in evaluations and interventions. When present, DIF may suggest potential biases in questionnaire items that could impact the instrument's validity and reliability. Both statistical and practical significance (effect size) in DIF analysis offers a comprehensive understanding of how biases could influence measurement results (Boone et al., 2014). The implications of identifying DIF are significant, particularly in educational evaluations and interventions, as changes to the questionnaire may be required to ensure fairness across various teacher education programmes. Furthermore, identifying specific items showing DIF can inform targeted intervention strategies or curriculum reforms aimed at meeting the particular needs of each group of pre-service teachers, thus enhancing the effectiveness of teacher training programmes (Lautenbach & Heyder, 2019).

The variance analysis conducted on the logit values for different teacher groups has yielded valuable insights into the nuanced distinctions within the elements of the TPACK framework (Castaño et al., 2015; Kimmons et al., 2015). Elementary, preschool, and mathematics pre-service teachers exhibited discernible differences in their interaction with specific items related to TPACK. These findings align with the existing literature, highlighting the diverse interpretations and reactions to PCK items across the groups (Hill et al., 2008). Such variations likely reflect the unique instructional needs or perspectives inherent to each group (König et al., 2020). Significant group-specific disparities were also observed in PK and TCK, indicating the influence of different educational environments, teaching obligations or focus areas of the respective programmes. However, aspects such as the overall TPACK score and TPK did not demonstrate significant variance between groups, suggesting a consensus in understanding and applying the broader TPACK constructs (Hall et al., 2020; Tondeur et al., 2020). These findings underscore the importance of recognising subtle yet distinct differences in how specialised pre-service teachers perceive and incorporate specific TPACK components. Customising teacher education programmes to better meet particular needs and overcome challenges specific to various instructional domains and subjects is crucial.

When evaluating the effectiveness of the three educational programmes, the analysis focused on comparing logit values, indicative of each programme's ability to enhance certain competencies. It was observed that pre-service elementary school teachers tended to outperform the preschool and mathematics pre-service teachers in terms of mean logit values. Several factors may contribute to this performance distinction. First, the curriculum and training provided to elementary school pre-service teachers may be more comprehensive, leading to a stronger foundation in the assessed areas. This may reflect alignment of the curriculum with assessment objectives or a more adept implementation of instructional strategies that resonate with the measured competencies. Second, the nature of elementary education can offer broader exposure to diverse teaching contexts and content areas, equipping pre-service teachers with a more versatile skill set. The findings resonate with previous research highlighting the critical role of curriculum coherence and instructional alignment in promoting TPACK among educators (Koh, 2019; Mishra & Koehler, 2006). Addressing disparities in TPACK development through targeted professional development and curriculum enhancements is crucial for preparing educators to meet the evolving demands of digital-age learning environments.

In contrast, the specialisation required for preschool and mathematics pre-service teachers might limit their exposure and, subsequently, their performance in the broad-based competencies assessed by logit values. Additionally, it is plausible that elementary school pre-service teachers' training programmes place greater emphasis on specific skills and knowledge areas evaluated in the study, directly influencing outcomes. Alternatively, assessment instruments may inherently favour competencies developed in elementary school pre-service teachers, contributing to observed performance discrepancies.

 

5. Limitations and future research

Although this study offers valuable insights into TPACK development among pre-service teachers in Indonesia, several limitations should be acknowledged. First, the sample, drawn exclusively from Indonesian public and private universities, may restrict the generalisability of the findings to other global contexts. Furthermore, despite efforts to ensure demographic diversity, the potential bias inherent in self-reported data and responses on the Likert scale could have influenced the precision of the TPACK proficiency assessments. Additionally, although rigorous validation procedures were applied, certain items exhibited slightly elevated fit statistics in the Rasch analysis, potentially affecting the instrument's reliability in specific contexts. Furthermore, the cross-sectional design limits causal interpretations and longitudinal insights into TPACK development over time or across different educational stages.

Moving forward, future research should consider longitudinal studies to track TPACK development among pre-service teachers across multiple years and educational stages. Comparative studies across different countries or educational systems could provide information on cultural and contextual influences on technology integration in education. Qualitative research methods could complement quantitative findings by exploring pre-service teachers' perceptions and strategies related to TPACK development in depth. Intervention studies are also needed to evaluate the effectiveness of targeted professional development or curriculum enhancements in strengthening TPACK competencies. Furthermore, examining differential item functioning across diverse demographic groups and validating the TPACK instrument in diverse educational settings would improve its reliability and applicability worldwide.

 

6. Conclusions

In conclusion, this study has meticulously examined the validity and reliability of a modified TPACK instrument among Indonesian pre-service teachers across diverse educational programmes. Through rigorous CFA and Rasch analyses, the instrument has demonstrated robust psychometric properties, confirming its suitability for assessing the integration of technology, pedagogy, and content knowledge. The findings of the CFA underscored the strong model fit of the instrument and the alignment of questionnaire items with the underlying dimensions of TPACK, as evidenced by satisfactory factor loadings across seven latent variables. Additionally, Rasch analysis provided further validation by indicating effective item measurement without significant DIF across different pre-service teacher groups, ensuring unbiased assessments. Reliability analysis consistently showed high internal consistency across all TPACK dimensions, reflecting the instrument's stability in evaluating pre-service teachers' technological competencies. The study also revealed nuanced differences in TPACK proficiency among pre-service teachers specialising in elementary, preschool, and mathematics education, highlighting specific strengths and areas for improvement within each group. The correlation and variance analyses elucidated strong relationships between the TPACK variables and identified key factors influencing the development of TPACK in educational programmes. This research lays a solid groundwork for future research to validate TPACK's empirical attributes across diverse educational levels and contexts. Furthermore, the developed questionnaire holds promise for research that investigates the factors influencing technological pedagogical content knowledge.

 

Authors’ Contribution

Komarudin Komarudin: Writing – Original Draft, Supervision, Funding acquisition; Suherman Suherman: Writing – Review & Editing, Conceptualization, Writing – Original Draft, Formal analysis, Methodology, and Visualization.

 

Funding Agency

The reported study was funded by the Research and Community Service Department (LP2M) of Universitas Islam Negeri Raden Fatah Palembang, Indonesia.

 

References

Abedi, E. A., Prestridge, S., & Hodge, S. (2024). Teachers’ beliefs about technology integration in Ghana: A qualitative study of teachers’, headteachers’ and education officials’ perceptions. Education and Information Technologies, 29(5), 5857–5877. https://doi.org/10.1007/s10639-023-12049-0

Almazroa, H., & Alotaibi, W. (2023). Teaching 21st century skills: Understanding the depth and width of the challenges to shape proactive teacher education programmes. Sustainability, 15(9), 7365. https://doi.org/10.3390/su15097365

Ardiç, M. A., & Isleyen, T. (2017). High School Mathematics Teachers’ Levels of Achieving Technology Integration and In-Class Reflections: The Case of Mathematica. Universal Journal of Educational Research, 5(n12B), 1–17. https://doi.org/10.13189/ujer.2017.051401

Baran, E., Canbazoglu Bilici, S., Albayrak Sari, A., & Tondeur, J. (2019). Investigating the impact of teacher education strategies on preservice teachers’ TPACK. British Journal of Educational Technology, 50(1), 357–370. https://doi.org/10.1111/bjet.12565

Bolyard, J., Curtis, R., & Cairns, D. (2024). Learning to Struggle: Supporting Middle-grade Teachers’ Understanding of Productive Struggle in STEM Teaching and Learning. Canadian Journal of Science, Mathematics and Technology Education, 1–16. https://doi.org/10.1007/s42330-023-00302-0

Boone, W. J., Staver, J. R., & Yale, M. S. (2014). Rasch analysis in the human sciences. Springer. https://doi.org/10.1007/978-94-007-6857-4

Bray, A., & Tangney, B. (2017). Technology usage in mathematics education research–A systematic review of recent trends. Computers & Education, 114, 255–273. https://doi.org/10.1016/j.compedu.2017.07.004

Castaño, R., Poy, R., Tomşa, R., Flores, N., & Jenaro, C. (2015). Pre-service teachers’ performance from teachers’ perspective and vice versa: Behaviors, attitudes and other associated variables. Teachers and Teaching, 21(7), 894–907. https://doi.org/10.1080/13540602.2014.995487

Elas, N., Majid, F., & Narasuman, S. (2019). Development of technological pedagogical content knowledge (TPACK) for english teachers: The validity and reliability. International Journal of Emerging Technologies in Learning (IJET), 14(20), 18–33. https://doi.org/10.3991/ijet.v14i20.11456

Elmaadaway, M. A. N., & Abouelenein, Y. A. M. (2023). In-service teachers’ TPACK development through an adaptive e-learning environment (ALE). Education and Information Technologies, 28(7), 8273–8298. https://doi.org/10.1007/s10639-022-11477-8

Flores-Castro, E., Campos-Nava, M., Ramirez-Diaz, M. H., & Moreno-Ramos, J. (2024). The Construction of Knowledge for the Teaching of Sciences: A Reflection Seen From the Pedagogical Content Knowledge (PCK). Kurdish Studies, 12(1), 3536–3555. https://doi.org/10.58262/ks.v12i1.251

Fütterer, T., Scherer, R., Scheiter, K., Stürmer, K., & Lachner, A. (2023). Will, skills, or conscientiousness: What predicts teachers’ intentions to participate in technology-related professional development? Computers & Education, 198, 104756. https://doi.org/10.1016/j.compedu.2023.104756

Guillén-Gámez, F. D., Gómez-García, M., & Ruiz-Palmero, J. (2024). Digital competence in research work: Predictors that have an impact on it according to the type of university and gender of the Higher Education teacher:[Digital competence in research work: predictors that have an impact on it according to the type of university and gender of the Higher Education teacher]. Pixel-Bit. Revista de Medios y Educación, 69, 7–34. https://doi.org/10.12795/pixelbit.99992

Hall, J. A., Lei, J., & Wang, Q. (2020). The first principles of instruction: An examination of their impact on preservice teachers’ TPACK. Educational Technology Research and Development, 68, 3115–3142. https://doi.org/10.1007/s11423-020-09866-2

Haşlaman, T., Atman Uslu, N., & Mumcu, F. (2024). Development and in-depth investigation of pre-service teachers’ digital competencies based on DigCompEdu: A case study. Quality & Quantity, 58(1), 961–986. https://doi.org/10.1007/s11135-023-01674-z

Hill, H. C., Ball, D. L., & Schilling, S. G. (2008). Unpacking pedagogical content knowledge: Conceptualizing and measuring teachers’ topic-specific knowledge of students. Journal for Research in Mathematics Education, 39(4), 372–400. https://doi.org/10.5951/jresematheduc.39.4.0372

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118

Irmak, M., & Yilmaz Tüzün, Ö. (2019). Investigating pre-service science teachers’ perceived technological pedagogical content knowledge (TPACK) regarding genetics. Research in Science & Technological Education, 37(2), 127–146. https://doi.org/10.1080/02635143.2018.1466778

Kimmons, R., Miller, B. G., Amador, J., Desjardins, C. D., & Hall, C. (2015). Technology integration coursework and finding meaning in pre-service teachers’ reflective practice. Educational Technology Research and Development, 63(6), 809–829. https://doi.org/10.1007/s11423-015-9394-5

Kind, V. (2009). Pedagogical content knowledge in science education: Perspectives and potential for progress. Studies in Science Education, 45(2), 169–204. https://doi.org/10.1080/03057260903142285

Koh, J. H. L. (2019). TPACK design scaffolds for supporting teacher pedagogical change. Educational Technology Research and Development, 67(3), 577–595. https://doi.org/10.1007/s11423-018-9627-5

Komarudin, K., Sari, E., Rinaldi, A., Laksono, P. J., Anggara, B., Ali, M., Sholeh, M., & Wigati, I. (2024). STEM-based digital pocket book: The design and implementation of students’ mathematical communication ability. AIP Conference Proceedings, 3058(1). https://doi.org/10.1063/5.0200927

König, J., Bremerich-Vos, A., Buchholtz, C., & Glutsch, N. (2020). General pedagogical knowledge, pedagogical adaptivity in written lesson plans, and instructional practice among preservice teachers. Journal of Curriculum Studies, 52(6), 800–822. https://doi.org/10.1080/00220272.2020.1752804

Lautenbach, F., & Heyder, A. (2019). Changing attitudes to inclusion in preservice teacher education: A systematic review. Educational Research, 61(2), 231–253. https://doi.org/10.1080/00131881.2019.1596035

Lawless, K. A., & Pellegrino, J. W. (2007). Professional development in integrating technology into teaching and learning: Knowns, unknowns, and ways to pursue better questions and answers. Review of Educational Research, 77(4), 575–614. https://doi.org/10.3102/003465430730992

Li, M., Noori, A. Q., & Li, Y. (2023). Development and validation of the secondary mathematics teachers’ TPACK scale: A study in the Chinese context. Eurasia Journal of Mathematics, Science and Technology Education, 19(11), em2350. https://doi.org/10.29333/ejmste/13671

Luik, P., Taimalu, M., Naruskov, K., & Kalk, K. (2024). Does knowledge according to the TPACK framework have an impact on student teachers’ beliefs? A path analysis. Education and Information Technologies, 1–22. https://doi.org/10.1007/s10639-024-12767-z

Malik, S., Rohendi, D., & Widiaty, I. (2019). Technological pedagogical content knowledge (TPACK) with information and communication technology (ICT) integration: A literature review. 5th UPI International Conference on Technical and Vocational Education and Training (ICTVET 2018), 498–503. https://doi.org/10.2991/ictvet-18.2019.114

Martin, D. A., Carey, M. D., McMaster, N., & Clarkin, M. (2024). Assessing primary school preservice teachers’ confidence to apply their TPACK in specific categories of technologies using a self-audit survey. The Australian Educational Researcher. https://doi.org/10.1007/s13384-023-00669-x

Maskur, R., Suherman, S., Andari, T., Sri Anggoro, B., Muhammad, R. R., & Untari, E. (2022). The Comparison of STEM approach and SSCS learning model for secondary school-based on K-13 curriculum: The impact on creative and critical thinking ability. Revista de Educación a Distancia, 22(70), 1–26. https://doi.org/10.6018/red.507701

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x

Nordin, H., Davis, N., & Ariffin, T. F. T. (2013). A case study of secondary pre-service teachers’ technological pedagogical and content knowledge mastery level. Procedia-Social and Behavioral Sciences, 103, 1–9. https://doi.org/10.1016/j.sbspro.2013.10.300

Ong, Q. K. L., & Annamalai, N. (2024). Technological pedagogical content knowledge for twenty-first century learning skills: The game changer for teachers of industrial revolution 5.0. Education and Information Technologies, 29(2), 1939–1980. https://doi.org/10.1007/s10639-023-11852-z

Park, H., & Scanlon, D. (2024). General educators’ perceptions of struggling learners in an inaugural project-based learning Capstone. International Journal of Inclusive Education, 1–29. https://doi.org/10.1080/13603116.2024.2310673

Potter, K., Kniss, J., Riesenfeld, R., & Johnson, C. R. (2010). Visualizing Summary Statistics and Uncertainty. Computer Graphics Forum, 29(3), 823–832. https://doi.org/10.1111/j.1467-8659.2009.01677.x

Redecker, C., & Punie, Y. (2017). Digital competence of educators. Edited by Yves Punie. https://doi.org/10.2760/159770

Reyes Jr, V. C., Reading, C., Doyle, H., & Gregory, S. (2017). Integrating ICT into teacher education programs from a TPACK perspective: Exploring perceptions of university lecturers. Computers & Education, 115, 1–19. https://doi.org/10.1016/j.compedu.2017.07.009

Roussinos, D., & Jimoyiannis, A. (2019). Examining primary education teachers’ perceptions of TPACK and the related educational context factors. Journal of Research on Technology in Education, 51(4), 377–397. https://doi.org/10.1080/15391523.2019.1666323

Saubern, R., Urbach, D., Koehler, M., & Phillips, M. (2020). Describing increasing proficiency in teachers’ knowledge of the effective use of digital technology. Computers & Education, 147, 103784. https://doi.org/10.1016/j.compedu.2019.103784

Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123–149. https://doi.org/10.1080/15391523.2009.10782544

Smith, P. G., & Zelkowski, J. (2023). Validating a TPACK instrument for 7–12 mathematics in-service middle and high school teachers in the United States. Journal of Research on Technology in Education, 55(5), 858–876. https://doi.org/10.1080/15391523.2022.2048145

Sofyan, S., Habibi, A., Sofwan, M., Yaakob, M. F. M., Alqahtani, T. M., Jamila, A., & Wijaya, T. T. (2023). TPACK–UotI: The validation of an assessment instrument for elementary school teachers. Humanities and Social Sciences Communications, 10(1), 1–7. https://doi.org/10.1057/s41599-023-01533-0

Stoilescu, D. (2015). A critical examination of the technological pedagogical content knowledge framework: Secondary school mathematics teachers integrating technology. Journal of Educational Computing Research, 52(4), 514–547. https://doi.org/10.1177/0735633115572285

Suherman, S. (2018). Ethnomathematics: Eksploration of Traditional Crafts Tapis Lampung as Ilustration of Science, Technology, Engineering, and Mathematics (STEM). Eduma: Mathematics Education Learning and Teaching, 7(2), 21–30. https://doi.org/10.24235/eduma.v7i2.3085

Suherman, S., & Vidákovich, T. (2024). Relationship between ethnic identity, attitude, and mathematical creative thinking among secondary school students. Thinking Skills and Creativity, 51, 101448. https://doi.org/10.1016/j.tsc.2023.101448

Supriadi, N., Jamaluddin Z, W., & Suherman, S. (2024). The role of learning anxiety and mathematical reasoning as predictor of promoting learning motivation: The mediating role of mathematical problem solving. Thinking Skills and Creativity, 52, 101497. https://doi.org/10.1016/j.tsc.2024.101497

Tondeur, J., Scherer, R., Siddiq, F., & Baran, E. (2020). Enhancing pre-service teachers’ technological pedagogical content knowledge (TPACK): A mixed-method study. Educational Technology Research and Development, 68, 319–343. https://doi.org/10.1007/s11423-019-09692-1

Wicaksono, A. G. C., & Korom, E. (2023). Attitudes towards science in higher education: Validation of questionnaire among science teacher candidates and engineering students in Indonesia. Heliyon, 9(9). https://doi.org/10.1016/j.heliyon.2023.e20023

Zelkowski, J., Gleason, J., Cox, D. C., & Bismarck, S. (2013). Developing and validating a reliable TPACK instrument for secondary mathematics preservice teachers. Journal of Research on Technology in Education, 46(2), 173–206. https://doi.org/10.1080/15391523.2013.10782618