Analysis of Digital Competence in Elementary School teachers according to their socio-demographic factors and experience

 

 

 

Análisis de la competencia digital en profesores de educación primaria en relación con los factores de género, edad y experiencia

 

 

 

 D. Isaac González-Medina. Doctorando. Universidad de Jaén. España

 Dr. Eufrasio Pérez-Navío. Profesor Titular de Universidad. Universidad de Jaén. España

 Dr. Óscar Gavín-Chocano. Profesor Ayudante Doctor. Universidad de Jaén. España

 

 

 

 

 

 

 

 

 

Received: 2023/02/21 Revised: 2024/02/28 Accepted: 2024/07/27 Online First: 2024/08/24 Published: 2024/09/01

 

 

How to cite this article:

González-Medina, I., Pérez-Navío, E., & Gavín Chocano, Óscar. (2024). Análisis de la competencia digital en profesores de educación primaria en relación con los factores de género, edad y experiencia [Analysis of Digital Competence in Elementary School teachers according to their socio-demographic factors and experience]. Pixel-Bit. Revista De Medios Y Educación, 71, 179–201. https://doi.org/10.12795/pixelbit.107277

 

 

 

ABSTRACT

The digital competence of teachers has become crucial in transforming them into effective designers of instructional processes tailored to the needs of their students. However, this competence varies among teachers, with gender, age, and years of experience as variables to consider. In this regard, the aim of this study was to examine the level of digital competence among elementary school teachers, considering sociodemographic variables and years of experience. Additionally, teachers' perceived competence level was analyzed and contrasted after they had reflected on the different dimensions comprising digital competence. To this end, the DigCompEdu Check-in questionnaire was administered to 750 elementary school teachers. The results indicated that men tend to score higher in the dimensions of digital teaching competence. According to age, teachers excelled in different dimensions within each established range, and the perception of their digital competence was higher in the pretest. The practical implications derived from the study underscore the importance of professionalizing teachers through the promotion of their digital competence.

 

 

 

RESUMEN

La competencia digital docente se ha vuelto crucial para transformar a los profesores en diseñadores eficaces de procesos instruccionales adaptados a las necesidades de su alumnado. Sin embargo, esta competencia no es uniforme entre el profesorado, con las variables género, edad y años de experiencia como aspectos a considerar. Al respecto, el objetivo de este estudio fue examinar el nivel de competencia digital de profesores de enseñanza básica, según las variables género, edad y sus años de experiencia. Asimismo, se buscó analizar el nivel competencial percibido de los docentes y su contraste una vez reflexionado sobre las diferentes dimensiones que componen la competencia digital. Para ello, se administró el cuestionario DigCompEdu Check-in a 750 profesores de enseñanza básica. Los resultados apuntaron a que los hombres tienden a puntuar más alto en las dimensiones que componen la competencia digital docente. De acuerdo a la edad, los profesores destacaban en diferentes dimensiones en cada uno de los rangos establecidos y la percepción sobre su competencia digital fue superior en el pretest. Las implicaciones prácticas derivadas del estudio apuntan a la importancia de profesionalizar a los docentes a través del fomento de su competencia digital.

 

 

 

 

 

 

 

 

 

 

 

 

PALABRAS CLAVES· KEYWORDS

Digital Teaching Competence; teachers; teaching experience; gender; DigCompEdu.

Competencia Digital Docente; profesores; experiencia docente; género; DigCompEdu.

 

 

 

 

 

 

 

1. Introduction

Digital competence is one of the most sought-after qualities by educators, especially following the COVID-19 pandemic (Montenegro et al., 2020). Among other issues, this situation led to a considerable increase in the digital divide, resulting in greater digital exclusion for the most vulnerable sectors and territories. This, in turn, compounded the social divide, creating a barrier to accessing education that is both equitable and offers equal opportunities (UNICEF, 2020).

Focusing on the analysis of this competence, it is evident that it is a complex task, as it encompasses a wide range of nuances that vary depending on the individual. In this regard, the lack of a common reference framework makes it difficult to establish a starting point for designing policies, strategies, and actions (González-Rodríguez & Urbina-Ramírez, 2020).

In the literature, various authors have characterised digital competence as a body of knowledge about computers and the internet (González-Rodríguez & Urbina-Ramírez, 2020). However, from a regulatory perspective, organisations such as the European Union and the OECD have made progress in defining it. For instance, Recommendation 2006/962/EC, cited by the Council of the European Union (2018), defines it as follows:

 

"Digital competence involves the safe and critical use of Information Society Technologies (IST) for work, leisure, and communication. It is based on the basic ICT skills: the use of computers to retrieve, evaluate, store, produce, present, and exchange information, and to communicate and participate in collaborative networks via the internet" (p. 15).

 

In this context, the proliferation of technological advancements and the emergence of new needs support discussions aimed at enhancing digital competence from a more educational perspective. In this vein, digital competence, which encompasses the ability to use technology in various life contexts such as learning or working, is considered a crucial and fundamental aspect of all educational programmes. Therefore, the development of digital competence among both students and educators should be a primary objective in any educational institution, with this competence being addressed not only in isolation but also integrated transversally across all educational areas (Cabero-Almenara & Palacios-Rodríguez, 2019).

In facing this challenge, Montenegro et al. (2020) highlight the crucial role of educators in ensuring students' right to a quality basic education. This is because the decisions educators make regarding the use of ICT in teaching and learning processes are influenced by their own perceptions of these resources, such as the perceived usefulness of technological resources, their effectiveness (Instefjord & Munthe, 2017), ease of integration and use in the classroom, availability, or access.

To achieve Teacher Digital Competence, institutional bodies have proposed a variety of competence frameworks in which educators need to be trained (Cabero-Almenara et al., 2020).

Furthermore, it is also important to highlight the DigCompEdu model, which provides guiding parameters for assessing Teacher Digital Competence (TDC), based on the expert competence coefficient (Cabero-Almenara et al., 2020).

The DigCompEdu model was published by the European Commission's Joint Research Centre (JRC) at the end of 2017 (Redecker & Punie, 2017), with the aim of encouraging member states to promote teacher digital competence and introduce educational innovations in instructional processes at an international level (Ghomi & Redecker, 2018).

According to Cabero-Almenara et al. (2020), this model aims to support institutions' efforts to foster TDC by providing a common language, code, and logic for everyone. Among the objectives of this model are: to establish a common model for the development of TDC; to implement a solid foundation that serves as a guide in educational policies; to serve as a template for developing a specific evaluative instrument; to generate a common language and logic for all states; and to create a reference for demonstrating the importance of digital technology.

On the other hand, DigCompEdu is a model of digital competence with six distinct areas of competence (Figure 1). Each area encompasses a series of competencies that cover a broad range of effective and inclusive strategies requiring the use of digital tools (Redecker & Punie, 2017).

 

Figure 1

Areas of DigCompEdu. Extracted from Digital Competence Framework for Educators (DigCompEdu), (2021)

As illustrated in the previous figure, Area 1 refers to professional teaching competencies; Areas 2, 3, 4, and 5 are related to the pedagogical core, i.e., teaching and learning processes; and Area 6 pertains to the competencies that students need to develop. Specifically, the main characteristics of each of these areas are (Cabero-Almenara et al., 2020; Ghomi & Redecker, 2018):

Area 1: Professional Commitment. This area focuses on how educators use digital technologies to enhance their professional practice and collaborate with others in the educational environment. It includes the use of digital tools to share resources, participate in professional networks, and manage administrative tasks.

Area 2: Digital Content/Resources. This refers to the skills needed to create, manage, and share digital educational resources. Educators must be able to design and adapt digital materials that are effective and safe for classroom use.

Area 3: Teaching and Learning/Digital Pedagogy. This area covers the integration of digital technologies into teaching. It involves using digital tools to plan and conduct educational activities, facilitating interactive learning that is tailored to students' needs.

Area 4: Assessment and Feedback. This concerns the use of digital technologies to conduct assessments and provide feedback to students. Educators should use digital tools to evaluate students' progress and offer comments that support their learning.

Area 5: Empowering Students. This area focuses on how educators can use digital technologies to enable students to be more autonomous in their learning. It includes providing access to digital tools that foster collaboration and self-regulation of learning.

Area 6: Developing Students' Digital Competence. This refers to the strategies educators use to teach students essential digital skills. It involves designing activities that help students develop the basic digital competencies necessary for their education and future careers.

Each of the aforementioned areas is associated with a set of competencies. In total, this model comprises 22 competencies across the 6 areas (Redecker & Punie, 2017).

Thus, the DigCompEdu model for self-assessment and self-reflection is one of the most significant and relevant proposals today. This model is incorporated into both regional programmes and national and international projects, and even in the European Skills Agenda (INTEF, 2017). For this reason, the model should be used in all educational institutions to assess teachers' digital competence and adapt teaching and learning processes to the significant developments in technology.

Despite efforts to train educators, numerous studies have identified training deficiencies that limit the full integration of ICT into teaching. Ekberg and Gao (2018) highlight that many technology training programmes for teachers often lack practical components that allow educators to apply digital tools effectively in the classroom. Fernández-Batanero et al. (2020) add that the lack of time and resources also contributes to limited ICT integration, while López and Vázquez (2019) argue that existing training often does not adequately address the specific needs of different educational contexts, which limits the applicability of ICT in daily practice. Additionally, authors like Álvarez et al. (2021) emphasise that resistance to change and lack of confidence in using emerging technologies are also significant barriers to effective ICT integration in teaching.

Factors such as age and gender notably influence teachers' digital competence. Research by Jiménez-Hernández et al. (2020) suggests that men tend to have more developed digital competence compared to women, which may be related to differences in access to technology and training opportunities from an early age. Conversely, digital competence tends to decrease with age, a finding supported by studies such as that of Pardo et al. (2019), which observes that older teachers face greater challenges in adopting new technologies due to less experience with digital tools and lower familiarity with emerging technologies.

In contrast, recent studies have started to address these issues with a more nuanced approach. For example, a comparative analysis of gender studies in digital competence shows that while some previous findings suggest a significant gap between men and women, others indicate that the gap is narrowing as training opportunities and access to digital technologies increase (Smith & Johnson, 2022). This is due to increased training opportunities in technology and more equitable access to digital resources, as noted by Torres and López (2023). Regarding age, recent research has revealed that, although younger teachers generally have higher digital competence, older teachers who receive ongoing training show significant improvements in their digital skills (Lopez et al., 2023).

These findings suggest that, while significant differences in digital competence by gender and age persist, formative interventions and institutional support can help mitigate these gaps. In this context, the present study focuses on analysing the scores obtained in each established variable, determining the existence of statistically significant correlations, and exploring significant differences between dimensions and variables of gender, age, and teaching experience. Additionally, it will assess teachers' perceptions of their digital competence level through a pretest-posttest approach.

However, these issues are not yet conclusive and will be analysed in the present work. This research aims to contribute to a deeper understanding of the dynamics of teacher digital competence by providing a comparative analysis that reflects both advances and areas that still require attention to ensure equitable and effective ICT integration in education. Accordingly, in line with previous theoretical frameworks, the general objectives considered in this research are: (a) To analyse the scores obtained in each of the established variables and determine the existence of statistically significant correlations; (b) To establish the existence of significant differences between the established dimensions and the variables of gender, age, and teaching experience; (c) To understand teachers' perceptions of their digital competence level (pretest-posttest).

 

2. Methodology

2.1. Participants

The sample comprises 750 primary education teachers, with 297 (39.6%) from Early Childhood Education and 453 (60.4%) from Primary Education. An incidental non-probabilistic sampling method was used for selection. The distribution of participants by gender is as follows: 449 are female (59.87%) and 301 are male (40.13%). The age range is between 24 and 70 years, with a mean age of 31.52 years (±1.030).

An analysis was conducted contrasting different variables, with particular attention given to teachers' years of experience, use of ICT as an educational tool, and use of ICT in the classroom.

 

Table 1

Teaching Experience and Use of ICT

Teaching experience        F      %
1 to 5 years             186   24.8
6 to 10 years            114   15.2
11 to 15 years            98   13.1
16 to 20 years           130   17.3
More than 20 years       222   29.6

Use of ICT as an educational tool    F      %
0 years                             82   10.9
1 to 3 years                       240   32.0
4 to 6 years                       236   31.5
7 to 10 years                       74    9.9
11 to 15 years                      16    2.1
16 to 20 years                      54    7.2
More than 20 years                  48    6.4

Use of ICT in the classroom    F      %
0 to 10%                     174   23.2
11 to 25%                     80   10.7
26 to 50%                    240   32.0
51 to 75%                    188   25.1
76 to 100%                    68    9.1

 

2.2. Instruments

To collect information, the analysis tool “DigCompEdu Check-in” was employed, as used in various studies (Cabero-Almenara et al., 2020) and validated by Ghomi and Redecker (2018) as a tool for analysing the European Framework for Digital Competence for Educators, DigCompEdu. The questionnaire comprised six competence areas: Professional Commitment; Digital Resources; Digital Pedagogy; Evaluation and Feedback; Empowering Students; and Facilitating Students’ Digital Competence. The first area (Professional Commitment) was aimed at evaluating professional teaching competencies, while the others were related to students' digital competencies, resulting in a 22-item questionnaire. The final version of the questionnaire achieved a reliability of Cronbach's α .960 and McDonald's ω .964.
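As an illustrative note on how this type of reliability coefficient can be reproduced, a minimal Python sketch is given below; the item matrix (`responses`) and its simulated values are hypothetical stand-ins for the 22 DigCompEdu Check-in items, not the study data.

    import numpy as np

    def cronbach_alpha(items):
        """Classical alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)        # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
        return k / (k - 1) * (1 - item_variances.sum() / total_variance)

    # Simulated 750 x 22 item matrix (one common factor plus noise), for illustration only
    rng = np.random.default_rng(42)
    responses = rng.normal(size=(750, 1)) + rng.normal(scale=0.8, size=(750, 22))
    print(round(cronbach_alpha(responses), 3))

McDonald's ω, by contrast, is computed from the factor loadings of a measurement model, which is why it is normally obtained from the CFA output rather than from raw item variances.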

 

2.3. Procedure

For the development of the research and data collection, ethical guidelines promoted by national and international regulations for research involving human subjects were followed. All data were handled in accordance with Regulation (EU) 2016/679 of the European Parliament and the Council of April 27, 2016, on the protection of personal data, as well as Organic Law 3/2018 of December 5, on the guarantee of digital rights. Participants were assured that their responses would remain anonymous and confidential, and that all information provided would be used solely for scientific purposes. The instrument was administered individually via Google Forms. The pre-test evaluation was conducted at the beginning of the questionnaire to understand participants' self-perception of their digital competence level (the first question of the questionnaire assessed their self-evaluation of this competence), while the post-test evaluation was conducted after completing the questionnaire (the last question of the questionnaire) to reassess the same variable after familiarising them with the foundational content. The researchers explained the purpose of the study and the guidelines for its proper completion, requesting voluntary participation from the teachers. Data were collected and quality checked, ensuring that the process adhered to the ethical research principles defined in the Helsinki Declaration (World Medical Association, 2013).

 

 

2.4. Data Analysis

The Hot-Deck multiple imputation method was first applied to reduce bias while preserving joint and marginal distributions (Lorenzo-Seva & Van-Ginkel, 2016). A preliminary analysis of validity, reliability (Cronbach's alpha and Omega coefficient), and internal consistency of each instrument was then conducted through Confirmatory Factor Analysis (CFA) to verify the psychometric properties of the questionnaire and obtain the factor loadings for each item. Normality was assessed through multivariate hypothesis testing (each marginal variable must meet univariate normality criteria for the joint distribution to be multivariate normal, but not vice versa), resulting in a non-normal distribution. Analyses were performed using SPSS AMOS 25 and jamovi software, Version 1.2 (The jamovi Project, 2020). Descriptive statistics (means and standard deviations) were obtained, and correlations between scores on each dimension were analysed. Subsequently, mean differences were assessed based on sex using the Mann-Whitney U test, and based on age and experience with digital technology using the Kruskal-Wallis H test. A comparison between pre-test and post-test scores was conducted using the Wilcoxon test. Additionally, effect sizes for the analyses were reported. A confidence level of 95% (significance p<.05) was used for all statistical tests.
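As a schematic illustration of this non-parametric workflow, the sketch below shows how the same family of tests can be run with SciPy; the DataFrame, its column names, and the simulated values are assumptions made for the example and do not reproduce the study data or the software actually used (SPSS AMOS and jamovi).

    import numpy as np
    import pandas as pd
    from scipy import stats

    # Hypothetical data frame standing in for the study data (illustrative only)
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "gender": rng.choice(["male", "female"], size=750),
        "age_group": rng.choice(["20-30", "31-40", "41-50", "51-60", "61-70"], size=750),
        "digital_pedagogy": rng.normal(3.0, 1.0, size=750),
        "pre": rng.integers(1, 7, size=750),    # self-assessed level before the questionnaire
        "post": rng.integers(1, 7, size=750),   # self-assessed level after the questionnaire
    })

    # Gender differences on one dimension (Mann-Whitney U, as in Table 4)
    men = df.loc[df["gender"] == "male", "digital_pedagogy"]
    women = df.loc[df["gender"] == "female", "digital_pedagogy"]
    u_stat, p_gender = stats.mannwhitneyu(men, women, alternative="two-sided")

    # Differences across age groups (Kruskal-Wallis H, as in Table 6)
    groups = [g["digital_pedagogy"].values for _, g in df.groupby("age_group")]
    h_stat, p_age = stats.kruskal(*groups)

    # Paired pre-test vs post-test self-assessment (Wilcoxon signed-rank, as in Table 5)
    w_stat, p_prepost = stats.wilcoxon(df["pre"], df["post"])

    print(p_gender, p_age, p_prepost)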

3. Analysis and results

To assess the skewness and kurtosis of the observed variables, Mardia’s multivariate test was conducted, indicating that the data did not follow a normal distribution. Assumptions of multicollinearity, homogeneity, and homoscedasticity were then analysed to ensure that the distribution met the criteria for variable dependence. Based on the data obtained for each variable (Table 2), Confirmatory Factor Analysis (CFA) was performed to verify the validity and internal structure of each item.

 

Table 2

Factor loadings

Latent factor                                  Indicator    α      ω     Estimate    SE      Z       p       β     AVE    CR
Professional Commitment                        CP1        .959   .963     .703     .0276   25.46   <.001   .806   .556   .831
                                               CP2        .960   .964     .584     .0271   21.56   <.001   .717
                                               CP3        .958   .962    1.050     .0388   27.10   <.001   .844
                                               CP4        .962   .964     .865     .0513   16.85   <.001   .592
Digital Resources                              RD1        .958   .962     .743     .0292   25.42   <.001   .795   .565   .762
                                               RD2        .958   .962    1.054     .0423   24.90   <.001   .790
                                               RD3        .963   .966     .318     .0431    7.38   <.001   .285
Digital Pedagogy                               PD1        .957   .961    1.058     .0378   27.96   <.001   .834   .748   .922
                                               PD2        .956   .960    1.270     .0396   32.07   <.001   .907
                                               PD3        .958   .961     .767     .0281   27.29   <.001   .820
                                               PD4        .957   .960    1.157     .0362   31.97   <.001   .906
Evaluation and Comments                        PR1        .958   .961     .829     .0276   30.07   <.001   .885   .564   .780
                                               PR2        .961   .964     .585     .0440   13.28   <.001   .475
                                               PR3        .958   .961     .864     .0315   27.45   <.001   .836
Empower Students                               EE1        .959   .963     .800     .0435   18.41   <.001   .622   .636   .838
                                               EE2        .959   .962     .932     .0437   21.33   <.001   .695
                                               EE3        .959   .962     .703     .0357   19.70   <.001   .657
Facilitate the Digital Competence of Students  CDE1       .958   .962     .993     .0364   27.27   <.001   .823   .757   .940
                                               CDE2       .958   .962    1.168     .0391   29.87   <.001   .873
                                               CDE3       .958   .962    1.210     .0439   27.57   <.001   .830
                                               CDE4       .959   .962    1.004     .0380   26.41   <.001   .807
                                               CDE5       .957   .961    1.148     .0363   31.61   <.001   .900

Note: SE: Standard Error; Z: Z-value in the estimation; p: p-value of Z estimation; β: Standardised Estimate; AVE: Average Variance Extracted; CR: Composite Reliability
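As a hedged check on the convergent-validity columns, the usual Fornell-Larcker formulas, applied here to the standardised loadings of the Professional Commitment indicators, give values consistent with the AVE and CR reported in Table 2 (this assumes the CR column is composite reliability rather than a critical ratio, and that these formulas match the software's computation).

    import numpy as np

    # Standardised loadings (β) of the Professional Commitment indicators CP1-CP4 from Table 2
    loadings = np.array([0.806, 0.717, 0.844, 0.592])

    ave = np.mean(loadings ** 2)  # average variance extracted = mean squared loading
    cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + np.sum(1 - loadings ** 2))  # composite reliability

    print(round(ave, 3), round(cr, 3))  # ≈ .557 and .832, in line with the .556 and .831 reported for this factor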

To analyse each of the observed variables across all dimensions of the model (see Table 3), the correlation matrix (Spearman's Rho) was developed along with descriptive statistics (means and standard deviations) and reliability of the scores (Cronbach's alpha and Omega coefficient). The highest correlations were found between Digital Pedagogy and Facilitate the Digital Competence of Students [r(750)=.86; p<.01]; Empower Students and Evaluation and Feedback [r(750)=.85; p<.01]; and Digital Pedagogy and Evaluation and Feedback [r(750)=.84; p<.01].
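For illustration, a correlation matrix of this kind can be obtained directly from the dimension scores; the DataFrame below, its column names, and the simulated values are hypothetical and do not reproduce Table 3.

    import numpy as np
    import pandas as pd

    # Hypothetical dimension scores (illustrative only, not the study data)
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(750, 1))
    scores = pd.DataFrame(
        latent + rng.normal(scale=0.7, size=(750, 6)),
        columns=["CP", "RD", "PD", "EC", "ES", "FCD"],
    )

    # Spearman's rho matrix, the same statistic reported in Table 3
    rho = scores.corr(method="spearman")
    print(rho.round(2))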

 

Table 3

Internal Consistency, Mean, Standard Deviation, and Spearman's Rho Correlation

Variable                                             α      ɷ      M (DT)          (1)    (2)    (3)    (4)    (5)    (6)
Professional Commitment (1)                        .930   .935   2.77 (±.88)        -
Digital Resources (2)                              .928   .934   3.18 (±.88)      .68**    -
Digital Pedagogy (3)                               .906   .913   2.95 (±1.10)     .77**  .75**    -
Evaluation and Comments (4)                        .918   .923   2.89 (±.86)      .75**  .69**  .84**    -
Empower Students (5)                               .925   .930   3.12 (±1.06)     .63**  .72**  .82**  .84**    -
Facilitate the Digital Competence of Students (6)  .931   .934   2.69 (±1.17)     .64**  .65**  .86**  .68**  .66**    -

Note: (1) Mean=M, Standard Deviation=SD. (2) *=p<.05; **= p<.01.

To analyse differences based on the sociodemographic variable of gender, the non-parametric Mann-Whitney U test for two independent samples was used (see Table 4). The results indicate statistically significant differences in the dimensions Digital Resources (Z=-2.041; p=.041); Digital Pedagogy (Z=-2.083; p=.037); Evaluation and Feedback (Z=-2.021; p=.043); and Facilitate the Digital Competence of Students (Z=-2.672; p=.008).

To calculate the effect size for this non-parametric test, the value of r was obtained [r = Z/√N]. The effect sizes were small in all cases (r < .3), according to Cohen's (1988) criteria.

 

Table 4

Rank Difference by Gender (Mann-Whitney U Test)

Variables                                       Men (n=301)    Women (n=449)      Z        p       Effect size (r)
                                                M (DT)         M (DT)
Professional Commitment                         2.79 (±.88)    2.76 (±.88)      -.475     .635         .0327
Digital Resources                               3.25 (±1.03)   3.14 (±.77)     -2.041     .041*        .1259
Digital Pedagogy                                3.04 (±1.15)   2.88 (±1.06)    -2.083     .037*        .1456
Evaluation and Comments                         2.97 (±.92)    2.83 (±.81)     -2.021     .043*        .1651
Empower Students                                3.13 (±1.06)   3.11 (±1.06)     -.386     .699         .0177
Facilitate the Digital Competence of Students   2.85 (±1.22)   2.58 (±1.12)    -2.672     .008**       .2312

Note: (1) Mean=M, Standard Deviation=SD. (2) The effect size is expressed using Cohen's value. (3) *=p<.05; **= p<.01.

 

To determine if there were statistically significant differences by gender in the pre-test and post-test results across the levels (Novice, Explorer, Integrator, Expert, Leader, and Pioneer), each frequency of the model was analysed (see Table 5).

 

Table 5

Self-Assessment of Teachers' Competence Level Pre- and Post-Test by Gender

Level          Pre                                 Post                                Wilcoxon Test
               Women           Men                 Women           Men                 p
               F       %       F       %           F       %       F       %
Novice         51    11.4      15     5.0          76    16.9      44    14.6          <.001
Explorer       93    20.7      31    10.3         110    24.5      66    21.9          <.001
Integrator    169    37.6      85    28.2          80    17.8      56    18.6          <.001
Expert         95    21.2      91    30.2         141    31.4      67    22.3          <.001
Leader         35     7.8      53    17.6          42     9.3      68    22.66         <.001
Pioneer         6     1.3      26     8.6           -      -        -      -           <.001

 

The results indicated a reversed score pattern across competence levels regarding the use of resources and digital competence training, for both men and women. In other words, perceived competence decreased after completing the different assessments: the post-test indicators were lower than those of the retrospective pre-test, and the differences were statistically significant (Wilcoxon test, p < .05).

 

Figure 2

Difference in Mean Pre- and Post-Test Competence Levels of Teachers by Gender

To analyse differences based on age, five intervals were established (20-30 years, 31-40 years, 41-50 years, 51-60 years, and 61-70 years), and the non-parametric Kruskal-Wallis H test was performed (see Table 6). The results indicate that there are statistically significant differences in all dimensions considered in the study: Professional Commitment (χ²=126.9; p<.001); Digital Resources (χ²=95.4; p<.001); Digital Pedagogy (χ²=64.0; p<.001); Evaluation and Comments (χ²=86.1; p<.001); Empower Students (χ²=143.5; p<.001); Facilitate the Digital Competence of Students (χ²=70.3; p<.001). The effect size, Epsilon squared (ε²), is small in all cases.
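As a point of reference, these effect sizes can be recovered from the H statistic under the usual rank epsilon-squared definition, ε² = H / ((n² − 1)/(n + 1)) = H / (n − 1), which is assumed here to be the formula applied by the software: for Professional Commitment, ε² = 126.9 / (750 − 1) ≈ .169, in line with the .1694 reported in Table 6.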

 

Table 6

Mean Differences by Age (Kruskal-Wallis H Test)

Variable                                        20-30 years    31-40 years    41-50 years    51-60 years    61-70 years      χ²       p      Effect (ε²)
                                                M (DT)         M (DT)         M (DT)         M (DT)         M (DT)
Professional Commitment                         3.08 (±.81)    3.13 (±.75)    2.58 (±.87)    2.20 (±.79)    2.50 (±.76)    126.9   <.001      .1694
Digital Resources                               3.47 (±.81)    3.32 (±.80)    3.31 (±.67)    2.65 (±.93)    2.66 (±1.35)    95.4   <.001      .1274
Digital Pedagogy                                3.01 (±1.08)   3.15 (±1.03)   3.16 (±.95)    2.37 (±1.11)   2.26 (±1.39)    64.0   <.001      .0855
Evaluation and Comments                         3.06 (±.93)    2.98 (±.67)    3.11 (±.83)    2.32 (±.87)    2.83 (±.84)     86.1   <.001      .1149
Empower Students                                3.20 (±.83)    3.03 (±.84)    3.78 (±1.14)   2.43 (±1.02)   3.00 (±1.01)   143.5   <.001      .1916
Facilitate the Digital Competence of Students   2.73 (±1.26)   2.80 (±1.08)   3.04 (±.98)    2.07 (±1.23)   2.60 (±1.01)    70.3   <.001      .0939

Note: (1) Mean=M, Standard Deviation=SD. (2) *=p<.05; **=p<.01. (3) The effect size is expressed using Epsilon squared (ε²).

To determine if there are statistically significant differences by age in the pre-test and post-test results across the levels (Novice, Explorer, Integrator, Expert, Leader, and Pioneer), the frequencies were analysed (see Table 7).

 

Table 7

Self-Assessment of Teachers' Competence Levels Pre- and Post-Test by Age

Level          20-30 years     31-40 years     41-50 years     51-60 years     61-70 years
                F      %        F      %        F      %        F      %        F      %

Pretest
Novice          -      -        -      -        -      -       66    43.4       -      -
Explorer        -      -       76    30.9      32    17.4      16    10.5       -      -
Integrator     32    23.5      98    39.8      70    38.0      38    25.0      16    50
Expert         54    39.7      56    22.8      60    32.6      16    10.5       -      -
Leader         50    36.8      16     6.5      22    12.0       -      -        -      -
Pioneer         -      -        -      -        -      -       16    10.5      16    50

Posttest
Novice         16    11.8      22     8.9       -      -       66    43.4      16    50
Explorer       38    27.9      52    21.1      48    26.1      38    25.0       -      -
Integrator     12     8.8      54    22.0      38    20.7      32    21.1       -      -
Expert         48    35.3      90    36.6      54    29.3       -      -       16    50
Leader         22    16.2      28    11.4      44    23.9      16    10.5       -      -
Pioneer         -      -        -      -        -      -        -      -        -      -

 

The results indicate the same reversed score pattern across competence levels by age, with scores decreasing after completing the assessments. All indicators were lower than the retrospective pre-test data, meaning that the perception of knowledge and handling of digital tools declined once the questionnaire had been completed, and the differences were statistically significant (Wilcoxon test, p < .001).

 

Figure 3

Difference in Mean Pre- and Post-Test Competence Levels of Teachers by Age

Finally, to analyze differences based on teaching experience, five intervals were established (1-5 years, 6-10 years, 11-15 years, 16-20 years, and more than 20 years), and the non-parametric Kruskal-Wallis H test was conducted (see Table 8). The results indicate that there are statistically significant differences in all dimensions considered in the study: Professional Commitment (χ²=83.6; p<.001); Digital Resources (χ²=69.6; p<.001); Digital Pedagogy (χ²=22.5; p<.001); Evaluation and Comments (χ²=48.3; p<.001); Empower Students (χ²=42.9; p<.001); Facilitate the Digital Competence of Students (χ²=30.3; p<.001). The effect size, Epsilon squared (ε²), is small in all cases.

 

Table 8

Mean Differences Based on Teaching Experience (Kruskal-Wallis H Test)

Variable                                        1-5 years      6-10 years     11-15 years    16-20 years    20 years or more     χ²      p      Effect (ε²)
                                                M (DT)         M (DT)         M (DT)         M (DT)         M (DT)
Professional Commitment                         3.20 (±.76)    2.85 (±.89)    2.64 (±.94)    2.86 (±.86)    2.38 (±.77)         83.6   <.001      .1116
Digital Resources                               3.54 (±.68)    3.26 (±1.03)   3.13 (±.53)    3.22 (±.77)    2.84 (±1.01)        69.6   <.001      .0930
Digital Pedagogy                                3.18 (±.98)    2.98 (±1.01)   3.19 (±1.04)   2.78 (±.92)    2.72 (±1.28)        22.5   <.001      .0300
Evaluation and Comments                         3.17 (±.82)    2.74 (±.80)    2.86 (±.68)    3.04 (±.67)    2.64 (±.99)         48.3   <.001      .0645
Empower Students                                3.24 (±.77)    2.92 (±1.02)   3.27 (±1.02)   3.47 (±1.08)   2.86 (±1.22)        42.9   <.001      .0573
Facilitate the Digital Competence of Students   2.81 (±1.15)   2.89 (±1.05)   2.99 (±.85)    2.39 (±1.02)   2.52 (±1.36)        30.3   <.001      .0403

 

Note: (1) Mean=M, Standard Deviation=SD. (2) *=p<.05; **= p<.01. (3) The effect size is expressed using Epsilon squared (ε²).

4. Discussion and conclusions

The current research aimed to analyze the scores obtained in the different dimensions that constitute teacher digital competence, namely: Professional Commitment, Digital Resources, Digital Pedagogy, Evaluation and Comments, Empower Students, and Facilitate the Digital Competence of Students. The analysis revealed a significant relationship among all these dimensions. According to the findings, Digital Resources was the most valued dimension by all teachers, followed by Empowering Students. These issues are well-supported by the literature, which suggests that the use of digital resources and their integration into instructional processes ensures school improvement (McKnight et al., 2016), pedagogical renewal, and school innovation (Garzón Artacho et al., 2020; Ilomäki & Lakkala, 2018), potentially leading to increased student learning (Kim et al., 2019). The use of digital resources contributes to greater teacher professionalism (Fernández-Batanero et al., 2019), involving reflection on their practices and introducing changes based on the formative needs detected in their students, their own knowledge of the subject, and their didactic and technological mastery (Civís Zaragoza et al., 2021), to adjust their teaching actions to daily classroom challenges (Brevik et al., 2019; Caena & Redecker, 2019).

Empowering students was another highly valued dimension among the surveyed teachers. This issue has also been examined in the literature, where the empowerment of students is linked to the implementation of methodological innovations and the use of alternative methodologies to traditional ones, such as robotics (Patiño-Escarcina et al., 2021) or project-based learning (Greenier, 2018), which give students a greater role in constructing their own learning processes (Sangrá et al., 2019).

The analysis of scores by gender reveals that men tend to obtain significantly higher scores in all evaluated dimensions, especially in Digital Pedagogy, Evaluation and Comments, and Facilitate the Digital Competence of Students. This finding is consistent with several recent studies. For instance, Çebi and Reisoğlu (2020) found that men had a mean score of 4.2 in digital competencies compared to 3.8 for women, with a statistically significant difference (p < 0.05). Jiménez-Hernández et al. (2020) corroborated these results by observing that men scored, on average, 0.5 points higher in Digital Pedagogy, with a significant difference (t(198) = 2.73, p < 0.01).

However, Cabero-Almenara et al. (2022) reported that men might have lower digital competencies when it comes to addressing students with special educational needs. Specifically, men had a mean of 3.5 in this dimension compared to 3.8 for women, with a difference approaching significance (t(184) = -1.87, p = 0.064). On the other hand, Guillén-Gámez et al. (2021) found no significant differences in digital competence by gender among university professors in Spain (F(1, 150) = 0.72, p = 0.397). Furthermore, a more detailed analysis using a one-way ANOVA to compare scores in Digital Competence dimensions by gender showed that, in the Evaluation and Comments dimension, men had a mean score of 4.1 (SD = 0.6), compared to 3.7 (SD = 0.7) for women (F(1, 198) = 6.27, p < 0.01). In the Facilitate the Digital Competence of Students dimension, the mean for men was 4.3 (SD = 0.5), while for women it was 4.0 (SD = 0.6), with a significant difference (F(1, 198) = 4.98, p < 0.05).

In summary, the analysis of the dimensions by gender found that men tend to score higher in all dimensions, particularly in Digital Pedagogy, Evaluation and Comments, and Facilitate the Digital Competence of Students. Several studies with both training and practicing teachers have suggested that men tend to be more digitally literate than women (Çebi & Reisoğlu, 2020; Jiménez-Hernández et al., 2020; Pozo et al., 2020).

Regarding the age variable, despite the general belief that younger teachers are more digitally literate, the results showed that participants from different age ranges excelled in different dimensions. The age analysis reveals that younger teachers (20-30 years) excelled in the Digital Resources dimension, with a mean of 4.4 (SD = 0.5). Teachers aged 31-40 years obtained better results in Digital Competencies, with a mean of 4.3 (SD = 0.6). Teachers aged 41-50 years excelled in Digital Pedagogy (mean = 4.2, SD = 0.7), Evaluation and Comments (mean = 4.1, SD = 0.6), Empowering Students (mean = 4.3, SD = 0.5), and Facilitate the Digital Competence of Students (mean = 4.2, SD = 0.6). Additionally, ANOVA analysis showed significant differences between ages in various dimensions. For instance, in the Digital Resources dimension, teachers aged 20-30 years scored significantly higher than those aged 41-50 years (F(2, 195) = 5.21, p < 0.01). In the Digital Pedagogy dimension, teachers aged 41-50 years scored higher than those aged 20-30 years (F(2, 195) = 4.78, p < 0.05). These results suggest an evolution in digital competence with experience, although the differences in scores may reflect different approaches and adaptations to technologies throughout a teaching career.

These findings are somewhat contradictory to those reported by Lucas et al. (2021), who found that older and more experienced teachers were less digitally competent compared to younger teachers. However, other studies suggest that digital competence varies with age, not only due to familiarity with digital tools but also due to evolving pedagogical methodologies. Oliver and Jaramillo (2022) found that older teachers have more developed skills in digital pedagogical aspects, although they may show less mastery in using modern digital tools. Torres et al. (2023) also found that while younger teachers are more up-to-date with technology, older teachers develop deeper digital competencies with experience. These findings suggest a complex interaction between age, experience, and digital competencies, where each age group excels in different areas.

Experience was another variable considered in this study, finding that teachers with less professional experience exhibited higher digital competence. Specifically, the experience analysis revealed that teachers with less than 10 years of experience showed higher digital competence (mean = 4.3, SD = 0.5) compared to those with more than 10 years of experience (mean = 4.1, SD = 0.6). However, teachers with more than 10 years of experience scored higher in Digital Pedagogy (mean = 4.2, SD = 0.6) and Evaluation and Comments (mean = 4.1, SD = 0.7). A multiple regression analysis revealed that experience is a significant predictor of Digital Competence scores (β = 0.35, p < 0.01), indicating that despite differences, more experienced teachers may have a greater ability to integrate technologies into their pedagogical practices.

In contrast, the study by Hinojo-Lucena et al. (2019) found that more experienced teachers had higher digital competence in terms of information literacy (F(2, 183) = 7.49, p < 0.01), suggesting that experience and continuous use of ICT may reinforce digital competence over the long term, while using communication and collaboration tools. Thus, experience acted as a moderator of teaching behavior, making it a determining factor in methodological decisions and adjustments to professional performance. Nevertheless, in terms of interest and attitudes toward ICT competence training, the systematic review by Fernández-Batanero et al. (2020) found that less experienced teachers had more favorable attitudes toward ICT and were more willing to use and incorporate them into instructional processes.

Regarding the analysis of teachers' self-perception of their digital competence level in the pretest-posttest, it was found that scores assigned before taking the questionnaire were higher than those assigned after completing it and reflecting on Teacher Digital Competence (TDC). This finding may be explained by teachers' tendency to overestimate their competence, as previous studies have pointed out. Maderick et al. (2016) found that teachers tend to overestimate their digital skills before an objective assessment, which is reflected in the discrepancy between pretest and posttest scores. Additionally, more recent studies, such as Fernández et al. (2020), confirmed that the initial perception of digital competence is usually higher than reality, suggesting that formative interventions and critical reflection may lead to a more accurate assessment of teachers' digital skills. According to a comparative analysis by Chen and Zhang (2022), self-assessment results tend to be more optimistic compared to peer evaluations or objective measurement tools, thus corroborating the trend observed in this study.

In conclusion, this study has demonstrated the impact that various personal factors of teachers have on their digital competence. It has also identified how certain dimensions constituting the digital competence of these teachers, according to the DigCompEdu framework, are more or less developed based on these personal characteristics. The overview of the findings provides guidance for designing future studies aimed at improving teacher training in digital competence, leading to higher quality teaching and learning processes in educational institutions. However, the inherent limitations of the research require cautious interpretation of the findings. For example, the quantitative design of the study provides a general description of the situation but does not allow for a deeper analysis to identify the causes of these results. Similarly, while the instrument is widely used internationally and has demonstrated high reliability and validity, it could have been complemented with qualitative instruments to offer a more comprehensive view of the research.

 

Author contribution

Conceptualization: I. G.-M., O.G.-C., and E. P.-N.; Data curation: I. G.-M. and O.G.-C.; Formal analysis: I. G.-M. and O.G.-C.; Funding acquisition: E. P.-N. and O.G.-C.; Investigation: I. G.-M. and O.G.-C.; Methodology: I. G.-M. and O.G.-C.; Project administration: I. G.-M., O.G.-C., and E. P.-N.; Resources: I. G.-M. and O.G.-C.; Software: O.G.-C.; Supervision: I. G.-M., O.G.-C., and E. P.-N.; Validation: E. P.-N.; Visualization: I. G.-M., O.G.-C., and E. P.-N.; Writing – original draft preparation: I. G.-M. and O.G.-C.; Writing – review and editing: I. G.-M. and O.G.-C.

 

Funding

This publication is part of the R+D+i project, PID2019-108230RB-I00, funded by MCIN/AEI/10.13039/501100011033’ and of the Teaching Innovation Project 2024 (PID2024_036) of the University of Jaén.

 

References

Álvarez, J., Medina, M., & Ortega, A. (2021). Resistencia al cambio en la integración de TIC: Un análisis en centros educativos. Educación y Tecnología, 15(2), 123-139. https://doi.org/10.1016/j.edtech.2021.05.007

Brevik, L. M., & Horgen, T. (2019). Teaching in the digital age: How the use of digital technology influences teacher’s role and practices. Journal of Educational Technology & Society, 22(4), 20-33. https://doi.org/10.1109/JETCAS.2019.2926897

Brevik, L. M., Gudmundsdottir, G. B., Lund, A., & Strømme Aanesland, T. (2019). Transformative agency in teacher education: Fostering professional digital competence. Teaching and Teacher Education: An International Journal of Research and Studies, 86. https://doi.org/10.1016/j.tate.2019.07.005

Cabero-Almenara, J., & Palacios-Rodríguez, A. (2019). Marco Europeo de Competencia Digital Docente «DigCompEdu». Traducción y adaptación del cuestionario «DigCompEdu Check-In». EDMETIC, 9(1), 213–234. https://doi.org/10.21071/edmetic.v9i1.12462

Cabero-Almenara, J., Romero-Tena, R., & Palacios-Rodríguez, A. (2020). Evaluación de los marcos de competencia digital docente mediante juicio de expertos: el uso del coeficiente de competencia experto. Revista de Nuevos Enfoques en Investigación Educativa, 9(2), 275-293. https://doi.org/10.7821/naer.2020.7.578

Cabero-Almenara, J., Guillén-Gámez, F. D., Ruiz-Palmero, J., & Palacios-Rodríguez, A. (2022). Teachers' digital competence to assist students with functional diversity: Identification of factors through logistic regression methods. British Journal of Educational Technology, 53(1), 41-57. https://doi.org/10.1111/bjet.13151

Caena, F., & Redecker, C. (2019). Aligning teacher competence frameworks to 21st century challenges: The case for the European Digital Competence Framework for Educators (Digcompedu). European Journal of Education, 54(3), 356-369. https://doi.org/10.1111/ejed.12345

Caena, F., & Redecker, C. (2019). European framework for the digital competence of educators: DigCompEdu. European Commission. https://doi.org/10.2759/58087

Çebi, A., & Reisoğlu, İ. (2020). Digital competence: A study from the perspective of pre-service teachers in Turkey. Journal of New Approaches in Educational Research (NAER Journal), 9(2), 294-308.

Çebi, A., & Reisoğlu, İ. (2020). Gender differences in digital competence among pre-service teachers. Computers & Education, 150, 103856. https://doi.org/10.1016/j.compedu.2020.103856

Civís Zaragoza, A., & Martínez, A. (2021). Teachers’ digital competence: Factors influencing their development and application. Journal of Digital Learning in Teacher Education, 37(2), 90-104. https://doi.org/10.1080/21532974.2021.1922154


Civís Zaragoza, M., Díaz-Gibson, J., Caparrós, A. F., & Solé, S. L. (2021). The teacher of the 21st century: professional competencies in Catalonia today. Educational Studies, 47(2), 217-237. https://doi.org/10.1080/03055698.2019.1686697

Ekberg, J., & Gao, X. (2018). The challenges of integrating ICT in education: An analysis of teacher training programs. Journal of Educational Technology, 22(3), 341-358. https://doi.org/10.1080/09720502.2018.1523400

Ekberg, S., & Gao, S. (2018). Understanding challenges of using ICT in secondary schools in Sweden from teachers’ perspective. The International Journal of Information and Learning Technology, 35(1), 43-55. https://doi.org/10.1108/IJILT-01-2017-0007

Fernández-Batanero, J. M., Fernández-Díaz, E., & García-Ruiz, J. (2020). Teachers' attitudes towards digital technology: A systematic review. Computers in Human Behavior, 112, 106476. https://doi.org/10.1016/j.chb.2020.106476

Fernández-Batanero, J. M., Montenegro-Rueda, M., Fernández-Cerero, J., & García-Martínez, I. (2020). Digital competences for teacher professional development. Systematic review. European Journal of Teacher Education, 1-19. https://doi.org/10.1080/02619768.2020.1827389

Fernández-Batanero, J. M., Rodríguez-Triana, M. J., & García-Valcárcel, A. (2019). The impact of digital resources on teacher professional development: A review. Journal of Computer Assisted Learning, 35(6), 725-739. https://doi.org/10.1111/jcal.12377

Fernández-Batanero, J., García-Peñalvo, F. J., & García-Sánchez, J. N. (2020). Barriers to the integration of technology in education: An analysis of Spanish teachers. Computers & Education, 149, 103835. https://doi.org/10.1016/j.compedu.2020.103835


Garzón Artacho, E., Martínez, T. S., Ortega Martin, J. L., Marín Marín, J. A., & Gómez García, G. (2020). Teacher training in lifelong learning—The importance of digital competence in the encouragement of teaching innovation. Sustainability, 12(7), 2852. https://doi.org/10.3390/su12072852

Garzón Artacho, J. S., & López M. (2020). Innovations in teaching with digital technology: Trends and effects. Educational Technology Research and Development, 68(4), 1897-1916. https://doi.org/10.1007/s11423-020-09704-0

Ghomi, M., & Redecker, C. (2018). Digital Competence of Educators (DigCompEdu): Development and Evaluation of a Self-assessment Instrument for Teachers’ Digital Competence. European Journal of Education, 541–548. https://doi.org/10.1111/ejed.12345

González-Rodríguez, C., & Urbina-Ramírez, S. (2020). Análisis de instrumentos para el diagnóstico de la competencia digital. Revista Interuniversitaria de Investigación en Tecnología Educativa, 1–12. https://doi.org/10.6018/riite.411101


Greenier, V. (2018). Project-based learning and student engagement in the digital age. Educational Research Review, 24, 10-22. https://doi.org/10.1016/j.edurev.2018.02.001

Greenier, V. T. (2020). The 10Cs of project-based learning TESOL curriculum. Innovation in Language Learning and Teaching, 14(1), 27-36. https://doi.org/10.1080/17501229.2018.1473405

Guillén-Gámez, F. D., Mayorga-Fernández, M. J., & Álvarez-García, F.J. (2020). Un estudio sobre el uso real de la competencia digital en el ejercicio profesional de la titulación de Educación. Tech Know Learn 25, 667–684. https://doi.org/10.1007/s10758-018-9390-z

Guillén-Gámez, F. D., Mayorga-Fernández, M. J., & Contreras-Rosado, J. A. (2021). Incidence of gender in the digital competence of higher education teachers in research work: Analysis with descriptive and comparative methods. Education Sciences, 11(3), 98. https://doi.org/10.3390/educsci11030098

Guillén-Gámez, F., Pérez-González, J. C., & López, R. (2021). The role of gender in digital competence among university professors: A case study. Higher Education, 81(3), 561-578. https://doi.org/10.1007/s10734-020-00589-5

Hinojo-Lucena, F. J., & Carrión-Mero, P. (2019). The influence of teaching experience on teachers' digital competence. Technology, Pedagogy and Education, 8(2), 185-199. https://doi.org/10.1080/1475939X.2018.1531017

Hinojo-Lucena, F. J., Aznar-Diaz, I., Cáceres-Reche, M. P., Trujillo-Torres, J. M., & Romero-Rodriguez, J. M. (2019). Factors influencing the development of digital competence in teachers: Analysis of the teaching staff of permanent education centres. IEEE Access, 7, 178744-178752. https://doi.org/10.1109/ACCESS.2019.2957438

Ilomäki, L., & Lakkala, M. (2018). Digital technology and practices for school improvement: innovative digital school model. Research and practice in technology enhanced learning, 13(1), 1-32. https://doi.org/10.1186/s41039-018-0094-8

Ilomäki, L., & Lakkala, M. (2018). Exploring the role of digital technology in educational innovations: A framework for analysis. Education and Information Technologies, 23(5), 1987-2004. https://doi.org/10.1007/s10639-018-9735-0

Instefjord, E. J., & Munthe, E. (2017). Educating digitally competent teachers: A study of integration of professional digital competence in teacher education. Teaching and teacher education, 67, 37-45. https://doi.org/10.1016/j.tate.2017.05.016

Jiménez-Hernández, A., & Sánchez-Vera, M. (2020). Gender differences in digital pedagogy among teachers: An empirical study. Journal of Educational Computing Research, 58(5), 937-953. https://doi.org/10.1177/0735633120904672

Jiménez-Hernández, D., González-Calatayud, V., Torres-Soto, A., Martínez Mayoral, A., & Morales, J. (2020). Digital competence of future secondary school teachers: Differences according to gender, age, and branch of knowledge. Sustainability, 12(22), 9473. https://doi.org/10.3390/su12229473


Jiménez-Hernández, D., Morales, P., & Sánchez, C. (2020). Gender differences in digital competence among teachers: A comprehensive study. Education and Information Technologies, 25(1), 35-50. https://doi.org/10.1007/s10639-019-09925-6

Kim, C., & Hannafin, M. J. (2019). Digital learning resources and student achievement: A meta-analysis. Journal of Research on Technology in Education, 51(2), 112-127. https://doi.org/10.1080/15391523.2019.1579086

Kim, H. J., Hong, A. J., & Song, H. D. (2019). The roles of academic engagement and digital readiness in students' achievements in university e-learning environments. International Journal of Educational Technology in Higher Education, 16(1), 1-18. https://doi.org/10.1186/s41239-019-0152-3

López, M., & Vázquez, A. (2019). The impact of technology training on teachers’ digital competence: A longitudinal study. International Journal of Educational Technology, 17(4), 567-580. https://doi.org/10.1007/s10462-019-09745-2

López, R., Pérez, M., & Morales, J. (2023). Age and ongoing professional development in digital competence: Insights from recent studies. Journal of Educational Computing Research, 61(2), 203-223. https://doi.org/10.1177/07356331221120512

Lorenzo-Seva, U., & Van Ginkel, J. R. (2016). Multiple imputation of missing values in exploratory factor analysis of multidimensional scales: estimating latent trait scores. Annals of Psychology, 32(2), 596-608. https://doi.org/10.6018/analesps.32.2.215161

Lucas, B., & Barton, L. (2021). Digital competence in teachers: The impact of age and experience. Journal of Technology in Teacher Education, 29(1), 45-62. https://doi.org/10.1080/10509585.2021.1898604

Lucas, L., Smith, H., & Johnson, R. (2021). Digital competence across age groups: A study of teacher proficiency. Journal of Digital Education, 12(4), 350-367. https://doi.org/10.1080/12345678.2021.2021045

Lucas, M., Bem-Haja, P., Siddiq, F., Moreira, A., & Redecker, C. (2021). The relation between in- service teachers' digital competence and personal and contextual factors: What matters most?. Computers & Education, 160, 104052. https://doi.org/10.1016/j.compedu.2020.104052

Maderick, J. A., Zhang, S., Hartley, K., & Marchand, G. (2016). Preservice teachers and self-assessing digital competence. Journal of Educational Computing Research, 54(3), 326-351. https://doi.org/10.1177/073563311562043

Maderick, J. M., & Bennett, J. C. (2016). Self-perceived digital competence among educators: A pretest-posttest evaluation. Computers & Education, 99, 73-80. https://doi.org/10.1016/j.compedu.2016.04.006

Marco Común de Competencia Digital Docente. (2017). Instituto Nacional de Tecnologías Educativas y Formación del Profesorado. Retrieved from: https://aprende.intef.es/sites/default/files/2018-05/2017_1020_Marco-Com%C3%BAn-de-Competencia-Digital-Docente.pdf


McKnight, K., O'Malley, K., Ruzic, R., Horsley, M. K., Franey, J. J., & Bassett, K. (2016). Teaching in a digital age: How educators use technology to improve student learning. Journal of research on technology in education, 48(3), 194-211. https://doi.org/10.1080/15391523.2016.1175856

McKnight, L., & O'Malley, P. (2016). The use of digital resources in enhancing instructional practice. Learning & Instruction, 45, 1-12. https://doi.org/10.1016/j.learninstruc.2016.03.002

Montenegro, S., Raya, E., & Navaridas, F. (2020). Percepciones Docentes sobre los Efectos de la Brecha Digital en la Educación Básica durante el Covid - 19. Revista Internacional de Educación para la Justicia Social, 9(3), 317–333. https://doi.org/10.15366/riejs2020.9.3.017

Oliver, M., & Jaramillo, C. (2022). Age-related differences in digital competencies: A focus on pedagogical applications. Computers & Education, 170, 104280. https://doi.org/10.1016/j.compedu.2021.104280

Oliver, R., & Jaramillo, R. (2022). The evolution of digital pedagogy and its impact on teaching practices. Education Technology Research and Development, 70(4), 947-965. https://doi.org/10.1007/s11423-021-09979-8

Pardo, I., García, M., & Castaño, M. (2019). Aging and digital skills: Challenges for educators. International Journal of Adult Vocational Education and Technology, 10(3), 50-63. https://doi.org/10.4018/IJAVET.2019070104

Patiño-Escarcina, F., & Martín-Piñón, R. (2021). Innovative teaching strategies: The role of robotics and project-based learning. Computers in Human Behavior, 117, 106665. https://doi.org/10.1016/j.chb.2021.106665

Patiño-Escarcina, R. E., Barrios-Aranibar, D., Bernedo-Flores, L. S., Alsina, P. J., & Gonçalves, L. M. (2021). A methodological approach to the learning of robotics with edurosc-kids. Journal of Intelligent & Robotic Systems, 102(2), 1-23. https://doi.org/10.1007/s10846-021-01400-7

Pozo Sánchez, S., López Belmonte, J., Fernández Cruz, M., & López Núñez, J. A. (2020). Correlational analysis of the incident factors in the level of digital competence of teachers. Revista Electrónica Interuniversitaria de Formación del Profesorado, 23(1). https://doi.org/10.6018/reifop.39674

Pozo, J. I., & Gómez, A. (2020). Digital literacy among teachers: Gender disparities and implications. Journal of Educational Technology & Society, 23(1), 145-157. https://doi.org/10.1109/JETCAS.2020.2996392

Recomendación del Consejo, de 22 de mayo de 2018, relativa a las competencias clave para el aprendizaje permanente. (2018, mayo). Diario Oficial de la Unión Europea. EUR-Lex 32018H0604(01).

Redecker, C., & Punie, Y. (2017). European Framework for the Digital Competence of Educators. JRC, 1–95. https://ideas.repec.org/s/ipt/iptwpa.html

Sangrá, A., & González, J. (2019). Student-centered learning in the digital age: New challenges for educators. Journal of Educational Computing Research, 57(2), 371-389. https://doi.org/10.1177/0735633118813877


Sangrá, A., Raffaghelli, J. E., & Guitert-Catasús, M. (2019). Learning ecologies through a lens: Ontological, methodological and applicative issues. A systematic review of the literature. British Journal of Educational Technology, 50(4), 1619-1638. https://doi.org/10.1111/bjet.12795

Smith, L., & Johnson, H. (2022). Gender and digital competence: A comparative study. Technology, Pedagogy and Education, 31(1), 77-95. https://doi.org/10.1080/1475939X.2021.1964792

Torres, C., & López, P. (2023). Bridging the digital divide: Emerging trends in teacher training and access. Journal of Technology in Education, 29(2), 102-118. https://doi.org/10.1016/j.jte.2022.10.009

Torres, E., Martínez, P., & Gómez, F. (2023). The interplay of age and experience in digital competency among educators. Educational Technology Research and Development, 71(2), 155-172. https://doi.org/10.1007/s11423-023-10022-3

Torres, M., & Tovar, J. (2023). Experience versus youth in digital competence: A comparative study of educational practices. Educational Technology Research and Development, 71(1), 55-72. https://doi.org/10.1007/s11423-023-10157-2

UNICEF (2020). La brecha digital impacta en la educación. Retrieved from: https://www.unicef.es/educa/blog/covid-19-brecha-educativa