Research Articles

Investigating the Relationship Between TPACK and the ISTE Standards for Teachers

Author: Josh DeSantis (York College of Pennsylvania)

Abstract

Technology is rapidly changing American classrooms. This has profound implications for teacher preparatory institutions seeking to ready pre-service teachers to thrive in technology-integrated environments. Two frameworks have been developed that aid teacher-educators in designing programs that help pre-service teachers integrate technology during their instruction. The first, Technological Pedagogical Content Knowledge (Mishra & Koehler, 2006), primarily emphasizes what effective technology integrators know. The second, the National Education Technology Standards for Teachers (International Society for Technology in Education, 2008), primarily emphasizes what effective technology integrators do. To this point, little progress has been made in determining whether relationships exist between these frameworks. This study explores the existence of a relationship between pre-service teachers' Technological Pedagogical Content Knowledge levels and their technology proficiencies as described by the National Education Technology Standards for Teachers.

Keywords: Improving Classroom Teaching, Pedagogical Issues, Postsecondary Education, Secondary Education, Teaching/Learning Strategies

How to Cite: DeSantis J., (2016) “Investigating the Relationship Between TPACK and the ISTE Standards for Teachers”, Issues and Trends in Educational Technology 4(1). doi: https://doi.org/10.2458/azu_itet_v4i1_desantis


Published on 01 Apr 2016. Peer reviewed.


Introduction

Great teachers have a deep understanding of both their subject and the techniques best suited to help their students understand essential content. These stores of teaching knowledge and teaching techniques are crucial assets for classroom teachers (Shulman, 1986). The very best teachers are also able to draw on their subject knowledge and store of teaching techniques, modifying their actions to address classroom variables. These can include students' interests, the tools available in the classroom, curricular demands, and the presence or absence of parental support. By using their knowledge of their subjects and their understanding of effective teaching techniques, great teachers can respond to a wide range of classroom demands. The chief mission of teacher education, readying great teachers, centers on helping to fill pre-service teachers' skill and knowledge reservoirs (Darling-Hammond, 2012).

While subject and pedagogical knowledge remain important, classroom teachers are also increasingly expected to utilize education technologies in their instruction (Johnson, Adams Becker, Estrada & Freeman, 2014). The emerging importance of education technology in the classroom has borne a complementary need to ready pre-service teachers with technology skills (Sang, Valcke, Braak & Tondeur, 2010). The Technological, Pedagogical, and Content Knowledge (TPACK) framework, developed by Mishra and Koehler (2006), is a leading measure used to determine teachers' knowledge of how to integrate technology in their instruction. It describes the knowledge possessed by teachers who are effective technology integrators. A second framework, created by the International Society for Technology in Education (ISTE) and titled the ISTE Standards for Teachers (ISTE-ST), designates the technology-facilitated skills required of teachers. Both of these frameworks are used to guide teacher educators in developing curricula to help pre-service teachers learn the content, pedagogies, and technology skills they will need during their careers.

While these frameworks have been used to anchor pre-service teaching curricula, no studies have yet identified if a relationship exists between them. It would stand to reason that teachers who possess a vast store of TPACK would likely also be proficient technology users as described by the ISTE-ST. If teachers' possession of TPACK is correlated to their ISTE-ST proficiency, both models would be strengthened. This would validate the efforts of pre-service curriculum designers who use the models in their curriculum designs. If the models are not correlated, it could offer an opportunity to reflect on the precision of both and could spur renewed investigation into the best methods for readying pre-service teachers to integrate technology in their teaching.

The purpose of this study was to identify whether relationships exist between pre-service teachers' TPACK and ISTE-ST proficiencies following their participation in an educational technology course and field experience. Participants included pre-service teachers enrolled in an education technology class and completing a technology-oriented field experience. Participants' TPACK was identified using the Technological and Pedagogical Knowledge Survey (TPKS), which utilized items from an instrument created by Schmidt et al. (2010). The TPKS collected students' perceptions of their own TPACK. Participants' ISTE-ST proficiency was assessed using the Wayfind Teacher Assessment (WTA). The WTA was created by the Learning.com Corporation (Learning.com, 2013) to measure teachers' ISTE-ST aligned technology proficiencies. The data gathered from these instruments were used to address five research questions: (1) Did the technology-oriented field experience affect participants' TPACK? (2) Does the degree to which pre-service teachers' TPACK changed following their participation in a technology-oriented field experience affect their technology proficiency as described by the ISTE-ST? (3) Is there a significant difference between pre-service teachers' post-experience TPACK and their technology proficiency as described by the ISTE-ST? (4) Are any of the five WTA subscales correlated to pre-service teachers' TPACK? and (5) Are any of the three TPKS subscales correlated to pre-service teachers' technology proficiency as described by the ISTE-ST? The findings from these questions help to inform the discussion regarding the relationship between teachers' technology-related knowledge and skills.

What effective teachers know

The TPACK model has its roots in the teacher knowledge model devised by Shulman (1986). Shulman's model described knowledge possessed by teachers who successfully navigate complex classroom dynamics. His work identified three broad categories of knowledge possessed by effective teachers. The first, content knowledge (CK), is the information teachers possess about the subject they teach. The second, pedagogical knowledge (PK), refers to the techniques teachers employ to instruct their students. A third important area of knowledge identified by Shulman, pedagogical content knowledge (PCK), refers to the techniques best suited to teach specific concepts within a given teacher's subject area. For example, a social studies teacher with a deep understanding of the American Revolution and a stockpile of instructional techniques might design a dramatization in which her students pretend to be Tories and Patriots. According to Shulman, this teacher's pedagogical choice demonstrates that she possesses a high level of PCK. Teachers who possess high levels of CK, PK, and PCK, Shulman reasoned, are likely to be effective teachers in nearly any context.

The years since the introduction of Shulman's theory have witnessed an influx of educational technologies in classrooms across the country (Johnson et al., 2014). Mishra and Koehler (2006) accounted for this by supplementing Shulman's model of teacher knowledge with a third main category of teacher knowledge. They titled this category technological knowledge. The resulting model, called TPACK, includes the knowledge that teachers possess about how to employ technologies in their instruction. Their framework added technological knowledge (TK), as well as technological pedagogical knowledge (TPK) and technological content knowledge (TCK), to Shulman's original theory. Mishra and Koehler's TPACK framework suggests that effective teachers possess knowledge about their subject, effective instructional techniques, and methods for using technology to help students achieve understanding. For example, an upper elementary teacher might draw on her mathematics content knowledge for finding the area of three-dimensional objects, her pedagogical knowledge of designing small-group activities, and her technology knowledge of math software and websites to design a jigsaw activity in which students travel in small groups to practice finding the area of different shapes with the assistance of various computer programs and websites. Figure 1 displays Mishra and Koehler's TPACK framework.

Figure 1. The TPACK Framework. Reproduced by permission of the publisher, © 2012 by tpack.org.

TPACK has become a leading model for understanding the knowledge required by teachers to integrate technology in their instruction (Koehler et al., 2014). It has also been confirmed as a key measure describing teachers' readiness to employ technology in their instruction (Chai, Koh & Tsai, 2013; Harris & Hofer, 2011). The possession of high levels of TPACK has been demonstrated to be both a worthwhile and attainable goal for pre-service teachers (Bate, Day & Macnish, 2013; Jang & Chen, 2010; Mouza et al., 2014). TPACK serves as an important foundation for describing what effective teachers know about their disciplines, techniques, and technologies.

Though the TPACK model has been successfully applied in numerous studies (Bate, Day & Macnish, 2013; Hofer, Grandgenett, Harris & Swan, 2011; Ling Koh, Chai & Tay, 2014; Mouza et al., 2014), it has been subject to critique. Graham (2011) challenged several of the fundamental assumptions underlying TPACK. These criticisms included a lack of consensus regarding Shulman's original concepts of CK, PK, and PCK, ambiguity over the meaning of the new knowledge categories, and an absence of evidence showing relationships between the new knowledge categories. Brantley-Dias and Ertmer (2013) also argued that the seven types of knowledge might not be sufficiently different from each other and that the TPACK model needlessly overcomplicates the nature of classroom technology integration. Many studies have included teachers' work samples, lesson plans, and performance products to assess the degree to which teachers possess TPACK (Koehler, Shin & Mishra, 2011). Less is known, however, about the relationships that exist between teachers' TPACK and the competencies designated by the ISTE-ST.

What effective teachers do

The ISTE-ST describes the actions of teachers who integrate technology in their practice. Created in 2000 and updated in 2014 by the International Society for Technology in Education, the ISTE-ST lays out the fundamental tenets of technology integration for teachers across all grade levels and disciplines (International Society for Technology in Education, 2014). The ISTE-ST includes the following standards: Facilitate and Inspire Student Learning and Creativity (SLC), Design and Develop Digital Age Learning Experiences and Assessments (DALEA), Model Digital Age Work and Learning (DAWL), Promote and Model Digital Citizenship and Responsibility (DCR), and Engage in Professional Growth and Leadership (PGL) (International Society for Technology in Education, 2014). Embedded within each standard are indicators that provide teachers with specific descriptions of the targeted performances. Taken together, these standards outline the teaching practices of exemplary classroom technology integrators.

While the competencies described in the ISTE-ST require teachers to integrate technology, the standards do not designate specific technologies that must be mastered. For example, a teacher who assigns her students a project in which they create mock news documentaries covering current events would be addressing the Design and Develop Digital Age Learning Experiences and Assessments standard. The teacher could choose from a variety of applications, including iMovie, Camtasia Studio, or Animoto, for her students to use. The freedom the ISTE-ST affords to choose individual technologies allows teachers to utilize the tools they feel best help them achieve their goals and allows the standards to stay relevant despite the rapid emergence of new technologies. The open-endedness of the ISTE-ST implies that there might be many ways for teachers to learn the knowledge and skills they employ (Willis, 2012).

The importance of readying teachers to master the aptitudes described in the ISTE-ST has led many teacher educators to integrate the skills and knowledge embedded in the ISTE-ST into teacher education curricula. Basham, Smeltzer, and Pianfetti (2012) used a series of tutorials and activities with pre-service teachers and found a significant gain in their abilities to employ the skills designated by the ISTE-ST. Similarly, Lambert and Gong (2010) found that education technology courses aligned to the ISTE-ST could help pre-service teachers take a positive approach toward technology integration and improve their proficiency with education technologies. The knowledge and skills embedded in the ISTE-ST are a creditable goal for teacher education programs.

Readying pre-service teachers to integrate technology

The rapid pace of classroom technology development has spurred teacher educators to design curricula that allow pre-service teachers to learn to integrate technologies during instruction. Though there is diversity in the structures used to promote pre-service teachers' TPACK and ISTE-ST proficiency, most teacher education programs include similar characteristics. Kleiner, Thomas, and Lewis (2007) conducted a nationwide study on teacher certification programs and found that eighty-five percent of teacher preparation programs included stand-alone education technology coursework in their curricula and that ninety-three percent of the programs included education technology as part of their teaching methods coursework. In addition, Kleiner, Thomas, and Lewis (2007) found that seventy-nine percent of teacher preparation programs embedded education technology instruction during pre-service teachers' field experiences. Although there is some concern that the content and skills covered in those courses could be misaligned with the content and skills required of classroom teachers (Ottenbreit-Leftwich et al., 2012), well-designed teacher education can play a powerful role in readying new teachers to integrate technology effectively in their classrooms (Tondeur et al., 2012). Field experience shows particular promise in this area. A recently completed study by Mouza et al. (2014) found a strong improvement in pre-service teachers' TPACK following their participation in integrated technology-oriented coursework and field experiences. These efforts have initiated the use of TPACK and the ISTE-ST as planning frameworks in many teacher-education programs.

The knowledge described by TPACK and the skills described by the ISTE-ST are likely to grow in significance for teachers in the future (Johnson et al., 2014). These models serve as touchstones for teacher educators seeking to design teacher education curricula that fully prepare new teachers for the realities they will face in their careers. While these two models describe the attributes and actions of effective technology integrators, there has not, as yet, been an effort to determine the relationship between the models. Moreover, relatively few studies have been completed identifying the degree to which possessing TPACK and adhering to the ISTE-ST affect teachers' abilities to deploy emerging technologies in the classroom. The present study explores the relationship between TPACK and the ISTE-ST as well as the connection between teachers' possession of TPACK and their ability to design technology-integrated classroom instruction.

Research design

Procedures

The participants in this study were 76 pre-service teachers enrolled at a mid-sized, private, liberal-arts college in South Central Pennsylvania. Forty-two participants were enrolled in early elementary and special education certification programs and twenty-four of the participants were enrolled in secondary education programs. The participants ranged from eighteen to thirty-seven years in age. All of the participants were enrolled in one of six sections of a three-credit education technology course required of all education majors at the host site and offered during the 2014-15 academic year. This course presented students with education technology theories and models, like TPACK and the ISTE-ST. In addition, students in the course learned how to operate technology hardware such as interactive whiteboards and tablet computers, applications like iMovie and Prezi, and learning management systems like Edmodo and Class Dojo.

The course included a required field experience component in which participants worked directly with in-service host teachers and their students in classroom settings for a minimum of twenty hours. The field experience was conducted in a large, suburban school district in south central Pennsylvania. Participants were matched with a host teacher in their chosen content area or grade-level certification area by two technology leaders at the field experience site. Host teachers were selected by the technology leaders for their experience and for their proficiency with technology. This experience required participants to conduct a technology-centered interview with their host teachers, complete a formal observation and reflection of a technology-integrated lesson taught by their host teachers, and create and teach a novel technology-integrated activity with students at the host site.

Thirty-two study participants were also enrolled in a one-credit stand-alone field experience course. The two technology leaders from the field-experience site taught this course, which met eight times during the study period. During each meeting, the technology leaders invited participants to reflect on their field experiences, sharing successes and lessons learned as well as setbacks and questions. These reflections were used to assess the participants in the stand-alone course as well as to troubleshoot problems encountered by participants during their field experience.

Data collection

Data were collected from two instruments during this study. The first instrument employed was the Technological and Pedagogical Knowledge Survey (TPKS). The TPKS included twenty-two Likert-scaled items designed to gather self-reported information about participants' TPACK as described by Mishra and Koehler (2006). The instrument was created by Schmidt et al. (2010) and was used with their permission. Though the original instrument included questions that addressed all seven of the knowledge categories originally identified by Mishra and Koehler (2006), only the items addressing Technology Knowledge (TK), Pedagogical Knowledge (PK), and Technological-Pedagogical Knowledge (TPK) were used in the present study. Example items include "I can choose technologies that enhance students' learning for a lesson" and "I can assess student learning in multiple ways". The instrument was further modified for the present study to include three open-ended items designed to gather qualitative data regarding participants' perceptions of their own technology skills and knowledge. The reliability of the quantitative items, as measured by the Cronbach's α for each subscale, was calculated when the original instrument was first presented by Schmidt et al. (2010). These values are reported in Table 1.

Table 1.
Cronbach's α For Subscales of the Technological and Pedagogical Knowledge Survey
Subscale Cronbach's α
Technology Knowledge (TK) .86
Pedagogy Knowledge (PK) .87
Technological Pedagogical Knowledge (TPK) .89
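As an illustration of how the reliabilities in Table 1 are derived, the short sketch below computes Cronbach's α from a matrix of Likert responses. This is not the authors' analysis; the response data and the three-item subscale are hypothetical, and the function simply assumes rows of respondents and columns of items belonging to a single subscale.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for one subscale: rows = respondents, columns = items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical responses to a three-item Likert (1-5) subscale.
responses = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 4, 5],
    [2, 3, 3],
    [4, 4, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```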

The second instrument used to gather data was the Wayfind Teacher Assessment (WTA). The WTA was used to measure participants' proficiency with ISTE-ST aligned technology skills. It was created by the Learning.com Corporation to assist school districts in readying professional development for teachers aimed at boosting their ISTE-ST proficiency. The WTA included 60 multiple-choice and performance task items. These items required participants to demonstrate proficiency with each of the core ISTE-ST standards by performing actual technology tasks as well as answering multiple-choice items. These tasks and items assessed participants' actual technology proficiencies. Participants earned an overall score from 100-500 points. A participant who scored from four hundred to five hundred points was considered to possess an advanced ISTE-ST aptitude, a participant who scored from three hundred to four hundred was considered to possess a proficient ISTE-ST aptitude, a participant who scored from two hundred to three hundred was considered to possess a basic ISTE-ST aptitude, and a participant who scored from one hundred to two hundred was considered to possess a below basic ISTE-ST aptitude. The WTA also reports five subscale scores, each tied to one of the five ISTE-ST standards: student learning and creativity, digital age learning experiences and assessments, digital-age work and learning, digital citizenship and responsibility, and professional growth and leadership. Banister and Vannatta-Reinhart (2013) evaluated the WTA to determine its effectiveness in determining the ISTE-ST aligned technology proficiencies of pre-service teachers. Their findings indicated that the WTA is a valid instrument for assessing pre-service teachers' technology proficiency.
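The WTA banding described above maps onto a simple lookup. The sketch below is only an illustration of that scoring rule; the handling of the cut points at exactly 200, 300, and 400 is an assumption, since the reported ranges overlap.

```python
def wta_band(score: int) -> str:
    """Map an overall WTA score (100-500) to the proficiency bands described above."""
    if not 100 <= score <= 500:
        raise ValueError("WTA overall scores range from 100 to 500")
    if score >= 400:          # 400-500: advanced
        return "advanced"
    if score >= 300:          # 300-399: proficient
        return "proficient"
    if score >= 200:          # 200-299: basic
        return "basic"
    return "below basic"      # 100-199: below basic

print(wta_band(372))  # proficient
```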

The TPKS was administered twice during the study. The first administration occurred during the second meeting of each of the three sections of the stand-alone education technology course, in September 2014 and January 2015. The second administration occurred during the final meeting of each of the three sections, in December 2014 and June 2015. The WTA was administered to participants at the final meeting of their stand-alone education technology course, in December 2014 and June 2015. Students who were enrolled in the course but did not elect to participate in the study were excused from that class meeting. The TPKS and WTA were administered to participants digitally in a computer lab during each scheduled class session.

Results

Pre- and post-field experience TPACK

A paired sample t-test was employed to determine the changes in participants' TPACK following their coursework and field experience in education technology. The arrangement of item responses on a Likert scale ensured that the data from the Pre- and Post-TPKS samples were interval level. The descriptive data that identified the changes in participants' TPACK among the three subscales and total scores, as recorded by the pre- and post-experience TPKS surveys, are displayed in Table 2.

Table 2
Pre- and Post-Experience TPKS Results
Measure N Mean SD
Pre-Experience Total Score 76 3.54 0.48
Post-Experience Total Score 76 4.09 0.41

The null hypothesis for the paired-sample t-test, employed to determine if there was a statistically significant difference between participants' pre- and post-experience total scores on the TPKS, was that there was no difference between the mean scores for the Pre-TPKS and Post-TPKS samples. The paired samples t-test indicated a p value of 0.04. This value is below the p = 0.05 threshold, indicating a rejection of the null hypothesis. The results of the paired-sample t-test demonstrated that the mean score for the Post-TPKS samples was significantly higher than the mean score of the Pre-TPKS samples. Cohen's (1988) d was employed to determine the effect size. By dividing the mean difference (0.63) by the standard deviation (0.56), d was calculated to be 0.62, which, according to Cohen's (1988) model, falls within the medium effect range. These results are illustrated in Table 3.

Table 3
Paired-sample t-test comparison of the Pre- and Post-treatment TPKS scores
df Mean SD t p
Pre- and Post-TPKS Scores 75 .63 .56 9.86 .04
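For readers who wish to reproduce this kind of analysis, the sketch below runs a paired-sample t-test and computes Cohen's d as the mean of the paired differences divided by their standard deviation, as described above. The pre- and post-scores are simulated for illustration only; the study's raw TPKS data are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated pre- and post-experience TPKS totals for 76 participants
# (illustration only; means and spreads loosely echo Table 2).
pre = rng.normal(loc=3.54, scale=0.48, size=76)
post = pre + rng.normal(loc=0.55, scale=0.56, size=76)

t_stat, p_value = stats.ttest_rel(post, pre)  # paired-sample t-test

diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)     # mean difference / SD of differences

print(f"t({diff.size - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```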

Relationships between TPACK and ISTE-ST proficiency

Simple linear regression analysis was used to determine if the degree to which participants' TPACK changed following their participation in the technology-integrated field experience, as determined by their TPKS scores, was correlated to their technology proficiency as determined by their WTA scores. The scatterplot of standardized residuals versus predicted values showed that the data met the assumptions of homogeneity of variance and linearity. The data met the standard assumptions for simple linear regression, allowing the researcher to proceed with the analysis. It was found that the rate of TPACK change among participants did not explain a statistically significant amount of the variance in participants' WTA scores (F(1, 74) = 1.03, p < 0.31, with an R² of 0.01).
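A regression of this form can be expressed in a few lines. The sketch below is a hypothetical illustration of regressing WTA totals on TPKS change scores with ordinary least squares; the variable names and simulated values are assumptions, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical per-participant values: change in TPKS score and overall WTA score.
tpack_change = rng.normal(loc=0.55, scale=0.56, size=76)
wta_score = rng.normal(loc=330, scale=45, size=76)

X = sm.add_constant(tpack_change)   # intercept plus the single predictor
model = sm.OLS(wta_score, X).fit()  # simple linear regression

# The fitted model reports the F statistic, its p value, and R-squared.
print(f"F(1, {int(model.df_resid)}) = {model.fvalue:.2f}, "
      f"p = {model.f_pvalue:.2f}, R^2 = {model.rsquared:.2f}")
```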

Simple linear regression analysis was also used to determine if participants' post-field experience TPACK, as determined by their TPKS scores, was correlated to their technology proficiency as determined by their WTA scores. The scatterplot of standardized residuals versus predicted values showed that the data met the assumptions of homogeneity of variance and linearity. The data met the standard assumptions for simple linear regression, allowing the researcher to proceed with the analysis. It was found that participants' post-field experience TPACK scores did not explain a statistically significant amount of the variance in participants' WTA scores (F(1, 74) = 2.51, p < 0.12, with an R² of 0.03).

Relationships between TPACK and Wayfind subscales

Multiple linear regression analysis was employed to determine if participants' post-field experience TPACK score was correlated to specific ISTE-ST standards as determined by the WTA subscale scores (SLC, DALEA, DAWL, DCR, PGL). The choice to employ multiple linear regression required the testing of several assumptions about the data. An analysis of standardized residuals was carried out, which showed that the data contained no outliers (Std. Residual Min = -2.04, Std. Residual Max = 2.38). Tests to see if the data met the assumption of collinearity indicated that multicollinearity was not a concern (Post TPACK Score, Tolerance = 0.96, VIF = 1.04). The data met the assumption of independent errors (Durbin-Watson value = 1.82). The scatterplot of standardized residuals versus predicted values showed that the data met the assumptions of homogeneity of variance and linearity. The data met the standard assumptions for multiple linear regression, allowing the researcher to proceed with the analysis. It was found that participants' post-field experience TPACK score was not correlated to any of the specific ISTE-ST standards as determined by the WTA subscale scores (F(5, 70) = 0.58, p < 0.71, with an R² of 0.04).

Relationships between ISTE-ST competency and TPACK subscales

A multiple linear regression was calculated to determine if participants' ISTE-ST skills, as determined by their total WTA scores, were correlated to their TPACK subscale scores (TK, PK, and TPK). The choice to employ multiple linear regression required the testing of several assumptions about the data. An analysis of standardized residuals was carried out, which showed that the data contained no outliers (Std. Residual Min = -2.25, Std. Residual Max = 2.29). Tests to see if the data met the assumption of collinearity indicated that multicollinearity was not a concern (WTA Score, Tolerance = 0.88, VIF = 1.13). The data met the assumption of independent errors (Durbin-Watson value = 2.21). The scatterplot of standardized residuals versus predicted values showed that the data met the assumptions of homogeneity of variance and linearity. The data met the standard assumptions for multiple linear regression, allowing the researcher to proceed with the analysis. A significant regression equation was found (F(3, 72) = 3.32, p < 0.03, with an R² of 0.12). The analysis shows that TK did not significantly predict WTA totals (β = 2.48, t = 0.25, p < 0.81); however, PK (β = -23.98, t = -2.08, p < .04) and TPK (β = -31.73, t = 2.49, p < .02) were negatively correlated to participants' WTA scores.
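The assumption checks reported in this and the preceding analysis (standardized residuals, tolerance/VIF, and the Durbin-Watson statistic) can be obtained from standard regression output. The sketch below illustrates one way to compute them with simulated data; the subscale values, sample size, and variable names are assumptions for demonstration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(2)
n = 76

# Hypothetical TPKS subscale means (1-5 Likert) and overall WTA scores.
predictors = pd.DataFrame({
    "TK": rng.normal(3.9, 0.5, n),
    "PK": rng.normal(4.1, 0.4, n),
    "TPK": rng.normal(4.0, 0.5, n),
})
wta_score = rng.normal(330, 45, n)

X = sm.add_constant(predictors)
model = sm.OLS(wta_score, X).fit()

# Collinearity check: VIF per predictor (tolerance is its reciprocal).
for i, name in enumerate(predictors.columns, start=1):
    vif = variance_inflation_factor(X.values, i)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")

# Independence of errors and an outlier screen on standardized residuals.
print(f"Durbin-Watson = {durbin_watson(model.resid):.2f}")
std_resid = model.get_influence().resid_studentized_internal
print(f"Std. residual min = {std_resid.min():.2f}, max = {std_resid.max():.2f}")

# Overall model fit and coefficients (F, p, R-squared, betas, t, p).
print(model.summary())
```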

Discussion

The results of the analysis addressing research question one, Did the technology-oriented field experience affect participants' TPACK? demonstrated that the technology-oriented field experience did have a significant positive effect on participants' TPACK, as measured by the TPKS. This finding is consistent with other similar studies exploring the relationship between teacher education, field experience, and pre-service teachers' technology knowledge (DeSantis, 2015; Habowski & Mouza, 2014; Mouza et al., 2014; Tai & Crawford, 2014). These findings add to the growing consensus on the value of high-quality teacher education and field experience for the development of TPACK among pre-service teachers.

The results addressing research question two, Does the degree to which pre-service teachers' TPACK changed following their participation in a technology-oriented field experience affect their technology proficiency as described by the ISTE-ST? and research question three, Is there a significant difference between pre-service teachers' post-experience TPACK and their technology proficiency as described by the ISTE-ST? are more surprising. The results show no significant correlation between participants' TPACK, as measured by the TPKS, and their ISTE-ST proficiencies, as measured by the WTA. Analysis of the relationships between each WTA subscale and participants' overall TPKS scores, completed to address research question four, Are any of the five WTA subscales correlated to pre-service teachers' TPACK? revealed no significant correlations. Analysis of the relationships between each TPKS subscale and participants' overall WTA scores, completed to address research question five, Are any of the three TPKS subscales correlated to pre-service teachers' technology proficiency as described by the ISTE-ST? revealed significant negative relationships between participants' PK and WTA scores and between participants' TPK and WTA scores. The negative relationship of participants' PK and TPK to the WTA suggests that proficiency with teaching techniques is inversely correlated to proficiency with education technologies. Participants who were less confident in their PK and TPK were actually more effective technology users, as measured by the WTA. Taken together, these results fall short of conclusively establishing a relationship between the TPACK model and the ISTE-ST.

The proliferation of TPACK as a theoretical basis for teacher education centers on the assumption that what teachers report they know about employing education technologies correlates with their actual abilities to employ technology during instruction. TPACK has become an important tool among teacher educators both for designing instruction (Chai, Koh, Tsai & Tan, 2011; Doering, Veletsianos, Scharber & Miller, 2009; Koh & Divaharan, 2011) and for measuring the efficacy of that instruction (Graham et al., 2009; Schmidt et al., 2010). Just as important, teacher educators expect that possession of TPACK is a precursor to the technology proficiencies described by the ISTE-ST. The findings from the present study indicated that neither participants' TPACK gains across the technology-oriented field experience nor their post-field experience TPACK was correlated to their actual technology proficiencies as defined by the ISTE standards and described by participants' overall performance on the WTA.

Overall, the findings from the present study run contrary to the assertion that pre-service teachers' self-reported TPACK is linked to their actual abilities to utilize education technologies in the ways described by the ISTE-ST. These findings also invite a renewed examination of the use of TPACK as a theoretical basis for pre- and in-service technology-oriented course development (Chai, Koh, Tsai & Tan, 2011; Koh & Divaharan, 2011). The findings instead suggest that TPACK, in its current version, falls short of comprehensively explaining the range of knowledge required for teachers to integrate technologies successfully during their instruction. This echoes a strand of the literature calling for a more critical and thorough analysis of the TPACK framework. Specifically, these critiques indicate a shared concern that the categories of knowledge described in the TPACK theory are too similar to each other and therefore are not comprehensive predictors of teachers' actual abilities to employ technology in instruction (Archambault & Barnett, 2010; Graham, 2011). Though these findings do not weaken the use of either the TPACK model or the skills implied by the ISTE-ST as theoretical frameworks for pre- or in-service program development, they do imply that neither theoretical framework should be used as the sole foundation for a program.

Several limitations influenced these findings. First, the relatively small and homogeneous population in the current study limits the generalizability of the findings. Future research might expand the population to include pre-service teachers at various stages of their programs and from different institutions, or in-service teachers at differing points in their careers. Second, this study explored quantitative data regarding participants' technology knowledge and skills. A more comprehensive understanding of the relationships between teachers' TPACK and technology skills, as implied by the ISTE-ST, might be achieved through qualitative analysis of participants' technology knowledge and skills. Finally, this study compared participants' self-reported TPACK as measured by the TPKS to their actual technology proficiency as measured by the WTA. A more comprehensive and accurate description of TPACK might be secured by including some element of participants' performance in the measurement of their TPACK. An avenue for future research would be to augment participants' self-reporting instrument with an assessment of their actual ability to employ technology in their instruction. By employing the observation tool created by Hofer, Grandgenett, Harris and Swan (2011), future researchers might gather a more comprehensive summary of participants' TPACK. This might allow for a more accurate comparison of teachers' technology skills and knowledge.

Conclusion

Great teachers rely on both content knowledge and pedagogical skills. The TPACK model has been widely used to describe the knowledge possessed by effective teachers. Similarly, the ISTE-ST are frequently employed to define technology-integrated pedagogies. This investigation sought to determine if a relationship exists between what pre-service teachers know about education technologies as described by TPACK and their proficiency for using education technologies as described by the ISTE-ST. The findings from the present study suggest that these two frameworks are not related.

While more research is needed to determine the presence or absence of a relationship between TPACK and the ISTE-ST, the findings from this study suggest that teacher educators may want to avoid resting on either as the sole foundational framework for pre- and in-service teacher curriculum. Instead, teacher educators might create opportunities for teachers to cultivate both their technology knowledge and their tangible proficiencies with technology tools. Just as importantly, these findings suggest that teacher educators should include measures of what participants believe about their technology knowledge as well as their actual technology skills when assessing the efficacy of their coursework or professional development. By considering both models, teacher educators and leaders can design learning opportunities that ready teachers to make full use of emerging education technologies.

References

Archambault, L. M., & Barnett, J. H. (2010). Revisiting technological pedagogical content knowledge: Exploring the TPACK framework. Computers & Education, 55(4), 1656-1662.

Banister, S., & Vannatta-Reinhart, R. (2013). Assessing NETS-T Performance in Teacher Candidates: Exploring the Wayfind Teacher Assessment. Journal of Digital Learning in Teacher Education, 29(2), 59-65.

Basham, J., Smeltzer, A., & Pianfetti, E. (2012). An integrated framework used to increase preservice teacher NETS-T ability. Journal of Technology and Teacher Education, 13(2), 257-276.

Bate, F. G., Day, L., & Macnish, J. (2013). Conceptualising Changes to Pre-Service Teachers’ Knowledge of how to Best Facilitate Learning in Mathematics: A TPACK Inspired Initiative. Australian Journal of Teacher Education, 38(5), 2.

Brantley-Dias, L., & Ertmer, P. A. (2013). Goldilocks and TPACK: Is the construct "just right"? Journal of Research on Technology in Education, 46(2).

Bull, P., & Cisse, D. (2011, March). TPACK Model Integration: Preparing Preservice Teachers to Teach with Technology. In Society for Information Technology & Teacher Education International Conference (Vol. 2011, No. 1, pp. 4291-4296).

Chai, C. S., Koh, J. H. L., & Tsai, C. C. (2013). A Review of Technological Pedagogical Content Knowledge. Educational Technology & Society, 16(2), 31-51.

Chai, C. S., Koh, J. H. L., Tsai, C. C., & Tan, L. L. W. (2011). Modeling primary school pre-service teachers’ Technological Pedagogical Content Knowledge (TPACK) for meaningful learning with information and communication technology (ICT). Computers & Education, 57(1), 1184-1193.

Darling-Hammond, L. (2012). Powerful teacher education: Lessons from exemplary programs. Hoboken, NJ: John Wiley & Sons.

DeSantis, J. (2015). Technology-oriented field experience: Readying pre-service teachers to use emerging tools. The Field Experience Journal, 15(1), 2-20.

Doering, A., Veletsianos, G., Scharber, C., & Miller, C. (2009). Using the technological, pedagogical, and content knowledge framework to design online learning environments and professional development. Journal of Educational Computing Research, 41(3), 319-346.

Graham, R. C., Burgoyne, N., Cantrell, P., Smith, L., St Clair, L., & Harris, R. (2009). Measuring the TPACK confidence of inservice science teachers. TechTrends, 53(5), 70-79.

Graham, C. R. (2011). Theoretical considerations for understanding technological pedagogical content knowledge (TPACK). Computers & Education, 57(3), 1953-1960.

Habowski, T. & Mouza, C. (2014). Pre-service teachers’ development of technological pedagogical content knowledge (TPACK) in the context of a secondary science teacher education program. Journal of Technology and Teacher Education, 22(4), 471-495. Chesapeake, VA: Society for Information Technology & Teacher Education.

Harris, J. B., & Hofer, M. J. (2011). Technological pedagogical content knowledge (TPACK) in action: A descriptive study of secondary teachers' curriculum-based, technology-related instructional planning. Journal of Research on Technology in Education, 43(3), 211.

Hofer, M., Grandgenett, N., Harris, J., & Swan, K. (2011, March). Testing a TPACK-Based Technology Integration Observation Instrument. In Society for Information Technology & Teacher Education International Conference (Vol. 2011, No. 1, pp. 4352-4359).

International Society for Technology in Education. (2014). The ISTE Standards for Teachers. Retrieved from http://www.iste.org/docs/pdfs/20-14_ISTE_Standards-T_PDF.pdf

Jang, S. J., & Chen, K. C. (2010). From PCK to TPACK: Developing a transformative model for pre-service science teachers. Journal of Science Education and Technology, 19(6), 553-564.

Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 K-12 Edition. Austin, Texas: The New Media Consortium.

Kleiner, B., Thomas, N., and Lewis, L. (2007). Educational Technology in Teacher Education Programs for Initial Licensure (NCES 2008–040). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.

Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The Technological Pedagogical Content Knowledge Framework. In Handbook of Research on Educational Communications and Technology (pp. 101-111). Springer New York.

Koehler, M. J., Shin, T. S., & Mishra, P. (2011). How do we measure TPACK? Let me count the ways. Educational technology, teacher knowledge, and classroom impact: A research handbook on frameworks and approaches, 16-31.

Koh, J. H., & Divaharan, H. (2011). Developing pre-service teachers' technology integration expertise through the TPACK-developing instructional model. Journal of Educational Computing Research, 44(1), 35-58.

Lambert, J., & Gong, Y. (2010). 21st century paradigms for pre-service teacher technology preparation. Computers in the Schools, 27(1), 54-70.

Learning.com. (2013). Wayfind Teacher Assessment Overview. Retrieved January 13th, 2014, from http://www.learning.com/docs/wfta/WayFind-Teacher-Assessment-Overview.pdf

Ling Koh, J. H., Chai, C. S., & Tay, L. Y. (2014). TPACK-in-Action: Unpacking the Contextual Influences of Teachers’ Construction of Technological Pedagogical Content Knowledge (TPACK). Computers & Education, 78, 20-29.

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. The Teachers College Record, 108(6), 1017-1054.

Mouza, C., Karchmer-Klein, R., Nandakumar, R., Yilmaz Ozden, S., & Hu, L. (2014). Investigating the impact of an integrated approach to the development of preservice teachers' technological pedagogical content knowledge (TPACK). Computers & Education, 71, 206-221.

Ottenbreit-Leftwich, A. T., Brush, T. A., Strycker, J., Gronseth, S., Roman, T., Abaci, S., van Leusen, P., Shin, S., Easterling, W., & Plucker, J. (2012). Preparation versus practice: How do teacher education programs and practicing teachers align in their use of technology to support teaching and learning? Computers & Education, 59(2), 399-411.

Sang, G., Valcke, M., Braak, J. V., & Tondeur, J. (2010). Student teachers’ thinking processes and ICT integration: Predictors of prospective teaching behaviors with educational technology. Computers & Education, 54(1), 103-112.

Schmidt, D. A., Baran, E., Thompson A. D., Koehler, M. J., Mishra, P. & Shin, T. (2010). Technological Pedagogical Content Knowledge (TPACK): The Development and Validation of an Assessment Instrument for Preservice Teachers. Journal of Research on Technology in Education, 42(2), 123-149.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.

Tai, S. J. D., & Crawford, D. (2014, March). The impact of field experience in technology-integrated classrooms on preservice teachers’ development of TPACK. In Society for Information Technology & Teacher Education International Conference (Vol. 2014, No. 1, pp. 2665-2668).

Tondeur, J., van Braak, J., Sang, G., Voogt, J., Fisser, P., & Ottenbreit-Leftwich, A. (2012). Preparing pre-service teachers to integrate technology in education: A synthesis of qualitative evidence. Computers & Education, 59(1), 134-144.

Willis, J. (2012). Adapting the 2008 NETS-T Standards for use in teacher education: Part II. International Journal of Technology in Teaching and Learning, 8(2), 78-97.