
To strengthen and further inform our own pedagogy and practice, we sought to collaborate across institutions. This collaboration offered additional opportunities to examine our understanding, pedagogy, and practice through intentional inquiry and critical review. In addition, we created the Targeted Field Observation Guide (Figure 1) as a means of more directly connecting our preservice teachers’ pre-student teaching field placements with the content of our respective courses, in response to Whitford and Villaume’s (2014) and Zeichner and McDonald’s (2011) findings related to the disconnect between preservice teachers’ field placements and their university coursework.

Figure 1. The Targeted Field Observation Guide.

Specifically, when framing preservice teachers’ field placement-based experiences and learning, teacher educators struggle to help their preservice teachers make clear, explicit connections between theory, most often embedded in coursework, and practice, usually connected to field placement experiences. Additionally, research on teaching and learning, specifically research in P–12 contexts and teacher education coursework, further informed our work (e.g., Darling-Hammond, 2006; Whitford & Villaume, 2014; Zeichner & McDonald, 2011). These scholars’ calls for more explicit, ongoing connections between field placements and teacher education coursework further compelled us to be intentional about creating space and time in our courses for preservice teachers to reflect on their field-based observations, with the aim of more readily integrating their field experiences with the content of the courses we taught.

Computer-mediated communication

Prior to the start of the required field placement experience, we each introduced the Targeted Field Observation Guide (see Figure 1) to our respective preservice teachers, explaining its purpose and our desire to use the guide to further support their abilities to observe, notice, and reflect on their required field-based experiences as well as to make connections between these experiences and what they learned in our classes. To capture preservice teachers’ observations and facilitate their reflections, we introduced our preservice teachers to computer-mediated communication through the use of Google Documents (Docs). In both courses, preservice teachers were required to use instructor-generated, multi-user Google Docs to capture their observations and reflections. Much like an online discussion board, each week we provided prompts connected to the guide, and, as part of their coursework, preservice teachers were expected to respond to these prompts as well as to their group members’ responses. Tara used random assignment to place preservice teachers into small Google Docs groups; each DU group contained five preservice teachers. Lynn selectively assigned preservice teachers to one of five small groups so that each group reflected a variety of disciplines; each WC Google Doc group contained three preservice teachers.
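The two assignment strategies can be expressed compactly in code. The Python sketch below illustrates the logic under stated assumptions: random assignment into fixed-size groups (the DU approach) and round-robin, discipline-balanced assignment into a fixed number of groups (the WC approach). All roster names and discipline labels are hypothetical placeholders rather than our actual data.

```python
import random
from itertools import cycle

def random_groups(roster, group_size):
    """Shuffle a roster and split it into consecutive groups of a fixed size (DU approach)."""
    shuffled = random.sample(roster, len(roster))
    return [shuffled[i:i + group_size] for i in range(0, len(shuffled), group_size)]

def discipline_balanced_groups(roster_by_discipline, n_groups):
    """Deal students across a fixed number of groups, round-robin,
    so each group draws from a variety of disciplines (WC approach)."""
    groups = [[] for _ in range(n_groups)]
    slots = cycle(range(n_groups))
    for discipline, names in roster_by_discipline.items():
        for name in names:
            groups[next(slots)].append((discipline, name))
    return groups

# Hypothetical rosters; names, counts, and disciplines are placeholders only.
du_roster = [f"DU_student_{i:02d}" for i in range(1, 21)]
wc_roster = {
    "math": ["WC_01", "WC_02", "WC_03"],
    "english": ["WC_04", "WC_05", "WC_06"],
    "science": ["WC_07", "WC_08", "WC_09"],
    "history": ["WC_10", "WC_11", "WC_12"],
    "art": ["WC_13", "WC_14", "WC_15"],
}

print(random_groups(du_roster, group_size=5))             # four groups of five (DU)
print(discipline_balanced_groups(wc_roster, n_groups=5))  # five groups of three (WC)
```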

To organize, share, and reflect on their field-based observations and experiences, we instructed participants to choose and apply one target from the Targeted Field Observation Guide each week. When possible, these targets specifically aligned with assigned readings or course-related topics. During some weeks, we directed preservice teachers to select a target they had not yet utilized. In other instances, to further facilitate the preservice teachers’ collective discussion and reflection, we selected a single target that every preservice teacher used to frame their observations during the upcoming week’s field placement experience(s), which supported small- and whole-group conversations focused on a specific target.

For clarity, when we first introduced the guide to our preservice teachers, we encouraged them to view each target as a “hat” they could metaphorically “wear” during their field placement experiences, in the hope that this metaphor might help them home in on a particular target to observe. The language in some of the participants’ posts and peer comments reflects this use of metaphor. However, in our ongoing reflections throughout the study and in our follow-up analyses, we realized that the “hat” metaphor was not as useful in encouraging our preservice teachers to frame their observations. Thus, in revisions to the guide based on findings from this study, we changed the term “hat” to “target” to more clearly align with the overall purposes of the guide.

As noted previously, we utilized Google Doc groups to give participants access to authentic audiences of peers with whom to share their observations and reflections. At the beginning of each class period, participants used a smart device (e.g., computer, smartphone, tablet) to read their group members’ posts. During this reading exercise, they also added responses, ideas, and thinking within their assigned small-group Google Docs. Each preservice teacher chose a separate color and used it to post their weekly reflections and to comment on group members’ reflections (see Figure 2).

Figure 2. Threaded discussion excerpt from Delta University (DU) Group 1. (Note: In its original format, students’ responses were in different colors. Responses have been bolded here to distinguish between participants’ contributions.)

It is important to note that while we initially classified participants by gender and group number (see Figure 2), gender was not an aspect explored in this study. After reading and responding to group members’ posts, participants often engaged in short, instructor-facilitated, whole- and small-group dialogues during class. As course instructors, we had access to our respective preservice teachers’ posts and comments, although preservice teachers only had access to their assigned group’s Google Docs. Each week, Tara read through every group’s weekly dialogue and inserted comments and questions as a means of providing further guidance and feedback. Lynn read through WC preservice teachers’ weekly posts but intentionally chose not to add comments or questions, in order to further empower preservice teachers to engage with their peers’ observations.

Analysis and Review

Data analyses included content analysis and clustering (Miles et al., 2014) as well as constant comparative methodology (Corbin & Strauss, 2008), through which we examined preservice teachers’ Google Doc responses. Throughout this process, we specifically sought to identify how the guide was implemented in our respective courses to frame preservice teachers’ field-based observations and reflections. We also worked to understand the ways in which our preservice teachers’ uses of the guide enabled them, if at all, to connect and align course content with their embedded pre-student teaching field experiences. Throughout data analyses, we engaged in multiple peer debriefs and member checks to triangulate and inform our analyses and findings (Lincoln & Guba, 1985).

As our preservice teachers’ Google Doc posts were central to their interactions and work with the Targeted Field Observation Guide, we engaged in a multileveled review of those responses. In the first round of review, we read through and examined all preservice teachers’ posts individually within their small groups. During this initial round of data analysis, we used the five targets as codes, checking for evidence of participants’ references to and uses of the Targeted Field Observation Guide and identifying each instance in which a preservice teacher used or referenced one of its targets. Doing so also allowed us to identify the total number of times each target was referenced across both courses (see Table 1).
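The bookkeeping in this first round amounts to a frequency count of target references. The Python sketch below illustrates that tallying logic with hypothetical coded posts; the target names are placeholders, since the guide’s actual targets appear in Figure 1 rather than in the text.

```python
from collections import Counter

# Placeholder names for the guide's five targets (the actual targets
# appear in Figure 1 and are not reproduced here).
TARGETS = {f"target_{i}" for i in range(1, 6)}

def tally_target_references(posts):
    """Count how often each target is referenced across coded posts.
    Each post is a dict whose 'targets' list records the first-round codes."""
    counts = Counter()
    for post in posts:
        counts.update(t for t in post["targets"] if t in TARGETS)
    return counts

# Hypothetical coded posts from both courses, for illustration only.
posts = [
    {"course": "DU", "targets": ["target_1", "target_3"]},
    {"course": "WC", "targets": ["target_1"]},
    {"course": "DU", "targets": ["target_5"]},
]

print(tally_target_references(posts))
# Counter({'target_1': 2, 'target_3': 1, 'target_5': 1})
```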

Then, in the second round of analysis, we separately re-read and analyzed participants’ posts within the context of their small groups, noting the ways in which participants used the same targets to share their observations and reflections about their weekly placement experiences and, in some instances, responded to their peers’ uses of specific targets. During this round, we sought to identify similarities and differences in the ways preservice teachers utilized each target in their weekly reflections.

During a third round of review, in addition to maintaining the coding completed previously using the targets, we compared the similarities and differences we had noted. During this process, we drew on constant comparative methodology (Corbin & Strauss, 2008) as we noted similarities and differences between participants’ uses of the targets. Comparing our findings, we noted, more broadly, two types of posts: “report-based” posts, which reflected participants’ ability to provide direct observations of their field experiences, and “analysis-based” posts, which revealed participants’ capacity to go beyond direct observations to include elements of analysis and, in some cases, extended connections and thinking based on what they observed. We then utilized clustering (Miles et al., 2014) to attend to and identify the ways participants’ reflections referencing one of the five targets revealed evidence of these two types of posts.

Using these two categories, we then conducted a fourth round of analysis, in which we categorized all responses containing a reference to a target across each group’s Google Doc as either “report-based” or “analysis-based.” In the fifth, and final, round of review, we compared DU and WC participants’ responses, looking for any additional similarities or differences between participants’ “report-based” and “analysis-based” posts. During content analysis (Miles et al., 2014), we also examined and coded for evidence of participants’ contributions to their small group’s Google Docs that demonstrated connections between their field placement experience(s) and their coursework (see Table 2).
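Mechanically, the fourth- and fifth-round comparisons reduce to a cross-tabulation of post type by course. The Python sketch below shows this with hypothetical categorized posts; the counts are illustrative only and do not reflect our data.

```python
from collections import Counter

def crosstab_by_course(posts):
    """Cross-tabulate post type ('report' vs. 'analysis') by course,
    mirroring the fifth-round DU/WC comparison."""
    table = Counter((post["course"], post["type"]) for post in posts)
    for course in ("DU", "WC"):
        row = {kind: table[(course, kind)] for kind in ("report", "analysis")}
        print(course, row)

# Hypothetical fourth-round categorizations, for illustration only.
posts = [
    {"course": "DU", "type": "report"},
    {"course": "DU", "type": "analysis"},
    {"course": "DU", "type": "report"},
    {"course": "WC", "type": "analysis"},
    {"course": "WC", "type": "report"},
]

crosstab_by_course(posts)
# DU {'report': 2, 'analysis': 1}
# WC {'report': 1, 'analysis': 1}
```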

To ensure coding reliability during this multileveled review and analysis process, we engaged additional reviewers at two separate points during data analysis to maintain intercoder agreement. Specifically, we conducted multiple rounds of interrater reliability checks, which included individual and collaborative analysis and check-ins to ensure accuracy and agreement between reviewers (Lombard, Snyder-Duch, & Campanella Bracken, 2002; Miles et al., 2014).
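Intercoder agreement can be quantified with several indices (Lombard et al., 2002). As one illustration, not a report of the specific index we used, the Python sketch below computes simple percent agreement and Cohen’s kappa, a chance-corrected index, for two coders’ hypothetical “report-based” (R) versus “analysis-based” (A) labels.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of units the two coders labeled identically."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(
        (counts_a[label] / n) * (counts_b[label] / n)
        for label in set(coder_a) | set(coder_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical labels from two coders: "R" = report-based, "A" = analysis-based.
coder_a = ["R", "A", "R", "R", "A", "A", "R", "A"]
coder_b = ["R", "A", "R", "A", "A", "A", "R", "R"]

print(percent_agreement(coder_a, coder_b))  # 0.75
print(cohens_kappa(coder_a, coder_b))       # 0.5
```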

References

Corbin, J., & Strauss, A. (2008). Basics of qualitative research: Techniques and procedures for developing grounded theory (3rd ed.). Thousand Oaks, CA: Sage Publications.

Darling-Hammond, L. (2006). Powerful teacher education: Lessons from exemplary programs. San Francisco, CA: Jossey-Bass.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage Publications.

Lombard, M., Snyder-Duch, J., & Campanella Bracken, C. (2002). Content analysis in mass communication: Assessment and reporting of intercoder reliability. Human Communication Research, 28(4), 587–604. doi:10.1111/j.1468-2958.2002.tb00826.x

Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). Los Angeles, CA: Sage Publications.

Whitford, B. L., & Villaume, S. K. (2014). Clinical teacher preparation: A retrospective. Peabody Journal of Education, 89(4), 423–435. doi:10.1080/0161956X.2014.938590

Zeichner, K. M., & McDonald, M. (2011). Practice-based teaching and community field experiences for prospective teachers. In A. Cohan & A. Honigsfeld (Eds.), Breaking the mold of preservice and inservice teacher education: Innovative and successful practices for the twenty-first century (pp. 45–54). Lanham, MD: Rowman & Littlefield Education.
