Last spring, I began publishing posts about classrooms in which I observed lessons (see here and here). These posts were part of a larger research project on technology integration (see here).
Two questions have guided the case study design of the project.
In this and subsequent posts, I will detail the methodology I use, explain what I mean by technology integration, and describe models commonly used to determine its extent in schools.
The following posts are drafts that I will revise as I visit more teachers and schools this fall. I welcome comments from readers who wish to take issue, suggest revisions, or recommend changes.
How I Got Started
In fall 2015, I wrote to district superintendents and heads of charter management organizations explaining why I was writing about instances of technology integration in their schools. At no point did these administrators ask me to define “technology integration” or even ask about the phrase; all seemed to know what I meant. In nearly all instances, the superintendent, school site administrator, technology coordinator, and CMO head invited me into the district. Administrators supplied me with lists of principals and teachers to contact. Again, neither my contacts nor I defined the phrase “technology integration” in conversations. They already had a sense of what the phrase meant.
I contacted individual teachers, explained how I got their names and what I was doing, and asked for their participation. More than half agreed. Because of health issues, I did not start the project until January 2016. For four months I visited schools and classrooms, observed lessons, and interviewed staff. I resumed observations this fall and hope to complete all observations by December 2016.
I interviewed teachers before and after the lessons I observed in their classrooms. During each observation, I took notes every few minutes, using a protocol that describes class activities in one place while recording my comments on what teacher and students were doing in another. I had used this observation protocol in previous studies. The point of the description and commentary was to capture what happened in the classroom, not to determine the degree of teacher effectiveness. I avoided evaluative judgments about the worth of the lesson or teacher activities.
The major advantage of this approach is being in the room, picking up verbal and non-verbal asides every few minutes, and noting classroom conditions that often go unnoticed. As an experienced teacher familiar with the history of schooling and the common moves that occur in lessons, I can also assess the relationship between teacher and students that observers using different protocols or videos may miss or exclude. Teachers know that I will not judge their performance.
The major disadvantage of this way of observing lessons is the subjectivity and biases I bring to documenting lessons. So I work hard at separating what I see from what I interpret. I document classroom conditions from student and teacher desk arrangements through what is on bulletin boards, photos and pictures on walls, and whiteboards and which, if any, electronic devices are available in the room. I describe, without judging, teacher and student activities and behaviors. But biases, as in other approaches researching classroom life, remain.
After observing classes, I sit down with teachers for interviews of 30 to 45 minutes at times convenient to them. After jotting down their history in the district, the school, and elsewhere, I turn to the lessons and ask what the teachers' goals were and whether they believe those goals were reached. Then I ask about the different activities I observed during the lesson. One key question is whether the lesson I observed was representative of how the teacher usually teaches.
In answering these questions, teachers gave me reasons they did (or did not do) something in lessons. In most instances, individual teachers told me why they did what they did, thus communicating a map of their beliefs and assumptions about teaching, learning, and the content they teach. In all the give-and-take of these discussions, I made no judgment about the success or failure of different activities or the lesson itself.
I then drafted a description of the lesson and sent it to the teacher to correct any factual errors I made in describing the lesson. The teacher returned the draft with corrections.[i]
To provide context for the classrooms I observed, I collected documents and used school and teacher websites to describe what occurred within each school and district in integrating devices and software into teachers’ daily lessons.
All of these sources intersected and overlapped, permitting me to assess the degree to which technology integration occurred. Defining the concept of "technology integration," however, was elusive and required much work. Even though the phrase triggered nods from teachers and administrators when I used it, as if we all shared the same meaning, I still had to come up with a working definition that would permit me to capture more precisely what I saw in classrooms, schools, and districts.
______________________________
[i] The protocol is straightforward and subjective. I write out in longhand or type on my laptop what teachers and students do during the lesson. Each sheet of paper or laptop screen is divided into a wide column and a narrow column. In the wide column I record every few minutes what the teacher is doing, what students are doing, and teacher-directed segues from one activity to another. In the narrow column, I comment on what I see.
Subsequent posts will deal with defining technology integration, common models describing its stages, and determining success of technology integration.