Published Online: June 27, 2011
Education Week
Putting Virtual Assessments to the Test
With everyone from the nation’s top CEOs to President Obama stressing the importance of learning in science, technology, engineering, and mathematics, or STEM, to prepare students for a competitive 21st-century workforce, we need better measures of how well students are mastering those subjects. Science and other complex subjects are not served well by conventional testing; answering A, B, C, D, or “all of the above” doesn’t lend itself to measuring science proficiency, scientific thinking, or deeper knowledge and understanding.
While traditional paper-and-pencil testing gauges student knowledge of distinct facts or concepts, virtual performance assessments, or VPAs, allow students to actually use scientific inquiry and problem-solving through interactions with a virtual environment. In a VPA, students, represented by computer-generated icons, or avatars, make a series of choices. They tackle authentic science problems, investigate causal factors, and choose which experiments to conduct in a virtual lab. The assessment is no longer focused on a single right answer, but on the decisions students make and the knowledge they apply along the way. This approach yields a finer-grained measure of students’ understanding and a truer picture of what students know and don’t know about complex science content.
An exciting VPA model is being developed and tested by Jody Clarke-Midura and Christopher Dede at the Harvard Graduate School of Education, with funding from the federal Institute of Education Sciences. The goal of the VPA project is to provide all states with a new model of statewide assessment in the form of valid technology-based performance assessments linked to the National Science Education Standards for middle schoolers.
In the Dede and Clarke-Midura model, a student logs on to a computer, selects an avatar, and enters the virtual world of a science experiment. She’s given an aerial view of the space, in this case a farm with several ponds. The camera then focuses on a six-legged frog, which prompts the student to wonder what could be causing such a mutation. The assessment then begins with several farmers offering the student various hypotheses about the cause of the mutation. The student is told to select her own explanation and back it up with evidence. To do this, the student must consider a hypothesis, make decisions about the type of data that would support her claim, decide whether to consult prior research, collect evidence in a virtual backpack, examine data, modify her ideas as necessary, and, finally, make an evidence-supported claim.
Students with varying proficiency will, of course, take different approaches. The beauty of the VPA model is that it is well suited to such differing problem-solving strategies. A virtual performance assessment can gauge how well students reason from the available evidence. It can reveal how students gather evidence and whether they select data that are relevant or irrelevant to the hypothesis they are investigating, something a paper-and-pencil test cannot do. With VPAs, educators can track and analyze the trajectory of students’ thinking, step by step, and generate reports that turn those data into feedback for both teachers and students.
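To make that idea of tracking concrete, here is a minimal sketch, in Python, of how a student’s actions in a scenario like the frog investigation might be logged and summarized into a feedback report. It is purely illustrative: the class names, fields, and scoring are hypothetical and are not drawn from the Harvard project’s system.

```python
# Illustrative sketch only -- not the Harvard VPA implementation.
# It logs simple student actions and summarizes them for teacher feedback.
from dataclasses import dataclass, field

@dataclass
class Action:
    kind: str          # e.g. "collect_evidence", "consult_research", "make_claim"
    item: str          # what was collected or claimed
    relevant: bool     # whether the item bears on the hypothesis under investigation

@dataclass
class StudentSession:
    student_id: str
    hypothesis: str
    actions: list[Action] = field(default_factory=list)

    def report(self) -> dict:
        """Summarize the trajectory of the student's investigation."""
        collected = [a for a in self.actions if a.kind == "collect_evidence"]
        relevant = sum(a.relevant for a in collected)
        return {
            "student": self.student_id,
            "hypothesis": self.hypothesis,
            "evidence_collected": len(collected),
            "relevant_evidence": relevant,
            "irrelevant_evidence": len(collected) - relevant,
            "consulted_prior_research": any(a.kind == "consult_research" for a in self.actions),
        }

# Example: a student investigating pesticide runoff collects two items of evidence.
session = StudentSession("s-042", "pesticide runoff causes the mutation")
session.actions.append(Action("collect_evidence", "water sample from pond A", True))
session.actions.append(Action("collect_evidence", "photo of the barn roof", False))
session.actions.append(Action("make_claim", "pesticide runoff", True))
print(session.report())
```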
By 2014, the states that have adopted the common-core standards will be expected to implement new computer-based assessments. They’ll have to decide whether to go with simple, digitized versions of paper-and-pencil tests, or to embark on the far more complex world of VPAs. Virtual performance assessments cost more to develop, but they do not cost more to administer, and they offer a greater payoff.
VPAs provide a detailed record of student actions. Even essay questions on traditional tests can’t compare with the realistic context of VPAs for mimicking the steps required for legitimate scientific inquiry. And such assessments are largely immune to the practice of teaching test-taking strategies that can distort the results of multiple-choice assessments. If teachers “teach to the test” with a VPA, they will actually be providing relevant and rigorous instruction. Moreover, because VPAs can adjust the available evidence (and therefore the valid conclusions) of each scenario for different test administrations, strict test security becomes less of a concern.
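As a rough illustration of how the available evidence, and therefore the defensible conclusion, could vary from one administration to the next, consider the small sketch below. The scenario content, function, and parameter names are invented for illustration and do not describe the Harvard team’s assessment engine.

```python
# Illustrative sketch only: one way an assessment engine might vary which
# causal factor (and which evidence) appears in a given administration,
# so that memorizing one form of the test does not help on the next.
import random

CAUSES = {
    "pesticide runoff": ["water sample shows a contaminant", "spray schedule near the pond"],
    "parasite infection": ["parasite cysts found in tadpoles", "infected snails in the pond"],
    "genetic mutation": ["abnormal frogs only in one isolated pond", "trait runs in one family line"],
}

def build_scenario(seed: int) -> dict:
    """Pick the true cause and the mix of evidence available in this administration."""
    rng = random.Random(seed)
    true_cause = rng.choice(list(CAUSES))
    # Mix the supporting evidence with distractor evidence tied to the other causes.
    distractors = [e for cause, items in CAUSES.items() if cause != true_cause for e in items]
    available = CAUSES[true_cause] + rng.sample(distractors, 2)
    rng.shuffle(available)
    return {"true_cause": true_cause, "available_evidence": available}

# Two administrations with different seeds present different evidence and conclusions.
print(build_scenario(seed=2011))
print(build_scenario(seed=2014))
```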
A logical question is whether these tests are biased toward video and computer gamers. The Harvard researchers are testing that, too, and they note that prior research in virtual immersive environments showed no correlation between computer-gaming experience and performance in the curriculum. At scale, these virtual assessments are much more practical and cost-effective than hands-on performance assessments and are on a par with other forms of computer-based testing.
In “The Road Ahead for State Assessments,” a report released in May, the Rennie Center for Education Research and Policy and the group Policy Analysis for California Education urge state education leaders to give serious consideration to implementing VPAs, especially in science. They offer the following recommendations for practical, scalable implementation of such assessments as part of comprehensive state assessment systems. State education agencies should:
- Provide teachers and students with opportunities to try out virtual performance assessments so they become comfortable with the technology and it does not become a barrier to demonstrating knowledge;
- Provide teachers with professional development to foster instruction that will lead to high performance on these assessments;
- Similarly, provide opportunities for parents, school boards, and community members to try their own VPA investigation to alleviate fears that this new teaching and testing approach promotes playing video games in the classroom; and
- Support the infrastructure to do it right, ensuring that the devices and networks deployed can fully deliver the features that make VPAs stand out as a student-assessment tool.
There are a number of ways to integrate technology into the classroom to improve teaching and learning. Virtual performance assessment is one use of technology that could yield great results. VPAs are already changing the way assessments are executed in medicine and the military. Why not in education? If we’re serious about the importance of STEM learning and adequately preparing the next generation of students for real-world careers and decisions, leveraging technology to better assess students’ knowledge can help pave the way.