Posted December 1, 2013
National average scores of students on the 2012 Program for International Student Assessment (PISA) will be released Tuesday, and we urge commentators and education policymakers to avoid jumping to quick conclusions from a superficial “horse race” examination of these scores.
Typically, the U.S. Department of Education (ED) is given an advance look at test score data by the Organization for Economic Cooperation and Development (OECD) and issues press releases with conclusions based on its preliminary review of the results. The OECD itself also provides a publicized interpretation of the results. This year, ED and the OECD are planning a highly orchestrated event, “PISA Day,” to manipulate coverage of this release.
It is usual practice for research organizations (and in some cases, the government) to provide advance copies of their reports to objective journalists. That way, journalists have an opportunity to review the data and can write about them in a more informed fashion. Sometimes, journalists are permitted to share this embargoed information with diverse experts who can help them understand possible alternative interpretations.
In this case, however, the OECD and ED have instead given their PISA report to selected advocacy groups that can be counted on, for the most part, to echo official interpretations and participate as a chorus in the official release.[1] These are groups whose interpretation of the data has typically been aligned with that of the OECD and ED: that American schools are in decline and that international test scores portend an economic disaster for the United States, unless the school reform programs favored by the administration are followed.
The Department’s co-optation of these organizations in its official release is an attempt not to inform public opinion but to manipulate it. Those with different interpretations of international test scores will see the reports only after the headlines have become history.
Such manipulation in the release of official government data would never be tolerated in fields where official data are taken seriously. Can you imagine the Census Bureau providing its poverty data in advance only to advocacy groups that supported the administration, and then releasing its report to the public at an event at which these advocacy groups were given slots on a program to praise the administration’s anti-poverty efforts? What if the Bureau of Labor Statistics gave its monthly unemployment report in advance to Democrats, but not to Republicans, and then invited Democratic congressional leaders to participate in the official release?
In actuality, international data are complex, and even a day or two’s advance look at a summary report would be insufficient to make an intelligent evaluation. It takes many months for careful scholars to analyze the data. Sometimes, this analysis requires examination of more detailed data, including disaggregated scores by social class, gender or race. These are eventually available on the testing organization’s website, but often considerably after the initial public release of a government summary report. Careful analyses of these detailed data can often undermine early assertions.
In January, we published an analysis based on such detailed examination of the previous round of PISA data, from assessments administered in 2009 (What Do International Tests Really Show about U.S. Student Performance?). Our analysis showed that conventional interpretations of these scores can be glib and misleading. Our chief conclusion was that an accurate interpretation of these scores cannot easily be reduced to the kinds of sound bites favored by many commentators and education policymakers.
We have prepared a PowerPoint summary (available for download here) of our conclusions from that publication, based on the most recent test results available for study—not only the 2009 PISA, but also the 2011 Trends in International Mathematics and Science Study (TIMSS), and the domestic 2011 National Assessment of Educational Progress (NAEP). We plan to conduct careful analyses of the latest round of tests, including new state-level results, early in 2014, once detailed data are made available to researchers. Between the 2009 and 2012 PISA administrations, students everywhere experienced effects from the worldwide recession. We will be particularly interested in examining whether this may have affected performance, and if so, whether this effect varied among countries.
In our report last January, we identified several reasons for the complexity of international comparisons and came to some key conclusions:
- There is a test score gap between socioeconomically advantaged and disadvantaged students in every country. Although the size of the gap varies somewhat from country to country, countries’ gaps are more similar to each other than they are different.
- Countries’ average scores are affected by the relative numbers of advantaged and disadvantaged students in their schools. The United States has relatively more disadvantaged students than the usual comparison countries. If average scores were adjusted so that each country had a similar social class composition, U.S. scores would appear to be higher than conventionally reported and the gap with top-scoring countries, while still present, would be smaller. Adjusting for differences in countries’ social class composition can also change their relative rankings (a simple illustration of this adjustment follows the list).
- Trends in test scores over time vary more by social class in some countries than in others. In the United States, there have been striking gains over the last decade for disadvantaged students, but not for the more advantaged.
- In some countries, including the United States, students perform relatively better on some international tests than on others, even though the tests purport to assess the same subject area. This may be because of flaws in one test or another, or because some tests are better aligned with a particular country’s curriculum than others.
- Some countries to which the United States is frequently unfavorably compared currently have higher test scores, but their test scores have been falling over time, while scores in the United States have not been similarly falling. It is not apparent to what extent U.S. policymakers should attempt to learn from the experience of countries with high scores, or from the experience of countries with rising scores.
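To make the composition adjustment concrete, here is a minimal sketch of the underlying arithmetic. The country names, subgroup scores, and social class shares below are entirely hypothetical, and this is not the method or data from our report: the idea is simply that each country’s subgroup averages are reweighted to a common reference composition before countries are compared. With these made-up numbers, the country with more disadvantaged students ranks lower on the unadjusted average but higher once both countries are placed on the same composition.

```python
# Hypothetical illustration of composition adjustment (reweighting).
# All scores and shares below are invented for the example, not PISA data.

# Hypothetical mean scores by social class group (lowest to highest)
scores = {
    "Country A": [440, 470, 500, 530, 560],
    "Country B": [430, 465, 495, 525, 555],
}

# Hypothetical share of test-takers in each group within each country
shares = {
    "Country A": [0.35, 0.25, 0.20, 0.12, 0.08],  # more disadvantaged students
    "Country B": [0.15, 0.20, 0.20, 0.22, 0.23],  # more advantaged students
}

# Common reference composition used for the adjustment (equal-sized groups)
reference = [0.20] * 5

def weighted_mean(values, weights):
    """Average the group scores using the given group weights."""
    return sum(v * w for v, w in zip(values, weights))

for country in scores:
    unadjusted = weighted_mean(scores[country], shares[country])  # actual composition
    adjusted = weighted_mean(scores[country], reference)          # common composition
    print(f"{country}: unadjusted {unadjusted:.1f}, adjusted {adjusted:.1f}")

# With these invented numbers, Country A trails Country B before adjustment
# (roughly 480 vs. 500) but leads after it (500 vs. 494): the ranking reverses.
```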
Our report further challenges the conventional view that U.S. performance on an international test has serious consequences for the nation’s future prosperity. Certainly, a country needs a sufficient number of highly educated workers. However, there is little reason to believe that the United States is not now at that sufficiency level, or that continued growth in educational credentials is necessary to ensure we remain there.
Advocates participating in Tuesday’s staged PISA Day release include several who, a quarter century ago, warned that America’s inadequate education system and workforce skills imperiled our competitiveness and future. Their warnings were followed by a substantial acceleration of American productivity growth in the mid-1990s, and by an American economy whose growth rate surpassed the growth rates of countries that were alleged to have better prepared and more highly skilled workers.
Today, threats to the nation’s future prosperity come much less from flaws in our education system than from insufficiently stimulative fiscal policies that tolerate excessive unemployment, wasting much of the education our young people have acquired; an outdated infrastructure; regulatory and tax policies that reward speculation more than productivity; an over-extended military; declining public investment in research and innovation; a wasteful and inefficient health care system; and the fact that typical workers and their families, no matter how well educated, do not share in the fruits of productivity growth as they once did. The best education system we can imagine can’t succeed if we ignore these other problems.
We don’t plan to comment on Tuesday’s release, except to caution that any conclusions drawn quickly from such complex data should not be relied upon. We urge commentators to await our own and other careful analyses of the new PISA results before accepting the headline-generating assertions made by government officials and their allies upon the release of the national summary report.
Endnotes
1. The organizations that have been provided with advance copies of this government report, and that are participating in the public release, are: the Alliance for Excellent Education, Achieve, ACT, America Achieves, the Asia Society, the Business Roundtable, the Council of Chief State School Officers, the College Board, the National Board for Professional Teaching Standards, and the National Center on Education and the Economy. These organizations and their leaders have a history of bemoaning Americans’ performance on international tests and of predicting that tragic consequences for the nation will follow.