Places to Go: PISA

Stephen Downes

Apr 01, 2008

This Journal Column was published as Places to Go: PISA in Innovate, Volume 4, Number 4 (April 1, 2008), by the Fischler School of Education and Human Services at Nova Southeastern University.

Sponsored by the Organisation for Economic Co-operation and Development (OECD), the Programme for International Student Assessment (PISA) is a set of tests administered every three years to 15-year-olds around the world. These tests measure students' practical ability in basic skills such as reading, mathematics, and science. Nations are ranked on "league tables" summarizing the results. The front page of OECD's PISA Web site states the rationale behind the tests:

    Are students well prepared for future challenges? Can they analyse, reason and communicate effectively? Do they have the capacity to continue learning throughout life? The OECD Programme for International Student Assessment (PISA) answers these questions and more, through its surveys of 15-year-olds in the principal industrialised countries. (PISA 2007a, ¶1)

PISA has been in the news recently because the 2006 test results were released in December. Although the tests also assess mathematics and reading, this year's report focused on students' understanding of science. Two volumes are available as free PDF downloads; the first provides an analysis of the results, and the second offers the full data set produced by the 2006 study. An additional volume describing the framework for the assessment is also available.

The country rankings appear to be a very important part of the PISA methodology and are emphasized in the results. The highest-scoring countries, including Finland, Canada, Japan, and New Zealand, are identified. And the authors report that "1.3% of 15-year-olds reached Level 6 of the PISA 2006 science scale, the highest proficiency level. These students could consistently identify, explain and apply scientific knowledge, and knowledge about science, in a variety of complex life situations" (PISA 2007d, "Key Findings," ¶3).

The rankings were also emphasized in the press following the release of the report, accompanied by exultation in those countries that scored higher and almost anguished soul-searching in those that scored lower. In the wake of Britain's 17th-place finish in reading and 24th-place showing in mathematics, a ranking that made "Britain equal to Poland" (Grimston 2007, ¶15), the Times reported on a school system "battered by a succession of surveys that showed an alarming decline in national educational attainment" (Grimston 2007, ¶7).

In the United States, the reaction to disappointing results on the 2006 tests was that "something needs to be done now" (eSchool News 2007, ¶23). According to eSchool News, "The issue is not that U.S. students did so poorly on the exam; it's that other countries have made significant strides in the last few years" (¶6). In an Education Week report, one researcher attributes the results, at least in part, to teacher education: "Our future teachers are getting weak training mathematically and are just not prepared to teach the demanding mathematics curriculum we need for middle schools," according to William Schmidt (Manzo 2007, ¶5).

Results from previous years are also available on the PISA Web site, though not through the News & Events and News Archive sections, which offer only an odd selection of items, such as Denmark's reaction to the 2000 results. To see reports on previous years' results, click on "What PISA Produces" in the left-column menu of the home page, then select a year.

The list of publications for the 2003 study reveals a set of reports similar to that produced for the 2006 results, with a framework, data, and analysis all available. Some additional studies, such as Are Students Ready for a Technology-Rich World? What PISA Studies Tell Us, hint at what we may expect from PISA in the months to come. It is worth noting that the country tables from 2003, presented in the executive summary (PISA 2004, 8, 9, 33, 35, 36), look remarkably similar to those from 2006 (PISA 2007b, 20, 22, 47, 53). (Note that download speeds for these reports may be slow.) And again, the data from the 2000 tests show the same countries on top: Finland, Canada, Japan, and New Zealand, among others (PISA 2001, 53, 79, 88). The United States, in 2000 as well as in 2003 and in 2006, is in the middle of the pack.

The PISA tests raise two sets of questions: First, what is being measured by tests of this sort? And second, what can the results of these tests tell us about the design of our educational system?

The first set of questions is brought into focus through a comparison between PISA and similar tests conducted by other agencies. For example, in an overview of four international assessments, which included PISA and the Trends in International Mathematics and Science Study (TIMSS) along with two international assessments of reading and literacy skills, the Center for Public Education noted that "both PISA and TIMSS test mathematics, but PISA measures students' ability to apply mathematical knowledge and skills to real life experiences while TIMSS measures the mathematical skills students have obtained specifically from the school's curriculum" (Hull 2007, "What is being assessed?" ¶1).

The PISA Frequently Asked Questions (FAQ) page clarifies: "Rather than examine mastery of specific school curricula, PISA looks at students' ability to apply knowledge and skills in key subject areas and to analyse, reason and communicate effectively as they examine, interpret and solve problems" (PISA 2007c, "What makes PISA unique?" ¶4). But in deliberately diverging from curriculum, the PISA authors open themselves to criticism that the standards being examined are not relevant. As Prais (2003) writes of the 2000 mathematics tests, "Answering such questions correctly may be more a test of 'common sense,' or of 'IQ,' than the results of mathematical schooling at this age" (143).

This raises a similar concern about using the same evaluation criteria to compare school systems that may have very different objectives. Prais (2003) comments that the objectives of the German school system may be very different from those of the British system. And it may be premature to assess students' preparedness for life in school systems where a 15-year-old is only partway through his or her education, while applying the same framework to students in systems where the school-leaving age is earlier.

Finally, it is not clear that the measurement of a student's ability to complete a test is a relevant assessment of learning at all. Indeed, a comment from Tharman Shanmugaratnam, Singapore's education minister, suggests that these international assessments cannot measure the very attributes that some nations perceive as America's strengths:

    [America's] is a talent meritocracy; ours is an exam meritocracy. There are some parts of the intellect that we are not able to test well, like creativity, curiosity, a sense of adventure, ambition. Most of all, America has a culture of learning that challenges conventional wisdom, even if it means challenging authority. These are the areas where Singapore must learn from America. (Gardner 2008)

There is, as Shanmugaratnam points out, no test to measure innovation, ambition, or creativity, attributes which may be just as important to the success of a student, and of a nation.

Just as questions may be raised about the appropriateness of the testing, questions may also be raised about the solutions suggested by the results of PISA or any other test. This was brought into sharp focus after the 2003 study. In an analysis of the survey data, Woessmann and Fuchs (2004) suggest that the use of computers by students actually decreases learning. "This may reflect the fact that computers at home may actually distract students from learning," they write, "both because learning with the help of computers may not be the most efficient way of learning and because computers can be used for other aims than learning" (16). While this immediately cast doubt on the value of computers in learning (BBC News 2004), Woessmann and Fuchs's conclusion ought to be questioned: a correlation in cross-sectional survey data cannot, on its own, show that computer use causes lower achievement.

The 2006 PISA results have produced similar speculations about what ought to be done to fix the school system. The Economist, for example, moved almost immediately to dissociate spending from results. And its authors have very clear intuitions about what factors link nations with higher test scores:

    Letting schools run themselves seems to boost a country's position in this high-stakes international tournament: giving school principals the power to control budgets, set incentives and decide whom to hire and how much to pay them. Publishing school results helps, too. More important than either, though, are high-quality teachers: a common factor among all the best performers is that teachers are drawn from the top ranks of graduates. (2007, ¶7)

But is this the case? Certainly, such a strategy would serve the objective of separating the education agenda from the interference of "meddlesome governments" (The Economist 2007, ¶9). But it seems that the nations that do well on the tests year after year (Finland, Japan, Singapore, Canada) are characterized precisely by large governmental investment in, and interest in, the educational system. And there seems to be no evidence to suggest that school autonomy, deunionization, and testing programs result in improvements in test outcomes.

This points to what may be the most substantial difficulty inherent in using test results to inform educational policy decisions: The decisions are underdetermined by the data. It is likely that the high scores from the successful nations are the result of numerous factors, some of which may be unique to the nation in question. Could it not be argued, for example, that the strong Canadian showing is the result of the constructivist teaching practices in Quebec (Lenoir 2006), and not any of the factors mentioned by The Economist at all?

References

BBC News. 2004. Doubts about school computer use. BBC News, November 24. http://news.bbc.co.uk/2/hi/uk_news/education/4032737.stm (accessed February 26, 2008).

The Economist. 2007. The race is not always to the richest. December 6. http://www.economist.com/world/international/displayStory.cfm?story_id=10251324&fsrc=nwlgafree (accessed February 26, 2008).

eSchool News. 2007. ". . . Something needs to be done now": Poor showing on international exam prompts calls for better science instruction. eSchool News, December 7. http://www.eschoolnews.com/news/top-news/news-by-subject/international/?i=50970;_hbguid=82c500e1-7421-4c66-b259-472863966744 (accessed February 26, 2008).

Gardner, W. 2008. International school rankings don't reflect many factors. Commentary. Ithaca Journal, January 2.

Grimston, J. 2007. The three Rs: Really rotten results? TimesOnline, December 9. http://www.timesonline.co.uk/tol/news/uk/education/article3022579.ece (accessed February 26, 2008).

Hull, J. 2007. More than a horse race: A guide to international tests of student achievement. Center for Public Education. http://www.centerforpubliceducation.org/site/c.kjJXJ5MPIwE/b.2422943/k.3608/More_than_a_horse_race_A_guide_to_international_tests_of_student_achievement.htm (accessed February 26, 2008).

Lenoir, Y. 2006. Practices of disciplinarity and interdisciplinarity in Quebec elementary schools: Results of twenty years of research. Journal of Social Science Education 2:19-36. http://www.jsse.org/2006-2/lenoir_practices_engl.htm (accessed February 26, 2008).

Manzo, K. 2007. U.S. middle-grades teachers found ill-prepared in math. Education Week, December 14. http://www.edweek.org/ew/articles/2007/12/19/16middle.h27.html (accessed February 26, 2008). [Editor's note: Access to this article requires a paid subscription.]

PISA. 2007a. Home page. http://www.pisa.oecd.org/pages/0,2987,en_32252351_32235731_1_1_1_1_1,00.html (accessed February 26, 2008).

PISA. 2007b. Executive summary: Science competencies for tomorrow's world. http://www.pisa.oecd.org/dataoecd/15/13/39725224.pdf (accessed February 26, 2008).

PISA. 2007c. FAQ: OECD PISA. http://www.pisa.oecd.org/document/53/0,3343,en_32252351_32235731_38262901_1_1_1_1,00.html (accessed February 26, 2008).

PISA. 2007d. PISA 2006 results. http://www.pisa.oecd.org/document/2/0,3343,en_32252351_32236191_39718850_1_1_1_1,00.html (accessed February 26, 2008).

PISA. 2004. First results from PISA 2003: Executive summary. http://www.pisa.oecd.org/dataoecd/1/63/34002454.pdf (accessed February 26, 2008).

PISA. 2001. Knowledge and skills for life: First results from the OECD programme for international student assessment (PISA) 2000. http://www.pisa.oecd.org/dataoecd/44/53/33691596.pdf (accessed February 26, 2008).

Prais, S. J. 2003. Cautions on OECD's recent educational survey (PISA). Oxford Review of Education 29 (2): 139-163. http://www.oecd.org/dataoecd/29/48/33680693.pdf (accessed February 26, 2008).

Woessmann, L., and T. Fuchs. 2004. What accounts for international differences in student performance? A re-examination using PISA data. IZA Discussion Paper No. 1287; CESifo Working Paper Series No. 1235. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=572802#PaperDownload (accessed February 26, 2008).


