Christopher H. Tienken
Cross posted from AASA Journal of Scholarship and Practice
Find the entire article here: http://tinyurl.com/ly4ducr
Pundits and bureaucrats use the results from international tests, particularly the PISA, to make claims about the quality of the public education system in the United States and to justify policy recommendations. In this article I argue, with evidence, that the scores and rankings from PISA are not important and that they cannot give policy makers or educators meaningful insights into student preparedness for the global economy.
Key words: PISA, International Testing, International Comparisons
The U.S. Secretary of Education, Arne Duncan, warned that the U.S. public education system was in a state of stagnation following the December 3, 2013, release of the 2012 Programme for International Student Assessment (PISA) results. Duncan (2013) proclaimed:
“The PISA is an important, comparative snapshot of U.S. performance because the assessment is taken by 15-year-olds in high schools around the globe. The big picture of U.S. performance on the 2012 PISA is straightforward and stark: It is a picture of educational stagnation. That brutal truth, that urgent reality, must serve as a wake-up call against educational complacency and low expectations.”
How important are PISA results? I dispensed with the fraudulent claims by bureaucrats and pundits of educational stagnation in previous articles and books (Tienken, 2011; 2013a; 2013b; Tienken & Orlich, 2013). Hence, I do not allocate many words to the topic here. In this article I argue that the scores and rankings from PISA are not important and that they cannot give policy makers or educators meaningful insights into student preparedness for the global economy.
Importance of PISA
Why would the results from one test, even a so-called international test of academic achievement, be important to the largest economy on the planet and third most populous nation? According to bureaucrats like Arne Duncan (2013) and some education pundits (Hanushek & Woessmann, 2008), the rankings from PISA equate to or even predict national economic fortunes. It seems to some that the economic fate of nations hangs on PISA rankings. As Duncan (2013) exclaimed:
“In a knowledge-based, global economy, where education is more important than ever before, both to individual success and collective prosperity, our students are basically losing ground. We’re running in place, as other high-performing countries start to lap us.”
Duncan insinuates that the rankings from the PISA test provide important information about the quality of a country’s education system related to preparedness for the knowledge-based, global economy. In essence, according to the bureaucrats and pundits who use PISA results to make or suggest education policies, the PISA test rankings and scores (1) are a proxy for the overall education quality of a country, (2) quantify how prepared 15-year-olds are to compete in the global economy, and (3) predict future economic prosperity at the country level. But what does PISA say about PISA in terms of what the rankings and scores can and cannot tell about a nation’s education system or future economic success?
What PISA Says Regarding Its Ability to Judge Quality
I wrote previously about some comments by PISA researchers (see Tienken, 2013c) regarding the appropriate use of the results as a proxy for education quality, and I use and expand upon that work in this article. The Organisation for Economic Co-operation and Development (OECD, 2013, p. 265), the private entity that develops and vends the PISA, explains that policy makers should not use results either to indict or commend education systems. Furthermore, they should not use the results to make important policy decisions. In fact, the OECD authors explain that PISA results are due to a combination of variables, including but not limited to schooling, life experiences/home environment, poverty, access to early childhood programs, and health.
“If a country’s scale scores in reading, scientific or mathematical literacy are significantly higher than those in another country, it cannot automatically be inferred that the schools or particular parts of the education system in the first country are more effective than those in the second. However, one can legitimately conclude that the cumulative impact of learning experiences in the first country, starting in early childhood and up to the age of 15, and embracing experiences both in school, home and beyond, have resulted in higher outcomes in the literacy domains that PISA measures” (p. 265).
Additionally, the OECD authors (2013) reported that parents’ education level accounted for 23% of the 2012 mathematics score (p. 34).
Although bureaucrats and pundits like to dismiss poverty as just another excuse by educators for poor performance, the information in the PISA technical manuals suggests otherwise. Poverty explains up to 46% of the PISA mathematics score in OECD countries (OECD, 2013, pp. 35-36), the United States being one of those countries. The strong relationship between poverty and test results does not help the United States shine on the PISA. Remember that the United States has one of the highest childhood poverty rates of the major industrialized countries (OECD, 2009, p. 26). Approximately 22% of our public school children lived in poverty in 2012, compared to 15.6% in 2000 (Snyder, 2011, Table 27). In 2010, almost 48% of public school children qualified for either free or reduced lunch (Snyder, 2011, Table 45). The United States ranks 26th out of 29 industrial countries in overall well-being of children, just ahead of Lithuania, Latvia, and Romania, but behind countries like Estonia, Hungary, and Slovakia (UNICEF, 2013, p. 2).
We can gain a glimpse of what the U.S. mathematics scale score and rank might be if we had only 15% child poverty compared to the 23% nationally. Massachusetts bureaucrats spent taxpayer money administering the PISA to a representative sample of their students. Just as Tirozzi (as cited in Riddle, 2010) demonstrated with the results from the PISA 2009 tests, the U.S. rankings and scores change when the data are disaggregated by poverty rates. Students in schools with fewer than 10% of students in poverty ranked and scored at the top of the world.
As I did with the TIMSS scores in 2012, I used the 2012 PISA mathematics score and ranking from Massachusetts to model what the scores of students from a less poor America might look like on the PISA tables. Although 15% poverty is higher than that of almost all the countries that outranked the United States, it does provide a concrete example of the influence of poverty on PISA results and offers insight into how U.S. students might score if fewer of them lived in poverty.
Students in Massachusetts scored 520 on the mathematics portion. That score moves the United States from 29th to 12th, one point behind Estonia. If one disregards the non-representational cities that take PISA (Hong Kong, Macao, Shanghai) because their testing populations do not represent the country of China, the United States moves into 9th place, hardly a crisis situation. The other countries that outrank the United States, including Switzerland, Liechtenstein, Netherlands, Japan, Korea, and Singapore, all have child poverty levels lower than 15%.
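The re-ranking exercise above is simple enough to sketch in code. The score table below is an approximate, illustrative subset of the published 2012 PISA mathematics means (the exact values and the set of countries included are assumptions for illustration, not the official league table):

```python
# Sketch of the re-ranking exercise: substitute the Massachusetts mean
# (520) for the U.S. national mean, then drop the Chinese testing
# populations that do not represent the country of China. Scores are
# approximate 2012 PISA mathematics means, used purely for illustration.
scores = {
    "Shanghai": 613, "Singapore": 573, "Hong Kong": 561,
    "Chinese Taipei": 560, "Korea": 554, "Macao": 538,
    "Japan": 536, "Liechtenstein": 535, "Switzerland": 531,
    "Netherlands": 523, "Estonia": 521, "Finland": 519,
    "United States": 481,
}

def rank_of(country, table):
    """1-based rank of `country` when `table` is sorted by score, descending."""
    ordered = sorted(table, key=table.get, reverse=True)
    return ordered.index(country) + 1

# Substitute the Massachusetts mean for the national mean.
with_ma = dict(scores, **{"United States": 520})
print(rank_of("United States", with_ma))

# Also exclude the city populations that do not represent China.
without_cities = {k: v for k, v in with_ma.items()
                  if k not in ("Shanghai", "Hong Kong", "Macao")}
print(rank_of("United States", without_cities))
```

On this illustrative table, the substitution places the United States one point behind Estonia, and removing the three cities lifts it several places further, mirroring the movement described above.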
The PISA Mathematics Connection to Poverty
Poverty not only explains a large percentage of the PISA results, it also relates to important student attributes that further influence achievement. Poverty relates to mathematical self-efficacy on the PISA, and self-efficacy relates strongly to mathematics achievement, with a correlation of .5 (OECD, 2013b, p. 83). On average, 28% of the variance in PISA mathematics results can be explained by self-efficacy. In the United States the difference between students with high self-efficacy in math and those with low self-efficacy is approximately 50 scale score points (OECD, 2013b, p. 86). Poverty also relates to math anxiety. Poorer students have more anxiety about math. Like self-efficacy, anxiety relates to achievement and accounted for an average of 14% of the variance in math scores (OECD, 2013b, p. 87).
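As a quick arithmetic check of the figures above: the proportion of variance two variables share is the square of their correlation, so a correlation of .5 implies roughly 25% shared variance, close to the 28% average the OECD reports (the remaining gap plausibly reflects country-by-country variation in the correlation):

```python
# Variance explained (the coefficient of determination) is the square
# of the correlation coefficient: r = .5 implies r**2 = .25, i.e. about
# a quarter of the variance in math scores is shared with self-efficacy.
r = 0.5
shared_variance = r ** 2
print(f"{shared_variance:.0%}")
```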
Some might question why I do not include the Chinese cities that are part of PISA in my analyses. I remove Hong Kong and Macao from international testing samples because their testing samples do not represent the country of China. They are special administrative regions of the People’s Republic of China, and their schools do not follow all of the standardization requirements of the Chinese system (Levin, 2012). I remove Shanghai because it is a city of almost 23 million people and home to almost 140,000 millionaires, making it the city with the third-highest concentration of wealth in China. The population is highly educated and international. Approximately 83.8% of the high school seniors in Shanghai continued on to attend college in 2008, according to the official Shanghai.gov (2013) website. Compare that to less than 25% of all high school graduates nationally in China (Loveless, 2013). The wealth and family demographics of Shanghai simply do not approximate those of the country of China, where 29% of the population, more than 392 million people, live on $2 a day or less (World Bank, 2012). That is more people than the entire population of the United States.
High school is not free in China. Only the students whose parents can afford to pay are in school at age 15. That limits the testing pool severely, even in Shanghai. Also, not all children who live in Shanghai are allowed to attend high school there, especially if those children are poor. Some of the poorer children are required to attend high school in their ancestral provinces and not permitted in the Shanghai schools (Loveless, 2013). Do not expect to see many students with special needs in Shanghai or Chinese high schools in general. Many are not in school by age 15 (Ringmar, 2013).
Education prospects are even worse in the rural areas, and the statistics provide more evidence as to why the results from Shanghai should not be considered in analyses. According to the Rural Education Action Program, only approximately 40% of rural children in China attend high school (REAP, 2013b), only 35% to 45% of students graduate from high school nationally, and 25% of middle school students drop out before entering grade nine (REAP, 2013a). When “China” starts taking the PISA, then I will include “China” in the testing samples for calculating ranks. Right now, we basically have the general-education Beverly Hills version of China, masquerading as the nation of China, taking the PISA test.
Read the entire article here: http://tinyurl.com/ly4ducr