(Cross-posted from the AASA Journal of Scholarship and Practice, Winter 2013. AASA.org)
The International Association for the Evaluation of Educational Achievement (IEA) released the latest results from the Trends in International Mathematics and Science Study (TIMSS) on December 11, 2012. U.S. Secretary of Education Arne Duncan (2012) proclaimed:
Given the vital role that science, technology, engineering, and math play in stimulating innovation and economic growth, it is particularly troubling that eighth-grade science achievement is stagnant and that students in Singapore and Korea are far more likely to perform at advanced levels in science than U.S. students. A number of nations are out-educating us today in the STEM disciplines—and if we as a nation don’t turn that around, those nations will soon be out-competing us in a knowledge-based, global economy (p. 1).
What do the rankings suggest about student achievement in the United States (US), and are other countries like Singapore and Korea, or cities like Hong Kong, really going to outpace the US in the global economy? In this article I untangle some of the results so that education administrators and bureaucrats might make better sense of them.
Drawing Conclusions
I expected the results from TIMSS 2011 to show U.S. students ranked in the middle of the international pack in mathematics and science, based on the amount of rhetoric and press proclaiming the supposed failure of the U.S. public schools. A day rarely passes when someone does not write or say that the entire U.S. public school system is failing, especially in the areas of Science, Technology, Engineering, and Mathematics (STEM). Secretary Duncan’s comments insinuate that the TIMSS 2011 results in some way reflect a lack of STEM preparedness.
Secretary Duncan seems to draw a cause-and-effect conclusion between the TIMSS 2011 results and the future of STEM and economic competitiveness in the world’s third most populous country. However, I could not find any connection between the results and the future of STEM or economic vitality described or explained in the TIMSS report or technical manuals. Apparently the developers of TIMSS did not create the assessment to allow for such conclusions. A deeper exploration of the data provides more insight into what one can infer from the TIMSS 2011 outcomes.
The Numbers
The results from TIMSS 2011 present a possible conundrum for those who use results from international testing to assail public education. U.S. students ranked high within the sample of countries and cities. In science, Grade 4 students in the U.S. sample ranked 7th out of 53 participating countries and international cities. The Grade 4 students ranked higher than approximately 87% of the students in the international sample. Grade 8 students ranked 9th out of 45 participating countries and international cities, or higher than approximately 80% of the students in the sample. U.S. students ranked 6th and 8th in Grades 4 and 8 when I eliminate Hong Kong because it is a semi-autonomous city and does not represent the Chinese education system.
In mathematics, Grade 4 students in the U.S. sample ranked 8th out of 53 participating countries and international cities, tied with Finland. Grade 8 students ranked 7th out of 53 participating countries and international cities. Grade 4 students outranked approximately 85% of the students in the sample and Grade 8 students outranked approximately 87% of the students in the sample (see Table 1).
Table 1
TIMSS 2011 Mathematics and Science Rankings for the US, Grades 4 and 8
Subject         Rank Grade 4    Rank Grade 8
Mathematics     8th             7th
Science         7th             9th
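The approximate percentages cited above appear to be nothing more than the share of participants ranked below the United States. A back-of-envelope check (my reconstruction of the arithmetic, not a calculation reported by IEA) reproduces the figures:

\[
\text{Science: } \tfrac{53-7}{53} \approx 87\% \text{ (Grade 4)}, \qquad \tfrac{45-9}{45} = 80\% \text{ (Grade 8)}
\]
\[
\text{Mathematics: } \tfrac{53-8}{53} \approx 85\% \text{ (Grade 4)}, \qquad \tfrac{53-7}{53} \approx 87\% \text{ (Grade 8)}
\]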
The mathematics results for Grade 8 students interest me because 30% of the questions on TIMSS 2011 assess algebra concepts such as functions and solving equations. Readers should keep in mind that not all students in the US complete Algebra I in eighth grade, although most do complete it by the time they graduate from high school.
The creators of TIMSS acknowledge that there is a curricular mismatch on the test for some nations. Hence, it is hard to judge students’ knowledge of topics in which they have not yet received formal instruction. The skills assessed on the Grade 8 TIMSS might not be secured until American students complete Grade 9 or 10, so the TIMSS Grade 8 snapshot of achievement could be a little out of focus.
Some education bureaucrats, like Secretary Duncan, might say after reading this article: “Wait, U.S. fourth-grade students ranked 11th in mathematics, not 8th.” That is true only if the Secretary and other bureaucrats disregard statistical significance. For example, there is only a one-point difference between the fourth-grade U.S. mathematics scale score and the scale score of Russian fourth-grade students: 541 and 542, respectively. It is possible to use statistics to determine whether a difference in mean scores is significant or whether it could have occurred by chance. The researchers at IEA provide that statistical information in their report to help the Secretary and others who might not be familiar with statistical techniques make better sense of the results.
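For readers who want to see the logic of such a comparison, a simplified sketch follows. The standard errors shown are hypothetical placeholders, not the published values; the actual IEA procedure uses standard errors that account for the complex sampling and scaling design.

\[
z = \frac{\bar{x}_{\text{Russia}} - \bar{x}_{\text{US}}}{\sqrt{SE_{\text{Russia}}^{2} + SE_{\text{US}}^{2}}}
  = \frac{542 - 541}{\sqrt{3^{2} + 3^{2}}} \approx 0.24
\]

A value of roughly 1.96 or larger is needed for significance at the .05 level, so a one-point gap of this size, with standard errors of a few points, would be treated as a chance difference rather than a real one.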
The TIMSS report contains other important information as well, such as data on student confidence in mathematics. Zhao (2012a) presents a comprehensive analysis of the TIMSS 2011 and PISA 2009 results from another angle. I suggest readers consider his points in light of the results from those assessments and review the reports themselves.
Economic Realities
I cautioned education bureaucrats previously about generating crises over the results from international tests, especially in terms of which country’s students outrank U.S. students. As I explain in my forthcoming book with Don Orlich, The School Reform Landscape: Fraud, Myth, and Lies, one must compare apples to apples when attempting to make international comparisons (Tienken & Orlich, 2013). In this case the apples represent the overall size of a country’s economy. I present, in rank order, the countries with the 10 largest economies in the TIMSS sample, based on their 2011 gross domestic product (GDP), as if they were the only countries in the sample (see Table 2). The table is based on rank only and does not take into account statistical significance (the US would rank even higher if we considered significance). The list is ordered by GDP size. The US never ranks below 5th in either mathematics or science achievement within this comparison group. I left Hong Kong in the sample as a GDP placeholder for China. If I remove it, the U.S. rankings improve in all areas except Grade 4 science.
Table 2
TIMSS 2011 Mathematics and Science Ranks for the 10 Largest Economies
Country Science Gr. 4 Science Gr. 8 Math Gr. 4 Math Gr. 8
U.S. 3 5 5 4
Hong Kong1 4 3 1 1
Japan 1 1 3 2
Germany 6 6 6 DNP
France DNP DNP DNP DNP
Brazil DNP DNP DNP DNP
England 5 4 5 5
Italy 7 7 6 6
Russia 2 2 3 3
India DNP DNP DNP DNP
1 Only Hong Kong participated; mainland China did not. The city does not represent the Chinese education system, but it is used as a placeholder for the Chinese economy in this table.
DNP = Did not participate
I used GDP to define the group with which the US actually competes because it is nonsensical to think that smaller economies like Singapore, Finland, Northern Ireland, Slovenia, Chinese Taipei, or Korea will surpass the US in GDP or innovation. They simply do not have enough workers to outperform the larger economies.
For example, according to data from the World Intellectual Property Organization (WIPO), very few countries compete with the US in terms of innovation as measured by the number of patent applications in the areas of (a) electrical engineering, (b) technological instrumentation, (c) chemistry, (d) mechanical engineering, and (e) related fields.
I encourage readers to review the patent information for themselves in the report from WIPO titled, World Intellectual Property Indicators – 2012 Edition. I gathered the specific data on patents from Section A: Patents, Utility models and Microorganisms, Table A.7.1.2 Patent Applications Worldwide by Field of Technology.
Poverty
What about the influence of poverty on the test results? The secondary TIMSS sample, called the Benchmarking Participants, includes results from several states, including Massachusetts, Florida, and California. Just as Tirozzi (as cited in Riddle, 2010) demonstrated with the results from the PISA 2009 tests, the rankings change when the data are disaggregated by poverty rates. I used the TIMSS 2011 scores from Massachusetts (MA) as a proxy for how a U.S. national sample might score with a lower rate of child poverty.
According to The Annie E. Casey Foundation, the 2011 child poverty rate in MA was 15%, whereas the rate for the US was approximately 23%. Although 15% is higher than the child poverty rates of many countries in the TIMSS sample, it provides a way to examine the influence of poverty on TIMSS results and gives insight into how U.S. students might score if fewer of them lived in poverty.
Grade 8 students in MA participated in the science and mathematics portions. In science, the MA students achieved a scale score of 567, second only to Singapore at 590 and ahead of such participants as Chinese Taipei, Japan, Hong Kong, Korea, and Finland, all of which have lower rates of childhood poverty. In this comparison, a child poverty rate 8 percentage points lower (15% in MA versus approximately 23% nationally) corresponds to a U.S. scale score 41 points higher and propels it to 2nd place in the world on TIMSS 2011 Grade 8 science.
In mathematics, the MA students achieved a scale score of 561, compared with the U.S. average of 509, a difference of 52 scale score points. The difference propels the U.S. students into 5th place, on par with Japan. Poverty matters in the US in terms of scale scores on the TIMSS.
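The arithmetic behind the comparison is simply the subtraction of the figures cited above:

\[
561 - 509 = 52 \text{ scale score points (Grade 8 mathematics, MA versus the U.S. average)}, \qquad 23\% - 15\% = 8 \text{ percentage points (child poverty, US versus MA)}
\]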
Good or Bad?
Some readers might misconstrue my comments as boasting about the performance of U.S. students, but my aim was to present a clearer picture of the data. My true feelings border more on concern than elation. I wonder if these results indicate that the U.S. system is creating better test takers at the expense of better innovators. Keep in mind the 2011 results are from samples of students who will not contribute to the economy for many more years.
The innovative products for the patents referred to in the previous section were created by students who probably took tests like the First International Mathematics Study (FIMS) in 1964, the Second International Mathematics Study (SIMS) in 1982, and the Third International Mathematics and Science Study (TIMSS I) in 1995. The U.S. students in those samples ranked lower on their respective tests than the students in the TIMSS 2011 sample. In fact, U.S. students ranked next to last on the FIMS. As I discussed with Yong Zhao (personal communication, December 2012), could the strong results on TIMSS 2011 be an omen of a future creativity decline in U.S. students? Zhao (2012b) compiled multiple indicators of creativity and found a negative relationship between high scores on international tests and high levels of creativity in a country’s population.
Are U.S. students destined to become more like their high-scoring but less innovative peers in China, Singapore, and Korea: great test-takers but poor creators (Zhao, 2012b)? Will we start to see PISA and TIMSS study manuals show up on the shelves of grocery stores as they do in Singapore? Will there be TIMSS test preparation centers opening near your home in the coming years? Will there be a slow decline in the number of utility patents achieved by U.S. businesses and citizens because of a myopic focus on standardized testing and the accompanying lack of emphasis on creativity and divergent thinking in the U.S. curriculum and national standardized tests?
Alternative Indicators
Other indicators, such as the percentages of the population aged 25-34 that attained a high school diploma and at least a tertiary degree equivalent to a U.S. bachelor’s degree (BA), suggest that the U.S. public school system is functioning well and has been for a long time. Regardless of ranks on international tests, the US ranks 10th out of the 42 nations reported by the Organisation for Economic Co-operation and Development (OECD) in the percentage of people with a BA degree, with 33% of 25-34 year-olds having attained the degree, the same as Japan (OECD, 2012).
Only six other nations outrank the US in the percentage of their populations with BA degrees when one accounts for statistical significance: (a) Norway, 46%; (b) Korea, 39%; (c) Netherlands, 38%; (d) England, 38%; (e) Finland, 37%; and (f) Poland, 36%. In China, approximately 2% of the population aged 25-34 has completed at least a BA degree. According to international data collected by the OECD (2012), 88% of the U.S. population aged 25-34 attained a high school diploma, compared with an average of 72% for the G20 group of countries. The US ranks 10th out of the 42 nations in the OECD report on that indicator as well. Korea, Poland, and the Slovak Republic have the highest percentages, ranging from 94% to 98%.
The Wrong Problems
The problem facing U.S. students is not how well they score or rank on international tests, or on any other tests. Children in the US face much larger issues that detract from academic achievement. Why should U.S. students rank first in the world on any international test of academic achievement?
• Is the US first in the world in terms of eradicating childhood poverty? No.
• Is the US first in the world in the percentage of children who have access to high quality healthcare? No.
• Are U.S. children the least prone to housing and food insecurity compared to the rest of the industrialized world? No.
• Does the US lead the world in universal access to high quality pre-school? No.
We can spend our time chasing meaningless rankings and competing for those rankings with countries whose populations are no larger than Dallas/Fort Worth or northern New Jersey, but that would be an offensive waste of taxpayer money. The results from international tests do not support a cause-and-effect relationship with economic strength or innovation.
I believe that in the US we should use our massive resources to enrich a unitary, democratic public school system that helps students become resilient, persistent, creative, innovative, collaborative, empathetic, intrinsically motivated, socially conscious, globally and culturally aware, critically thinking human beings.
According to the released items from the TIMSS battery of tests, TIMSS does not measure any of those traits; nor do PISA and PIRLS. Focusing too much on any single test is a recipe for fostering imitation, not creation and innovation.
References
Duncan, A. (2012, December 11). Statement by U.S. Secretary of Education Arne Duncan on the release of the 2011 TIMSS and PIRLS assessments. Ed.gov. Retrieved from www.ed.gov/news/press-releases/statement-us-secretary-education-arne-duncan-release-2011-timss-and-pirls-assess
International Association for the Evaluation of Educational Achievement [IEA]. (2012). TIMSS 2011. Author. Retrieved from timssandpirls.bc.edu/timss2011/index.html
Organisation for Economic Co-operation and Development [OECD]. (2012). Education at a glance 2012. OECD. Retrieved from www.oecd.org/edu/EAG%202012_e-book_EN_200912.pdf
Riddle, M. (2010, December 15). PISA: It’s poverty not stupid. The Principal Difference. Retrieved from nasspblogs.org/principaldifference/2010/12/pisa_its_poverty_not_stupid_1.html
The Annie E. Casey Foundation. (2010). Data center: Kids count. Retrieved from datacenter.kidscount.org/data/acrossstates/Rankings.aspx?ind=43
Tienken, C.H. & Orlich, D.C. (2013). The school reform landscape: Fraud, myth, and lies. Lanham, MD: Rowman and Littlefield Education.
World Intellectual Property Organization. (2012). World Intellectual Property Indicators – 2012 Edition. Retrieved from www.wipo.int/ipstats/en/wipi/index.html
Zhao, Y. (2012a, December 11). Numbers can lie: What PISA and TIMSS truly tell us, if anything? Retrieved from zhaolearning.com/2012/12/11/numbers-can-lie-what-timss-and-pisa-truly-tell-us-if-anything/
Zhao, Y. (2012b). World class learners: Educating creative and entrepreneurial students. Thousand Oaks, CA: Corwin Press.