Data Can Misinform Us
Studies in education, like those in any social science, do not afford as much control as experiments in the physical sciences. Chemists and physicists are expected to ensure that conditions are fully defined and that experiments are designed to yield information that is both accurate and useful. Unfortunately, in the social sciences, where variables are far less manageable, sufficient care is rarely exercised. More often than not, data are even selected to fit one's bias.
For instance, Mark Bray and Magda Nutsa Kobakhidze from the University of Hong Kong demonstrate in Comparative Education Review how a survey question could be misleading.
The above is an example of a survey question that fails to provide useful information for several reasons. First, the week is not specified. When a student answers this question, it is not clear whether to report an average over the entire school year or a particular week; answers collected during exam periods, when cramming is common, would look very different from answers collected at other times. Second, the kind of extra lessons is left unspecified. Does this include doing homework at home under a parent's guidance? Is this private tutoring? Does it take place inside school buildings? Are these review or remedial sessions provided by the school or by an outside tutor? Clearly, data produced by this survey can be very misleading. The graph below summarizes the results for part (a) (mathematics) of the above question:
Colombia, Latvia, the Slovak Republic, and even the Philippines top other countries in taking extra lessons in math. I grew up in the Philippines, and I know this cannot be true: seventy-five percent of students in the Philippines do not attend private tutoring sessions (shadow education).
We must be very careful in drawing conclusions from survey questions. The example above is probably not tempting to misuse, but where there is bias, when the data show something we would like to see, we may embrace them blindly even if the numbers are deeply problematic. International standardized exams and surveys are attractive because they provide a window into what other countries are doing. One example is a recent post by Jo Boaler at Stanford University's YouCubed site, in which Professor Boaler criticizes the EngageNewYork curriculum:
Math ‘Fluency’ and the Curriculum
The first two minutes of a sample lesson in 2nd grade from EngageNewYork is shown below:
Unfortunately, Boaler uses PISA data to support her assertion. Specifically, she states, "Their data from 13 million 15-year olds across the world show that the lowest achieving students are those who focus on memorization and who believe that memorizing is important when studying for mathematics." Yet the top-performing students in PISA mathematics come from Shanghai, Singapore, Hong Kong, Taiwan, South Korea, Macao, and Japan. To appreciate why drawing conclusions from PISA data is dangerous, consider the following from a previous post in this blog, "Copying from Educational Systems Abroad":
Shadow education is very pervasive especially in Asian countries that have done well recently in international exams, as illustrated in a study published recently in the Journal of International and Comparative Education:
The scope of shadow education in Asian countries is vast, capturing the majority of all students. The statistics above must give pause to anyone claiming that the success in schools in these countries may be emulated by simply copying their curriculum and pedagogy. That would not be accurate at all if shadow education, which is significant, is ignored. Ignoring shadow education means dismissing rote approaches to learning. That would be a huge mistake....
The above data on shadow education are much more reliable than those from the international exams, which did not treat the issue of tutoring very thoughtfully. There is a great deal of shadow education, especially in the top-performing countries of Asia. It is no secret that shadow education in these societies generally relies on rote learning. Without a complete picture of how students are taught in these countries, it is easy to draw the wrong conclusions.
At Georgetown, every exam I give to students provides on the first page a list of the equations, constants, and conversions they may need. Some of the assistants who help me teach the course often wonder why I give that information in the exam. These assistants obtained their undergraduate degrees in China, and their teachers always required them to memorize. I wonder as well....