Gender Gap in Education

Early this month, the OECD published a report on gender gaps in basic education. The report, The ABC of Gender Equality in Education: Aptitude, Behaviour, Confidence, draws on PISA test scores as well as survey questions to shed light on the differences between boys and girls. The following short video recaps the findings and recommendations of the report:


At first glance, the report sounds like a bit of a stretch. How can one possibly decipher gender disparities in basic education from multiple-choice questions or simple surveys? Another major concern regarding PISA test scores is the low-stakes nature of the exam. Debeer and coworkers, for example, have questioned whether PISA is really measuring persistence rather than ability. In their paper, "Student, School, and Country Differences in Sustained Test-Taking Effort in the 2009 PISA Reading Assessment", they find, first of all, that examinee effort declines over the course of the PISA exam. This decline in effort shows up as a greater probability of an incorrect response when a question is placed later in the test.

PISA 2012 adds the following question at the end of the exam:

[Image of the survey question, copied from The ABC of Gender Equality in Education]
The results from this survey question are as follows:
On a scale ranging from 1 to 10, where 1 represents minimum effort and 10 maximum effort, girls reported an effort of 7.67 in the low-stakes PISA test while boys reported an effort of 7.32, on average across OECD countries. Girls reported an effort of 9.36 in the hypothetical high-stakes PISA test while boys reported an effort of 9.13, on average. 
Self-report measures of effort, unfortunately, are often inaccurate. First and foremost, what does the scale of 1 to 10 really mean? What does 7.67 on this scale mean? What does the difference between 9.36 and 7.67 on this scale mean? I have no idea.

Debeer's approach, on the other hand, of measuring the probability of a correct response to a given question as a function of where that question is placed on the exam, is much more reliable. Among students from Greece, for example, simply placing a question near the end of the exam reduces correct responses by about 20 percent. This clearly shows a decline in effort during the exam itself. If gender differences are being extracted from the test scores, it is therefore important to check whether this measurable decline in effort correlates with gender as well. The sketch below illustrates the basic idea of such a position effect.
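To make the idea of a position effect concrete, here is a minimal sketch in Python. Everything in it is an assumption for illustration: the data are simulated, the decline of 0.03 in log-odds per item position is made up, and a plain logistic regression stands in for the far richer item response models that Debeer and coworkers actually fit.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated data: 1,000 examinees each answer 40 items, and the log-odds of a
# correct response drops linearly with the item's position in the booklet.
# All numbers here are invented for illustration only.
n_students, n_items = 1000, 40
position = np.tile(np.arange(n_items), n_students)              # 0 = first item
ability = np.repeat(rng.normal(0.0, 1.0, n_students), n_items)
assumed_position_effect = -0.03                                  # made-up decline per position
p_correct = 1.0 / (1.0 + np.exp(-(0.5 + ability + assumed_position_effect * position)))
correct = rng.binomial(1, p_correct)

# A plain logistic regression of correctness on item position recovers a
# clearly negative slope (somewhat attenuated, since examinee ability is
# ignored here; the actual study uses a proper item response model).
X = sm.add_constant(position.astype(float))
fit = sm.Logit(correct, X).fit(disp=False)
print(fit.params)
```

The point of the sketch is only that a decline in effort is something one can estimate directly from the response data themselves, rather than from a self-reported 1-to-10 scale.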

Nonetheless, there are some quite interesting trends in the study. One concerns video games. According to the survey, boys tend to play video games more than girls do. Interestingly, playing video games correlates with better performance if the games are not collaborative online games:

[Figure copied from The ABC of Gender Equality in Education]
Of course, the above are mere correlations. Another finding relates to the number of hours spent on homework. Right from the start, this measure is problematic because it asks only for hours spent. A child can easily spend three hours on homework that requires only thirty minutes, so the number of hours alone cannot tell us how much a child is learning, and a correlation between hours spent on homework and test scores is not a clear indication of causation. It is already known that, across school systems, the number of hours spent on homework does not correlate with performance on the PISA exam. Within a school system (or country), a correlation may exist, but one must treat such trends with caution: the amount of time spent on homework also correlates with socio-economic status, so socio-economic factors are in play, as the sketch below illustrates.
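As a minimal sketch of this confounding, the simulation below invents data in which socio-economic status drives both the reported homework hours and the test score, while homework hours have no direct effect at all. The means, slopes, and noise levels are all made up; nothing here comes from the report.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Made-up data: socio-economic status (ses) pushes up both reported homework
# hours and test scores; homework hours have no direct effect on scores here.
n = 5000
ses = rng.normal(0.0, 1.0, n)
homework_hours = 4.0 + 1.5 * ses + rng.normal(0.0, 1.0, n)
score = 500.0 + 40.0 * ses + rng.normal(0.0, 30.0, n)

# The raw correlation between homework hours and scores looks substantial...
print(np.corrcoef(homework_hours, score)[0, 1])     # roughly 0.6 to 0.7

# ...yet the homework coefficient collapses toward zero once ses enters the
# regression, because ses was driving both variables all along.
X = sm.add_constant(np.column_stack([homework_hours, ses]))
print(sm.OLS(score, X).fit().params)
```

The same caution applies to any of the survey-based correlations discussed above.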

There is certainly useful information one can derive from standardized exams or surveys. Unfortunately, there are limits on how these pieces of data can be analyzed. One must keep this in mind, especially when addressing an issue as complicated and difficult as gender differences in education.





