Philippines DepEd Failed to GRASP - K to 12 Grading System

One of the most viewed pages on this blog is "DepEd's K to 12 New Grading System". My guess is that the page is popular because most readers are confused by DepEd's "new grades". The confusion is understandable. A colleague of mine got a headache just trying to read the memorandum (DepEd Order No. 73 s. 2012) that tells teachers how to assess learning outcomes. First, it defines learning outcomes in terms of levels. Right at the very beginning, DepEd itself seems confused, not knowing the difference between assessments and outcomes: assessments are simply our attempt to measure learning outcomes. Before I show excerpts from the DepEd memo, I will use several slides from the Georgia Department of Education in the US that do a much better job of explaining what "performance assessment" really is.



The presentation starts with a clear description of what assessment entails. It shows, for example, various assessment strategies. Note that these are called "strategies", not "levels"; the word "level" wrongly suggests a hierarchy.

Contrast the above with the DepEd memo:


Right at the beginning, one can sense that whoever wrote DepEd's memo does not really understand what "knowledge" means. All types of assessment measure knowledge (except those that specifically measure skills, but even those require information and can be affected by knowledge; reading comprehension, for example, is a skill, yet it requires vocabulary, which is knowledge). The list above therefore clearly shows that DepEd misunderstands assessments. If DepEd itself does not comprehend what assessment is, how much more difficult would it be for a classroom teacher in the Philippines to understand what he or she is being asked by DepEd to do? The DepEd memo then describes what it believes is the "highest level" of assessment:


And DepEd specifically cites "GRASPS" (introduced by McTighe and Wiggins) as a good model for this "level" (it should be "type") of assessment. The slides from the Georgia Department of Education include the following on GRASPS:

And an example is provided for second-grade math. The task covers counting, constructing a table, and drawing a graph to present the counting results.

And here is the product one might expect from the students:



Of course, the DepEd memo comes with its own example. This one is for a higher grade, Grade 7, and it is on science, specifically chemistry, so it really gave me and my colleague a big headache.


And here is the memo's example of a "performance task":


The above is clearly inappropriate for assessing how much students have learned in chemistry about solutions and concentrations. The project described touches on many topics outside of Grade 7 solution chemistry, and the scoring criteria are tangential to concepts such as molarity, molality, percent composition, and solubility. This is not an assessment in chemistry. This is supposed to be the "highest level" of assessment, yet it fails miserably at measuring how much a student learns in a chemistry class.

There are good assessments and there are bad assessments. Most of the time, the difference depends on who wrote the assessment. Some are brilliant while some are simply stupid. Standardized exams are assessment tools as well, and these are not necessarily stupid. PISA is one example of a good standardized exam (discussed in a previous post on this blog). It uses a "multiple-choice" format; below is a sample question:




The quality of assessments largely depends on how they are constructed. There are excellent "multiple choice" exams and, of course, there are poor "performance tasks" (DepEd's sample above is one). Because this poor example comes from DepEd itself, a lot of teachers are being misinformed, which in turn leads to poor learning outcomes in schools in the Philippines.







