Wednesday, January 9, 2013

Student Evaluation, Another Analogy

Two of my previous three posts have been about pushes, still controversial among both teachers and the public, to reform how student marks are calculated.  One post was about not assigning zeroes for work that was not done; the other was about not using averages to calculate student grades.

Today I thought I would combine the ideas of those two posts using an analogy I hope all Canadians will understand: hockey.

Determining a student's final grade is a little bit like the General Manager of an NHL hockey team trying to decide how good a player is.  If the GM is looking at a player who has scored 25 goals in each of the last four seasons, then the GM can be pretty confident that he is signing a player who will score around 25 goals next year.  In the same way, if I see a student whose marks are 65, 65, 65, 65, I can be confident in assigning a mark of 65.

The difficulty arrives with players and students who are not so consistent.  What about a player with scoring stats of 20, 20, 25, 30?  I think most of us would guess that player should score in the high-20s to low-30s next year.  My point is that we are not looking at his average, 23.75 goals per season, but rather at the trend of his performance, and expecting that he actually rates better than his average.  As a teacher I am trying to do the same thing for students.  A student with marks of 60, 60, 65, 70 should probably be rated in the high 60s even though the average is 63.75.

We also need to reflect negative trends.  A hockey player with 30, 30, 25, 20 has averaged 26.25 goals per season, but I doubt the GM should pay him like a 25+ goal player.  Similarly, a student with 70, 70, 65, 60 has an average of 66.25 but probably should be evaluated in the low 60s.
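One simple way to make this "look at the trend, not the average" idea concrete is to weight recent results more heavily than older ones.  The weights below (1, 2, 3, 4) are purely my own illustration, not a scheme any teacher or GM actually prescribes:

```python
def trend_weighted(marks):
    """Weighted average that counts later marks more heavily.

    Weights 1..n are illustrative only: the most recent mark
    counts n times as much as the oldest one.
    """
    weights = range(1, len(marks) + 1)
    return sum(w * m for w, m in zip(weights, marks)) / sum(weights)

rising = [60, 60, 65, 70]
falling = [70, 70, 65, 60]

print(sum(rising) / len(rising))   # plain average: 63.75
print(trend_weighted(rising))      # weighted toward recent: 65.5
print(trend_weighted(falling))     # 64.5, below the plain average of 66.25
```

Notice the weighted estimate lands above the plain average for the improving student and below it for the declining one, which is exactly the direction the analogy argues for.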

What about flukey results?  Imagine a hockey player with 20, 20, 40, 20 over the last four seasons.  His average is 25 goals a season, but that 40-goal season looks like more of a fluke than anything else.  As a GM I would probably pay the player like a 20-goal guy.  The issue is the same for a student with only a few good results.  A student with marks of 60, 60, 100, 60 has an average of 70 but probably should only be given a mark in the low 60s, since that one 100 does not seem at all representative of his ability.
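A statistic that naturally discounts a single flukey result is the median.  This is just one illustration of the idea, not a claim that teachers should literally use the median:

```python
import statistics

marks = [60, 60, 100, 60]

# The mean is pulled up by the single outlying 100...
print(statistics.mean(marks))    # 70

# ...while the median ignores one outlier entirely.
print(statistics.median(marks))  # 60
```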

Let us also look at zeroes.  Imagine a hockey player who scored 30 goals a year for three seasons, then was suspended for all of his fourth year for drug infractions.  His goals are 30, 30, 30, 0, and he is averaging 22.5 goals a season.  Some people might say "well, it is the player's fault he got suspended and scored no goals, he should have to face the consequences" and suggest he be paid like a low-20s goal scorer.  However, I bet most NHL GMs would be willing to pay that player expecting close to 30 goals the following season.  That is because the previous consistency is very strong evidence of the player's underlying ability.  Also, does anyone really think the player would have scored no goals if he had played last year?  So if I look at a student whose marks are 70, 70, 70, 0 (did not hand in the project), the average is 52.5.  But the initial consistency suggests that this student's learning is near 70.  So we should give a mark in the high 60s that reflects our best estimate of the student's learning, not an average that suggests the student is barely passing.
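The zero is really just an extreme outlier, so the same median illustration from above applies here too.  Again, this is a sketch of the reasoning, not a recommended marking formula:

```python
import statistics

marks = [70, 70, 70, 0]

# One zero drags the mean down toward a bare pass...
print(statistics.mean(marks))    # 52.5

# ...but the median reflects the three consistent results.
print(statistics.median(marks))  # 70
```

The median here says the student's demonstrated learning is near 70, which matches the "high 60s" estimate in the paragraph above far better than the 52.5 average does.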

I hope this analogy has helped you understand that teachers following current best practices are trying to estimate underlying learning from consistency and trends, much as we might evaluate hockey players.