Friday, 19 November 2010

Results: Cumulative Marks are Lower

After entering some past marks data to test whether cumulative marks really do give lower final marks than running term-by-term marks, as in the last post, the results were quite surprising.

If we run marks the lazy way, i.e. one gigantic spreadsheet without any regard to weights, where all assignments, tests, etc. are equally weighted, then it turns out that cumulative marks actually give higher final averages.

But once we start assigning weights to categories and also running cumulative marks, then the shit hits the fan.

So far I have analyzed about five sample data sets, and I would like to run more, but the results all say the same thing: a cumulative marks system with weighted categories and weighted terms lowers the overall school mark by around 2-3%, which is statistically significant.

Let's say that the school wants the terms to be weighted as follows:

Term 1 -- 15%
Term 2 -- 20%
Term 3 -- 15%
Term 4 -- 15%
Final Exam -- 30%

Let's say that there are the following categories:

Tests -- 60%
Labs -- 20%
Homework -- 20%

A term-by-term spreadsheet would apply those category weights within each term, calculate a mark for each term, and then average the term marks according to the term weights above to produce the final grade.
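The term-by-term calculation can be sketched in a few lines. This is a minimal illustration, not the author's actual spreadsheet: the per-term category averages and exam mark below are made-up numbers for one hypothetical student, and Term 3 is taken at 20% so the term weights sum to 100%.

```python
# Category weights applied within every term.
CATEGORIES = {"tests": 0.60, "labs": 0.20, "homework": 0.20}

# Term weights (Term 3 at 20% so everything totals 100%).
TERM_WEIGHTS = {"T1": 0.15, "T2": 0.20, "T3": 0.20, "T4": 0.15}
EXAM_WEIGHT = 0.30

# Hypothetical per-term category averages, in percent.
marks = {
    "T1": {"tests": 70, "labs": 85, "homework": 90},
    "T2": {"tests": 65, "labs": 80, "homework": 95},
    "T3": {"tests": 75, "labs": 90, "homework": 85},
    "T4": {"tests": 80, "labs": 85, "homework": 90},
}
exam_mark = 72

def term_mark(cat_avgs):
    """Weighted average of the category averages within one term."""
    return sum(CATEGORIES[c] * cat_avgs[c] for c in CATEGORIES)

# Average the term marks by the term weights, then add the exam.
final = sum(TERM_WEIGHTS[t] * term_mark(marks[t]) for t in TERM_WEIGHTS)
final += EXAM_WEIGHT * exam_mark

print(round(final, 2))  # 76.4 for the sample numbers above
```

Each term is collapsed to a single mark first, and only then do the term weights come into play.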

A cumulative spreadsheet with weighted terms built in would do something similar, but it would calculate everything as a percentage of the *final* mark, as follows:

T1 and T4 Tests -- 9% each (60% x 15%)
T1 and T4 Labs -- 3% each
T1 and T4 HW -- 3% each

T2 and T3 Tests -- 12% each (60% x 20%)
T2 and T3 Labs -- 4% each
T2 and T3 HW -- 4% each

Final Exam -- 30%

A quick check shows this all adds up to 100%: 2 x (9 + 3 + 3) + 2 x (12 + 4 + 4) + 30 = 30 + 40 + 30 = 100.
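The cumulative breakdown above is just the product of each term weight with each category weight. A short sketch, using the same weights as above (Term 3 at 20%), derives every bucket's share of the final mark and confirms the total is 100%:

```python
# Within-term category weights and term weights (Term 3 at 20%).
CATEGORIES = {"tests": 0.60, "labs": 0.20, "homework": 0.20}
TERM_WEIGHTS = {"T1": 0.15, "T2": 0.20, "T3": 0.20, "T4": 0.15}
EXAM_WEIGHT = 0.30

# Each (term, category) bucket gets a fixed share of the FINAL mark:
# term weight x category weight.
effective = {
    (t, c): tw * cw
    for t, tw in TERM_WEIGHTS.items()
    for c, cw in CATEGORIES.items()
}

for (t, c), w in sorted(effective.items()):
    print(f"{t} {c}: {w * 100:.0f}%")

total = sum(effective.values()) + EXAM_WEIGHT
print(f"total: {total * 100:.0f}%")
```

So T1 tests, for example, come out at 0.15 x 0.60 = 9% of the final mark, matching the table, and the twelve buckets plus the exam sum to exactly 100%.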

No wonder the administrators always frown on this particular kind of cumulative spreadsheet. At former schools I taught at, they outright banned this way of doing marks.

As for why the marks are lowered this way, I have some idea, but it would be too complicated to get into here.
