How should you award points for Certainty Based Marking?

We recently wrote an article for eLearning Industry that outlined three ways in which Certainty Based Marking (CBM) improves employee assessment. CBM expert Tony Gardner-Medwin from University College London responded to the article with an interesting and very valid comment, pointing out that the points scale shown in our article isn’t necessarily ideal for CBM to be effective. We agree with Tony, so we thought we’d pen a piece explaining how Wranx awards points for CBM!

The points scale shown in our article is as follows:

  • Correct answer, high certainty – 20 points.
  • Correct answer, medium certainty – 15 points.
  • Correct answer, low certainty – 12 points.
  • Incorrect answer, low certainty – 8 points.
  • Incorrect answer, medium certainty – 5 points.
  • Incorrect answer, high certainty – 0 points.

Indeed, this is also the points scale we use as an example in our blog post: “Certainty Based Marking and how it improves employee training.” We used it mainly as a simple illustration of how CBM awards points on a graded scale, without going into too much detail about the different scoring approaches that can be used. Tony correctly pointed out, however, three potential shortcomings of this marking approach:

  1. “If you look at the optimal strategy to maximise your expected score on this scheme, you should claim “high” certainty whenever you think there is a >50% probability of being correct. This is scarcely a satisfactory criterion for reliable knowledge.
  2. “With no knowledge at all, you will get at least 40% of the maximum possible score, even with entirely incorrect guesses. Though of course final scores can be adjusted so 40% becomes zero, the logic must seem very odd from a trainee’s standpoint.
  3. “Either Wranx assigns zero for an omitted answer (a usual convention), which treats it bizarrely like a highly confident misconception, or it may credit a free 8 points (as for an incorrect guess). Either strategy seems pretty weird!”
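
To see Tony’s first point in numbers, here is a minimal sketch, written in Python purely for illustration (the names DRILL_SCALE and expected_scores are our own, not anything from the Wranx platform). It computes the expected score for each certainty level as a function of p, the probability that the learner’s answer is correct:

```python
# Expected score for each certainty level under the 20/15/12/8/5/0 scale,
# as a function of p = probability that the learner's answer is correct.
# Format: certainty level -> (points if correct, points if incorrect).
DRILL_SCALE = {"low": (12, 8), "medium": (15, 5), "high": (20, 0)}

def expected_scores(p, scale):
    """Return {certainty level: expected points} for a given probability p."""
    return {level: p * right + (1 - p) * wrong
            for level, (right, wrong) in scale.items()}

for p in (0.25, 0.45, 0.55, 0.90):
    scores = expected_scores(p, DRILL_SCALE)
    best = max(scores, key=scores.get)
    print(f"p = {p:.0%}: "
          + ", ".join(f"{level} = {value:.1f}" for level, value in scores.items())
          + f" -> best choice: {best}")
```

Once p rises above 50%, claiming high certainty maximises the expected score, and medium certainty is never the strictly best choice; even a pure guess among four options (p = 25%) carries an expected score of 9 out of 20 under the best strategy, which illustrates Tony’s first two points.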

We do actually use the above marking scheme for the learning Drills on the Wranx platform. It provides a clear and understandable means for learners to familiarise themselves with our CBM functionality and its graded marking scale while they are building knowledge and confidence in a non-assessed environment.

For assessments that are used to test and benchmark learner knowledge, we use a scoring system devised by Tony himself, which he explains as follows:

“The CBM scheme that I devised in 1994 and have since then promoted and used with London medical students and within Moodle is different: it awards 1, 2 or 3 points for correct answers (at increasing certainty) and 0, -2 or -6 points for incorrect answers. Optimal criteria for P(correct) are easily seen to be <67%, 67-80%, >80%.”
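
Those thresholds fall out of the same expected-score arithmetic. Here is the equivalent sketch for Tony’s scheme (again illustrative Python with our own names, CBM_SCALE and best_certainty, rather than anything from Tony’s or our implementation); scanning p shows the best choice switching from low to medium certainty at 2/3 (about 67%) and from medium to high at 4/5 (80%):

```python
# Expected score under the 1/2/3 (correct) and 0/-2/-6 (incorrect) scheme,
# as a function of p = probability that the answer is correct.
CBM_SCALE = {"low": (1, 0), "medium": (2, -2), "high": (3, -6)}

def best_certainty(p, scale):
    """Return the certainty level with the highest expected score for probability p."""
    expected = {level: p * right + (1 - p) * wrong
                for level, (right, wrong) in scale.items()}
    return max(expected, key=expected.get)

# Probabilities either side of the crossovers: 2/3 (low -> medium) and 4/5 (medium -> high).
for p in (0.60, 0.66, 0.68, 0.79, 0.81, 0.95):
    print(f"p = {p:.0%}: best choice is {best_certainty(p, CBM_SCALE)}")
```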

In Tony’s words, the benefits of using this scale to score assessments are that:

“Zero is the natural point on a scale for a student who acknowledges total ignorance about a proper answer, while confidence in definitely wrong answers is without doubt a worse, potentially dangerous, situation deserving of negative points.”

You could configure the points scale in any number of ways, of course, but we think that, for assessment, this is the best approach for ensuring that learners report their certainty honestly and for giving a good reflection of learner knowledge across a workforce.

Of course, a points scale only has value if the answer options are all plausible. If the four options for a question about the capital of France are Paris, New York, Fish and Football, then it’s hardly an assessment! It’s important, therefore, that assessment content is written by experts.

Point scales are only one part of what makes Certainty Based Marking effective as a learning approach. When it’s used in an organisation, the whole organisation must take a consistent approach on a number of key points. We suggest that learners each take two CBM assessments per topic: a core assessment of around 20 questions, followed by a slightly longer follow-up that covers the topic in full.

Written by: Persia Shahkarami

Persia is passionate about helping organizations improve employee engagement across their professional development initiatives.
Published: 15 Jun, 2022