The Podgon Game: Assessment Rubric
Grades 6-10

What this is for

This rubric is meant to help the teacher understand how students are thinking during the Podgon Game. It is not designed for grading. It can be applied to the Student Handout, to observations during class discussion, or to the student-led round in the Extension Pack.

There are four things worth paying attention to, and I have tried to describe what each looks like at different levels of sophistication. The levels are rough — students will often be at different levels on different dimensions, and a given student might be at one level in one moment and another level a few minutes later.


1. Quality of proposed definitions

The question here is: can the student propose a definition that is consistent with the available evidence?

At the most basic level, a student might give a definition that is too vague to be useful ("a podgon is a shape") or that contradicts examples already on the board ("a podgon is a triangle" — after a pentagon has been confirmed). This is fine early in the session, especially if they are still figuring out the task.

A step up is a definition that is consistent with the current examples but is either too broad or too narrow in ways the student has not noticed. For example, "a closed shape with straight sides" might work early on but will eventually be contradicted.

At a more sophisticated level, the student's definitions are consistent with all known examples and have been refined in response to new evidence. The student can say what changed and why — "we added the odd number of sides part because the hexagon was rejected."

The most impressive thing to see is a student who proposes multiple competing definitions and can identify what would distinguish between them. This shows an understanding that more than one definition can be consistent with the same evidence — which is one of the central ideas of the activity.
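If you are comfortable reading a little Python, the logic of that last level can be made concrete. The sketch below is only for your own reference, with made-up shapes and definitions that are not part of the activity materials: two definitions that both fit the same evidence, and the one shape that would tell them apart.

    # Two hypothetical "podgon" definitions, both consistent with the
    # same evidence, plus the shape that would distinguish them.

    shapes = {
        "triangle":         {"sides": 3, "closed": True, "equal_sides": True},
        "pentagon":         {"sides": 5, "closed": True, "equal_sides": True},
        "hexagon":          {"sides": 6, "closed": True, "equal_sides": True},
        "scalene_triangle": {"sides": 3, "closed": True, "equal_sides": False},
    }

    def defn_a(s):  # "a closed shape with an odd number of sides"
        return s["closed"] and s["sides"] % 2 == 1

    def defn_b(s):  # "a closed shape with an odd number of equal sides"
        return defn_a(s) and s["equal_sides"]

    # Evidence so far: triangle and pentagon confirmed, hexagon rejected.
    evidence = {"triangle": True, "pentagon": True, "hexagon": False}

    # Both definitions fit every piece of evidence...
    for name, is_podgon in evidence.items():
        assert defn_a(shapes[name]) == is_podgon
        assert defn_b(shapes[name]) == is_podgon

    # ...yet they disagree about the scalene triangle, so that is the
    # shape worth asking about next.
    for name, s in shapes.items():
        if defn_a(s) != defn_b(s):
            print(name)  # scalene_triangle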


2. Quality of test cases

The question here is: can the student invent a shape that is strategically useful for evaluating a definition?

At the most basic level, examples are random or repeat information already on the board. The student cannot explain why they are asking about a particular shape.

A step up is when examples are relevant to the definitions under consideration, but the student's stated reason does not match the example's actual value. This came up in my implementation: a student asked about a square "to prove podgons have equal sides," but the square also differs from the confirmed podgons in its number of sides, so the answer cannot isolate the equal-sides question.

More sophisticated is when examples are chosen to test one specific feature of a definition. The student can state what they expect to learn: "If this shape IS a podgon, then the definition needs to allow unequal sides. If it is NOT, the equal-sides part holds."

The most sophisticated thing to see is examples designed to distinguish between two competing definitions. The student identifies the minimal difference between the example and known cases, isolating exactly one variable.

What to look for in the Student Handout

The "Why are we asking" and "If yes / If no" fields are the most diagnostic parts of the handout.

  • If the "Why" field is blank, the student may be guessing without a strategy. Worth following up in conversation.
  • If the "Why" field is filled in but the "If yes / If no" fields are blank, the student has an intuition but has not formalised it. This is a normal intermediate stage.
  • If all three are filled in and logically consistent, the student is reasoning strategically.
  • If "If yes" and "If no" lead to the same conclusion, the example is useless for distinguishing anything. This is a good teaching moment — "If you learn the same thing regardless of the answer, why ask?"


3. Reasoning about evidence

The question here is: does the student draw appropriate conclusions from the evidence?

At the most basic level, the student generalises from a single example ("one open shape is not a podgon, therefore all podgons are closed") or fails to update a definition when a counterexample is presented.

A step up is when the student correctly rejects a definition given a clear counterexample but may over-generalise from confirming examples. In my experience, students handle counterexamples better than confirming evidence: I did not see a single student persist with a definition after a clear counterexample, but there were many cases of students treating a single confirming example as proof.

More sophisticated is when the student distinguishes between "this example rules out definition X" and "this example is consistent with definition X but does not prove it." This is a subtle distinction and most students will not get there in a single session.

The most sophisticated level is when a student explicitly reasons about what classes of examples have been tested versus not tested — for example, "we have only tested shapes with straight sides, so we do not know what happens with curves."
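The line between "refuted" and "merely consistent" can be made concrete the same way. In this hypothetical sketch, two different definitions both pass a consistency check against the evidence, which is exactly why passing the check proves neither.

    # "Consistent with the evidence" means "not yet refuted", nothing more.
    def consistent(defn, evidence):
        """True if the definition agrees with every recorded answer."""
        return all(defn(shape) == is_podgon for shape, is_podgon in evidence)

    evidence = [
        ({"sides": 3, "closed": True}, True),   # triangle: confirmed
        ({"sides": 6, "closed": True}, False),  # hexagon: rejected
    ]

    def odd_sides(s):
        return s["sides"] % 2 == 1

    def closed_and_odd(s):
        return s["closed"] and odd_sides(s)

    print(consistent(odd_sides, evidence))       # True
    print(consistent(closed_and_odd, evidence))  # True
    # Both survive, so the evidence proves neither. Every shape tested
    # so far has been closed; an open shape with an odd number of sides
    # is the untested class that would separate them.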

Common errors worth discussing

These are not marks against the student. They are teaching opportunities:

"This proves the definition." A confirming example does not prove anything. Ask: "Can you think of a shape that would fit your definition but might NOT be a podgon?"

"Since this one does not work, none of them work." One rejected open shape does not mean all open shapes fail. Ask: "What about a different open shape?"

"A and B both said yes, so they have the same definition." Agreement on one example does not mean agreement on all. Ask: "Can you find a shape where they might disagree?"

Not updating after a counterexample. If a student proposes the same definition that was just refuted, they may not have registered the counterexample. Revisit it explicitly.


4. Communication and precision

The question here is: can the student articulate their definition and reasoning clearly?

At the most basic level, definitions use vague language ("a shape that looks right") or point to examples rather than giving criteria ("like that triangle").

A step up is when definitions use mathematical language but have unintentional ambiguity. For example, "a shape with equal sides" — does this require ALL sides to be equal, or just SOME? This kind of ambiguity came up in my implementation and is actually quite productive to discuss.

More sophisticated is when definitions are stated precisely enough that another person could apply them unambiguously to any shape. Quantifiers like "all" and "some" are used correctly.

The most sophisticated level is when a student notices and points out ambiguity in other groups' definitions and can suggest how to make them more precise.
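If you want to convince yourself that the two readings of "equal sides" really do come apart, here is one last hypothetical sketch:

    # Two readings of "a shape with equal sides", applied to an
    # isosceles triangle with side lengths 3, 3, 5.
    lengths = [3, 3, 5]

    def all_sides_equal(lengths):
        return len(set(lengths)) == 1

    def some_sides_equal(lengths):
        return len(set(lengths)) < len(lengths)

    print(all_sides_equal(lengths))   # False: not ALL sides are equal
    print(some_sides_equal(lengths))  # True: SOME sides are equal
    # The same shape satisfies the definition under one reading and not
    # the other, so the definition cannot be applied unambiguously as stated.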


Observation sheet

You can use something like the following during the activity to record quick notes on each group. Focus on one or two dimensions per group — you cannot track everything at once.

Group    | Definitions | Test cases | Reasoning | Communication | Notes
---------+-------------+------------+-----------+---------------+------
Group 1  |             |            |           |               |
Group 2  |             |            |           |               |
Group 3  |             |            |           |               |
Group 4  |             |            |           |               |
Group 5  |             |            |           |               |
Group 6  |             |            |           |               |

How to use this

During the activity, use the observation sheet for quick notes. After the activity, look through the Student Handouts. The handout gives you a written record of reasoning that you cannot always catch during live discussion. The "Why are we asking" fields and the Reflection questions are the most revealing.

If you run the student-led round from the Extension Pack, the quality of the definitions students create is itself a strong indicator of understanding. A group that creates two conceptually distinct definitions, ones that agree on some cases and disagree on others, has demonstrated a deep understanding of what definitions do.

This rubric is for the teacher. I would not share it directly with students. However, you can share the ideas behind it. For instance, after the activity you might say: "A really useful test case is one where you learn something regardless of the answer, yes or no." That makes the key idea from the test-cases dimension explicit without the formality of a rubric.