Heuristics for Test Question Generation

[textile]
Today at the Workshop on Open Certification, we came up with the following (unordered) heuristics that might be useful in test question creation:

1) Plausible buzzwords
2) True but irrelevant
3) Right but for the wrong reason
4) Some fool said it
5) My boss will believe it
6) Two conclusions from the same reason
7) Incomplete reason
8) More detail is typical in the correct answer
9) Confusion of test techniques
10) Incorrect application of technique
11) Formally phrased answers
12) Read learning objectives first
13) Variations of the theme to make it more challenging
14) Any time you feel the need to mention a source, try to reword so the source does not need to be mentioned
15) Invert the cause and effect
16) Avoid inappropriate or confusing humor
17) The correct answer tends to be similar to the incorrect answers

There is no context for this list; it is posted mainly for distribution to the attendees. Others will post details about it later.

The attendees included:

  • Scott Barber
  • Tim Coulter
  • Zach Fisher
  • Dawn Haynes
  • Doug Hoffman
  • Andy Hohenner
  • Paul Holland
  • Kathy Iberle
  • Karen Johnson
  • Michael Kelly
  • Phil Kos
  • Baher Malek
  • Ben Simo

[/textile]

Comments

So these are heuristics for bad test questions? Or is the point being made that all test questions are bad? If these are for generating bad questions, is there a list of heuristics for generating good ones? I wanted to be at the workshop but could not attend, so this is interesting to me.

John,

The intent is to use the heuristics to create effective distractors for each question. Some of them are useful in designing good distractors; some are good for identifying bad distractors.

-Mike