The Credence Calibration game
- Browse screenshots
- Download for Windows
- Download for Mac OS X
- Download for Android
- Read more about the game
- Read more about information scoring
When you hear someone say “I’m 90% sure…”, that “90%” figure is called a credence level. Most people are overconfident by default: when we say we’re “90% sure”, we tend to be right much less than 90% of the time.
You or your friends may have noticed this, consciously or not, and it might even make you want to avoid stating credence levels at all. But at CFAR, we strive to calibrate our reported credences so that they more accurately reflect our actual success rates. Being well-calibrated is valuable because it lets you tell people something predictable and accurate about how much to trust you. It can even be seen as a kind of honesty: acting as though you know something when you don’t can mislead people, and calibration helps you avoid doing that, whether accidentally or unconsciously.
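To make the idea concrete: being well-calibrated means that, across all the times you say “90% sure”, you are right about 90% of the time. Here is a minimal sketch (not the game’s actual implementation) of how you might check your own calibration from a record of predictions, where each record pairs a stated credence with whether the answer turned out correct:

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group (stated_credence, was_correct) records by credence level
    and report each level's actual hit rate for comparison."""
    buckets = defaultdict(list)
    for credence, correct in predictions:
        buckets[credence].append(correct)
    # For each stated credence level, compute the fraction answered correctly.
    return {credence: sum(outcomes) / len(outcomes)
            for credence, outcomes in sorted(buckets.items())}

# Hypothetical data: ten answers stated at "90% sure", only six correct.
records = [(0.9, True)] * 6 + [(0.9, False)] * 4
print(calibration_report(records))  # {0.9: 0.6} -> overconfident at 90%
```

If the hit rate at a stated level is well below that level (as in the example, 60% actual vs. 90% stated), you are overconfident there; if it is well above, you are underconfident.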
Ideally, we’d track predictions about all the most important things in our lives and the world, and train ourselves that way. But rapid feedback also has its advantages, and for that, we’ve made the Credence Calibration game!
In spring 2012, as a project for CFAR, Alexei Andreev, Zachary Alethia, and CFAR cofounder Andrew Critch created this simple game for training people to adjust their reported credence levels to more closely reflect their success rates on answering questions. It’s a very crude implementation of the concept, but it has been used in CFAR’s workshops, and by Professors Saul Perlmutter, John Campbell, and Robert MacCoun in their course, “Sense, Sensibility, and Science.”
(If you’d like to develop an update to the game, please contact Andrew Critch, firstname.lastname@example.org)