Testing Multiple Forecasters
We consider a cross-calibration test of predictions by multiple potential experts in a stochastic environment. The test checks whether each expert is calibrated conditional on the predictions made by the other experts. We show that this test is good in the sense that a true expert--one informed of the true distribution of the process--is guaranteed to pass the test no matter what the other potential experts do, while false experts fail the test on all but a small (first category) set of true distributions. Furthermore, even when no true expert is present, a test similar to cross-calibration cannot be manipulated simultaneously by multiple false experts, although this robustness comes at the cost of failing some true experts. In contrast, tests that allow false experts to make precise predictions can be jointly manipulated.
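The paper's formal definitions are not reproduced in this record. As a rough illustration only, the following Python sketch checks an empirical notion of cross-calibration for binary outcomes: periods are grouped by the joint profile of all experts' (binned) forecasts, and within each sufficiently populated cell an expert's average forecast is compared to the empirical frequency of the outcome. The function name, the binning, and the tolerances `eps`, `min_obs`, and `n_bins` are assumptions for illustration and do not correspond to the paper's asymptotic test.

```python
from collections import defaultdict

def cross_calibration_test(predictions, outcomes, n_bins=10, eps=0.1, min_obs=30):
    """Illustrative cross-calibration check (not the paper's formal test).

    predictions: list of per-period tuples, one probability in [0, 1] per expert.
    outcomes:    list of 0/1 realizations, same length as predictions.
    Returns one boolean per expert: True if the expert's average forecast is
    close to the empirical frequency in every well-populated prediction cell.
    """
    n_experts = len(predictions[0])
    bin_of = lambda p: min(int(p * n_bins), n_bins - 1)

    # Group periods by the joint profile of all experts' binned forecasts.
    cells = defaultdict(list)
    for preds, y in zip(predictions, outcomes):
        profile = tuple(bin_of(p) for p in preds)
        cells[profile].append((preds, y))

    passed = [True] * n_experts
    for profile, obs in cells.items():
        if len(obs) < min_obs:
            continue  # ignore cells with too few observations
        freq = sum(y for _, y in obs) / len(obs)
        for i in range(n_experts):
            avg_i = sum(p[i] for p, _ in obs) / len(obs)
            # Expert i fails if his average forecast in this cell is far
            # from the empirical frequency of the outcome.
            if abs(avg_i - freq) > eps:
                passed[i] = False
    return passed

# Example: a fair coin; expert 0 always forecasts 0.5, expert 1 always 0.9.
if __name__ == "__main__":
    import random
    random.seed(0)
    outcomes = [random.random() < 0.5 for _ in range(5000)]
    predictions = [(0.5, 0.9) for _ in outcomes]
    print(cross_calibration_test(predictions, outcomes))  # expect [True, False]
```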
Year of publication: | 2007-01
---|---
Authors: | Feinberg, Yossi ; Stewart, Colin
Institutions: | Graduate School of Business, Stanford University
Availability: | freely available
Similar items by person
- Subjective Reasoning--Games with Unawareness (Feinberg, Yossi, 2004)
- Games with Incomplete Awareness (Feinberg, Yossi, 2005)
- A True Expert Knows which Question Should Be Asked (Feinberg, Yossi, 2004)
- More ...