Does Conjoint Analysis Mitigate Social Desirability Bias?
How can we elicit truthful responses in surveys? Political scientists are often concerned about systematic misreporting on sensitive survey topics, known as social desirability bias (SDB). Conjoint analysis has become a popular tool for addressing this concern, despite a lack of systematic evidence that it is suited to this purpose. We employ a novel experimental design to identify the extent to which a fully randomized conjoint design mitigates SDB. Specifically, we compare a standard conjoint design against a partially randomized design in which only the sensitive attribute varies between the two profiles in each task. Our design also includes a placebo condition to remove confounding from the increased attention drawn to the varying attribute under the partially randomized design. We implement this experiment in a survey on respondents' attitudes toward environmental conservation. We find suggestive evidence that conjoint analysis does, indeed, mitigate SDB.
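
To fix ideas, the sketch below simulates the core design comparison: the effect of a sensitive attribute on profile choice is estimated separately under a fully randomized conjoint and under a partially randomized design where only that attribute differs within each pair. This is an illustrative assumption-laden simulation, not the paper's code: the attribute names, effect sizes, and SDB mechanism are hypothetical, and the placebo condition is omitted.

```python
# Hypothetical simulation: compare the estimated effect of a sensitive attribute
# on profile choice under a fully vs. partially randomized forced-choice conjoint.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_tasks = 5000

def simulate(design):
    # Sensitive attribute: 1 = pro-conservation policy, 0 = not (hypothetical coding).
    sens_a = rng.integers(0, 2, n_tasks)
    sens_b = rng.integers(0, 2, n_tasks)
    if design == "full":
        # Fully randomized: non-sensitive attributes also differ between profiles.
        other_a = rng.integers(0, 2, n_tasks)
        other_b = rng.integers(0, 2, n_tasks)
    else:
        # Partially randomized: non-sensitive attributes identical within the pair.
        other_a = other_b = rng.integers(0, 2, n_tasks)
    true_pref = 0.2  # respondents' true weight on the sensitive attribute (assumed)
    # Assumed SDB mechanism: extra weight on the socially desirable level when it is
    # the only attribute that can differ, making the desirable choice conspicuous.
    sdb = 0.3 if design == "partial" else 0.0
    util_a = (true_pref + sdb) * sens_a + 0.5 * other_a + rng.logistic(size=n_tasks)
    util_b = (true_pref + sdb) * sens_b + 0.5 * other_b + rng.logistic(size=n_tasks)
    chose_a = (util_a > util_b).astype(int)
    return pd.DataFrame({"chose_a": chose_a, "sens_diff": sens_a - sens_b})

for design in ("full", "partial"):
    df = simulate(design)
    # Linear probability model of choosing profile A on the within-pair difference
    # in the sensitive attribute (a simple stand-in for an AMCE-style estimate).
    fit = smf.ols("chose_a ~ sens_diff", data=df).fit()
    print(design, round(fit.params["sens_diff"], 3))
```

In this stylized setup, a larger estimated coefficient in the partially randomized arm than in the fully randomized arm would be consistent with the sensitive attribute attracting socially desirable responding when it is made salient, which is the kind of contrast the experiment is built to detect.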