A Manager and an AI Walk into a Bar: Does ChatGPT Make Biased Decisions Like We Do?
Large language models (LLMs) such as ChatGPT have garnered global attention recently, promising to disrupt and revolutionize business operations. As managers rely more on artificial intelligence (AI) technology, there is an urgent need to understand whether AI decision-making exhibits systematic biases, since these models are trained on human data and feedback, both of which may be highly biased. This paper tests a broad range of behavioral biases commonly found in humans that are especially relevant to operations management. We find that although ChatGPT can be much less biased and more accurate than humans on problems with an explicit mathematical/probabilistic nature, it also exhibits many of the biases humans possess, especially when problems are complicated, ambiguous, and implicit. It may suffer from conjunction bias and probability weighting. Its preferences can be influenced by framing, the salience of anticipated regret, and the choice of reference point. ChatGPT also struggles to process ambiguous information and evaluates risks differently from humans. It may also produce responses similar to the heuristics employed by humans, and is prone to confirmation bias. To make these issues worse, ChatGPT is highly overconfident. Our research characterizes ChatGPT's behavior in decision-making and showcases the need for researchers and businesses to consider potential AI behavioral biases when developing and employing AI for business operations.
Year of publication: | 2023 |
Authors: | Chen, Yang ; Andiappan, Meena ; Jenkin, Tracy ; Ovchinnikov, Anton |
Publisher: | [S.l.] : SSRN |
Availability: | freely available |
Extent: | 1 online resource (30 p.) |
Type of publication: | Book / Working Paper |
Language: | English |
Notes: | According to information from SSRN, the original version of the document was created on March 6, 2023 |
Other identifiers: | 10.2139/ssrn.4380365 [DOI] |
Source: | ECONIS - Online Catalogue of the ZBW |
Persistent link: https://www.econbiz.de/10014259866
Similar items by person
- Jenkin, Tracy, (2023)
- Jenkin, Tracy, (2024)
- Chen, Yang, (2019)