When AI Aids Decisions, When Should Humans Override?
Accuracy Is Not Higher When Explanations Are Provided, Compared to When They Are Not
The $184 billion market for artificial intelligence shows no signs of slowing down. In theory, a human collaborating with an AI system should make better decisions than either working alone. But a recent study suggests that human-AI teams are no more accurate when the AI's recommendations come with explanations than when they do not.
This finding has implications for how humans and AI systems should share decision-making.
As AI becomes more sophisticated, it will increasingly be used to inform important decisions, and in some domains it may be more accurate than humans. The study's findings suggest, however, that humans should not trust AI blindly: they should be prepared to override an AI's recommendation when they believe it is making a mistake.
The study was conducted by researchers at the University of California, Berkeley, who recruited 100 participants for a series of experiments. In each experiment, participants made a decision about a hypothetical scenario with the help of an AI recommendation. In some experiments, participants were shown explanations for the AI's decisions; in others, they were not.
The results showed that participants were no more accurate when they were given explanations for the AI's decisions, suggesting that explanations, at least as provided here, did not help participants judge when to rely on the AI. The researchers did find, however, that participants who received explanations were more likely to override the AI's decisions.
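To make the headline comparison concrete, the sketch below shows one common way an accuracy comparison between two groups can be tested: a chi-squared test on a contingency table of correct and incorrect decisions per condition. The counts here are invented purely for illustration; they are not the study's data, and the study's actual analysis method is not described in this article.

```python
# A minimal sketch of an accuracy comparison between two conditions.
# All counts are hypothetical, chosen only to illustrate the method.
from scipy.stats import chi2_contingency

# Rows: condition; columns: correct vs. incorrect decisions (invented).
observed = [
    [38, 12],  # with explanations: 38 correct, 12 incorrect
    [36, 14],  # without explanations: 36 correct, 14 incorrect
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.3f}, p = {p_value:.3f}")
# A large p-value would mirror the study's headline result:
# no detectable accuracy difference between the two conditions.
```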
This suggests that explanations may help humans identify when an AI is making a mistake. It is worth noting, however, that the study did not examine the quality of the explanations; it is possible that the explanations provided were simply not informative enough to improve accuracy.