AI programs are like sponges – they soak up information and learn from it. But if they learn from biased data, the AI can also become biased.
Steven Schwartz, a lawyer at the law firm Levidow, Levidow & Oberman, relied on ChatGPT to help draft a court brief for a personal injury case. In the process, the AI model fabricated six non-existent court decisions, which Schwartz, unaware that AI-generated content can contain such errors, unwittingly included in his brief.