Hey Devs, happy Friday!
As always, it's been an eventful week in the tech industry. We've got a lot to talk about! And today I want to discuss...the ChatGPT lawsuits.
In case you missed it, OpenAI has been hit with its first defamation lawsuit for...hallucinating. Or, as the plaintiff might put it, generating false and defamatory information about him.
The case states that a journalist, Fred Riehl, asked ChatGPT to summarize a real federal court case by linking to an online PDF. ChatGPT responded by creating a false summary of the case that was detailed and convincing but wrong in several regards. ChatGPT's summary contained some factually correct information but also false allegations against the plaintiff, radio host Mark Walters. It said Walters was believed to have misappropriated funds from a gun rights non-profit called the Second Amendment Foundation "in excess of $5,000,000." Walters has never been accused of this.
In the US, lawsuits are a common method for pushing legislative changes and reform.
So, what do you think will happen here? Share your thoughts in the comments and let's discuss!
Want to submit a question for discussion, or even ask for advice? Visit Sloan's Inbox! You can choose to remain anonymous.
Top comments (6)
I can understand Walters being upset about the false information. But the false info was never published. The reporter did his due diligence, researched further, and so didn't publish it.
If ChatGPT gives everyone that asks about Walters this same false information, he may have a case, since it could damage his image. ChatGPT needs to be clearer that results are fabricated.
I hope at some point people will realise how AI works; with that knowledge they would be able to understand that AI always has an answer, even when there's no basis for it. And that's what happened here - ChatGPT blurted out the most fitting nonsense it came up with, because it is not capable of saying "I have no idea" unless it's explicitly programmed to do so.
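To make that point concrete, here's a toy sketch in plain Python (the vocabulary and logits are made up for illustration, not from any real model): a language model's final softmax always turns its logits into a probability distribution, so greedy decoding always emits *some* token. Nothing in the math lets the model abstain; "I don't know" only happens if that behavior is trained or programmed in.

```python
import math

def softmax(logits):
    # Normalize raw scores into a probability distribution.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical tiny vocabulary and near-uniform, essentially
# meaningless logits -- i.e. the model has no real signal.
vocab = ["yes", "no", "maybe", "unsure"]
logits = [0.01, 0.02, 0.00, 0.03]

probs = softmax(logits)
# The distribution still sums to 1, and argmax still picks a token:
# the model "answers" regardless of how little basis it has.
answer = vocab[probs.index(max(probs))]
print(answer)
```

The takeaway: an abstention like "I have no idea" is just another token sequence the model has to be steered toward; it's not a built-in fallback when confidence is low.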
I see what you mean, but I wouldn't go that far. Intelligence is a much more sophisticated concept than what we call AI, even with all the recent advancements, I think. While AI is capable of storing memories and making assumptions based on them, it is still missing analytical processing and emotional guidance in comparison to human intelligence.
Though please correct me if I misunderstood your comment!
ChatGPT has a pretty clear disclaimer that it may produce incorrect information. It's like suing OpenAI over some incorrect code written by ChatGPT. There would be no point in blaming them even if it had damaged the reporter's reputation.
So someone is just playing their game.
Then I agree wholeheartedly - apologies, I misunderstood you originally :)