Experts say AI chatbots create falsehoods because they work by predicting the next word. These AI platforms can generate wild inaccuracies, known as 'hallucinations'.
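The "word prediction" mechanism can be illustrated with a toy sketch. This is a tiny word-pair model in Python, not how any real chatbot is built, but it shows the core failure mode: a sentence can be assembled entirely from transitions seen in training yet state something the training text never said.

```python
# Toy sketch of next-word prediction: count which word follows which in a
# tiny corpus, then judge a sentence "plausible" if every word-to-word step
# was seen in training. Real chatbots use neural networks over vast text,
# but the failure mode is the same: fluent transitions, no fact check.

corpus = ("the court convicted the fraudster . "
          "the journalist reported on the fraudster").split()

# word -> list of words observed to follow it
successors = {}
for current, nxt in zip(corpus, corpus[1:]):
    successors.setdefault(current, []).append(nxt)

def is_plausible(sentence):
    """True if every adjacent word pair in the sentence occurred in the corpus."""
    words = sentence.split()
    return all(b in successors.get(a, []) for a, b in zip(words, words[1:]))

# The corpus never says the journalist was convicted, yet the sentence below
# is built entirely from word pairs the model has seen, so it scores as
# perfectly plausible -- a miniature "hallucination".
print(is_plausible("the court convicted the journalist"))  # True
print("convicted the journalist" in " ".join(corpus))      # False
```

Fluency and truth come apart here in exactly the way described in the segment: the journalist is strongly associated with court and crime vocabulary, so statistically likely text can blur reporter and perpetrator.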

Transcript
00:00 Well, Martin Bernklau is a journalist who lives in the city of Tübingen, which is in
00:07 the south of Germany, and for decades he's been reporting on the local courts.
00:12 About a year ago, he pivoted his journalism into arts reporting, and he was curious to
00:18 know how his arts reviews were tracking.
00:21 Now his Microsoft system suggested that he use Copilot to do the search; Copilot is the
00:30 AI platform attached to Bing, Microsoft's search engine.
00:36 So he starts dialoguing with Copilot, and he gets a huge shock, because Copilot tells
00:44 him that Martin Bernklau is a convicted child abuser, has been convicted of defrauding the
00:50 elderly, has escaped from a psychiatric institution, and is guilty of many, many other crimes.
00:59 Now what appears to have happened is that these are all crimes and court cases that
01:04 over the years Martin Bernklau had reported on, and when the
01:11 AI platform scraped the internet, it came to totally wrong conclusions, or hallucinations,
01:17 about who Martin Bernklau is.
01:20 And just another detail: not only did it say this is who he is, it also gave his telephone
01:26 number, his address, and a route map to his house.
01:29 My goodness, so what did he do when he discovered these hallucinations?
01:34 Well, he took a couple of steps.
01:36 He approached the prosecutor in his local town.
01:39 I don't think that step led to anything happening.
01:43 He also approached the German Office of Data Protection, and that office contacted Microsoft's
01:51 European HQ in Dublin, and they said they would look at it.
01:56 But subsequent to that contact, similar searches resulted in the same kind of slanders or inaccuracies,
02:04 according to Martin.
02:06 But Martin tells me just in recent weeks, when you type his name into this platform,
02:12 nothing comes up at all.
02:13 And it would appear, according to Martin, that what has happened is the platform can't
02:18 deal with the inaccuracies, so it's just kind of scraped him from the system totally and
02:23 won't respond to any queries about him.
02:25 So have there been legal cases arising from these types of AI hallucinations, Damian?
02:30 Well, Martin certainly is considering it, but when he talked to a lawyer in Germany, the
02:37 lawyer said, look, we don't know how long it would take to resolve, whether you'd get
02:43 a good outcome, and it would be very expensive.
02:46 Have there been other cases?
02:47 Well, here in Australia, just earlier this year, what was probably going to be the big
02:54 world test case was dropped.
02:56 That was being brought by the mayor of the Hepburn Shire Council in regional Victoria.
03:02 His name is Brian Hood.
03:05 And he dropped what was believed to be the first test case in the world because it was
03:10 too expensive.
03:11 Now what had happened to Brian Hood was he typed his name into ChatGPT, and it described
03:16 him as a convicted criminal, a fraudster.
03:19 In fact, Brian Hood is the polar opposite.
03:22 He is a very highly respected whistleblower who came forward when he realised that there
03:27 was something wildly amiss at a subsidiary of the Reserve Bank of Australia.
03:32 And he's widely respected for having brought that criminal behaviour to light.
03:38 So he was very concerned, but largely because of the cost
03:43 associated with uncertain litigation, he decided to drop that test case.
03:48 Are there any legal actions happening elsewhere in the world?
03:52 Well, the one that is going ahead is, perhaps not surprisingly, in the more litigious United
03:58 States.
03:59 And that involves a radio and podcast talk show host called Mark Walters.
04:04 And he's suing OpenAI, the company that owns ChatGPT.
04:08 Now Walters hosts a program called Armed American Radio, which discusses and promotes gun ownership
04:16 in the USA.
04:18 And ChatGPT had claimed that he was actually being sued by an organisation
04:25 called the Second Amendment Foundation, which is a US-based organisation that also supports
04:30 gun rights.
04:32 Somehow ChatGPT had hallucinated that he was being sued for defrauding this organisation,
04:38 when in fact he'd had nothing to do with it; he'd never worked for it.
04:42 And this was just a total hallucination.
04:45 OpenAI has tried to have this action dismissed, but it is going through the courts.
04:49 So it will be very interesting to see what happens.
04:52 And it may prove to be the test case that we're all waiting for around the world.
