Denise Fergus says current laws do not protect families from AI abuse.
Transcript
00:00 Denise Fergus, whose two-year-old son James Bulger was murdered in 1993, says AI-generated
00:08videos showing a digital version of her son talking about his death are deeply traumatising.
00:14She believes the current protections under the Online Safety Act do not go far enough
00:19across Merseyside and indeed the UK. Platforms like TikTok, YouTube and Instagram are under
00:26growing pressure to act swiftly on harmful content, particularly when it comes to real-life
00:32crimes. Campaigners argue that existing laws lack clarity and enforcement power. Ofcom is now
00:39assessing how well platforms are complying with the new rules, but the regulator cannot force them
00:45to remove specific content. AI-generated avatars imitating real people, especially children,
00:53are raising urgent legal and ethical questions. While such material may breach UK law, enforcement
01:00often depends on whether platforms take decisive, timely action. Critics say that's still too patchy
01:08and reactive. Experts have warned there's a serious gap in how synthetic media is defined
01:14and handled. The Online Safety Act targets harmful material broadly, but doesn't mention AI-generated
01:21content. Campaigners say that until those definitions are set in law, families like Denise
Fergus's risk being traumatised every time they go online. Now the government says it is committed
01:34to protecting users, but victims' families are calling for stronger, more specific legislation
01:40to reflect the harms new technology can inflict.