Social workers’ AI tool makes ‘gibberish’ transcripts of accounts from children
TL;DR
AI transcription tools used by councils in England and Scotland are making potentially harmful errors in social work records, from bogus warnings of suicidal ideation to simple “gibberish”, frontline workers have said. Research across 17 English and Scottish councils shared with the Guardian found AI-generated hallucinations slipping into children’s records, despite Keir Starmer championing the “incredible” time-saving technology last year.
Nauti's Take
Nauti's take: AI transcription tools have failed to deliver on their promise, instead putting vulnerable children at risk with “gibberish” transcripts and false warnings of suicidal ideation. Keir Starmer had previously sung the praises of this technology, calling it “incredible”.
We need to rethink the role of AI in social work and prioritize accuracy over efficiency. These AI-generated hallucinations are a stark reminder that technology is only as good as its design and implementation.
Summary
Transcription tools used by councils in England and Scotland have been reported to wrongly indicate suicidal ideation. AI tools are making potentially harmful errors in social work records, from bogus warnings of suicidal ideation to simple “gibberish”, frontline workers have said. Keir Starmer last year championed what he called “incredible” time-saving social work transcription technology. But research across 17 English and Scottish councils shared with the Guardian has now found AI-generated hallucinations are slipping in.