Japan Cancels Billion-Yen AI Child Abuse Detection System After Failed Tests

Picture: metamorworks / PIXTA(ピクスタ)
The Children & Families Agency says it's giving up on the system after experts determined the agency didn't have enough data.


Some claim Japan is “living in the future.” As anyone who lives here and has had to deal with everything from paper forms to fax machines to Internet Explorer knows, that’s a bit far-fetched.

However, Japanese companies and even some government agencies have jumped – for better or worse – on the Artificial Intelligence (AI) bandwagon in a more or less timely fashion. Unfortunately, not all of these attempts to “do more with less” are panning out. That could put children – and others – at risk.

62% failure rate

Picture: rito / PIXTA(ピクスタ)

According to the Yomiuri Shimbun, Japan’s Children & Families Agency spent 1 billion yen (roughly USD $6,684,000) to build what the paper describes as an AI-powered system to detect child abuse. (Based on its description, the system sounds more like a “traditional AI” system – i.e., a Machine Learning system meant to recognize patterns in data – as opposed to a Generative AI system that creates new media.)

The system wasn’t intended to replace humans. Instead, it was meant to help experts who were caring for kids taken into temporary custody determine whether it was unsafe to let them return to their parents.

The Agency trained the system on 5,000 cases of confirmed child abuse. Based on information provided on new cases, the system was supposed to judge the likelihood of abuse based on 91 data points, including injuries found on the child and the parents’ attitude, using a scoring system of 0 to 100.
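To make the described approach concrete, here is a minimal, purely illustrative sketch of a risk-scoring model of this kind: each case is reduced to numeric features (the article mentions 91 data points, such as injuries found on the child and the parents’ attitude) and mapped to a 0-to-100 score. The feature names, weights, and logistic formula below are invented for illustration – the article does not say how the actual system computed its scores.

```python
import math

# Hypothetical feature weights -- NOT the real system's parameters.
FEATURE_WEIGHTS = {
    "visible_injuries": 2.5,    # physical evidence weighs heavily...
    "child_testimony": 0.3,     # ...while testimony alone barely moves the score
    "parental_hostility": 1.2,
    "recent_weight_loss": 1.8,
}
BIAS = -6.0  # baseline that pushes scores toward "low risk"

def risk_score(case: dict) -> float:
    """Logistic score scaled to 0-100, mimicking the reported 0-100 scale."""
    z = BIAS + sum(w * case.get(name, 0.0) for name, w in FEATURE_WEIGHTS.items())
    return 100.0 / (1.0 + math.exp(-z))

# A case with strong testimony but no physical evidence scores near zero --
# echoing the failure mode described later in this article.
print(round(risk_score({"child_testimony": 1.0}), 1))
```

Note how a model weighted toward physical evidence can assign a near-zero score to a testimony-only case, which is one plausible mechanism behind the behavior the Agency observed.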

Unfortunately for the government, the system didn’t pass muster. A team ran the AI system on 100 different cases where investigators determined a risk of abuse – only for the system to flag 62 of those cases as “substantially low risk.”

Digging into the details, one of the issues is that the system seemed to discount testimony in the absence of physical evidence. For example, one child said their mom repeatedly bashed their head onto the floor, beating them “half to death.” The system gave this case a score of 2 to 3 out of 100.

Experts told the Agency that it would be difficult for the system to produce reliable scores given the many different forms abuse can take. They also said 5,000 cases weren’t nearly enough data to train the system adequately. The Agency itself admitted that it wasn’t set up to properly capture certain data points, such as a drop in a child’s weight or new injuries, that serve as signs of abuse.

Based on this feedback and other difficulties, the Agency announced it would abandon the project. Instead of helping kids, the system will serve as a one-billion-yen lesson.

Case highlights the reliability risks of AI and Machine Learning

Japan is wrestling with labor shortages in every industry as its population ages and declines. Child protection authorities have also faced charges from victims’ advocates that Japan too often puts children in danger by caving in to parental demands to return kids to dangerous environments.

The advent of Generative AI via Large Language Models (LLMs) such as those from OpenAI has companies scrambling to add “AI” features to their software systems – whether they need them or not. The Children & Families Agency’s use case was a potentially useful application that could have assisted investigators.

However, it encountered a common problem with all Machine Learning & AI use cases: a lack of high-volume, high-quality data. Reports from the tech industry consistently show data quality is a significant blocker to Machine Learning and GenAI initiatives.

This means that Japan, for now, will need to find some other way to supplement its workforce to support essential services. The problem isn’t limited to child services, either. The city of Fukuoka says some people are experiencing two-hour waits to talk to someone on its suicide hotline. The line is staffed by volunteers, 80% of whom are senior citizens.

