Some claim Japan is “living in the future.” As anyone who lives here and has had to deal with everything from paper forms to fax machines to Internet Explorer knows, that’s a bit far-fetched.
However, Japanese companies and even some government agencies have jumped – for better or worse – on the Artificial Intelligence (AI) bandwagon in a more or less timely fashion. Unfortunately, not all of these attempts to “do more with less” are panning out. That could put children – and others – at risk.
62% failure rate

According to Yomiuri Shimbun, Japan’s Children & Families Agency spent 1 billion yen (about USD $6.68 million) to build what Yomiuri describes as an AI-powered system to detect child abuse. (Based on its description, the system sounds more like a “traditional AI” system – i.e., a Machine Learning system – meant to recognize patterns in data, as opposed to a Generative AI system that creates new media.)
The system wasn’t intended to replace humans. Instead, it was meant to help experts who were caring for kids taken into temporary custody determine whether it was unsafe to let them return to their parents.
The Agency trained the system on 5,000 cases of confirmed child abuse. Based on information provided on new cases, the system was supposed to judge the likelihood of abuse based on 91 data points, including injuries found on the child and the parents’ attitude, using a scoring system of 0 to 100.
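The article doesn’t say how the Agency’s model was built, but a risk scorer of this shape – dozens of numeric inputs in, a 0-to-100 score out – is typically a supervised classifier whose predicted probability gets rescaled to the scoring range. Below is a minimal sketch of that idea using a tiny logistic regression. The two features and all the training data are invented stand-ins for the 91 real data points; nothing here reflects the actual system.

```python
import math

def train_logreg(X, y, lr=0.1, epochs=500):
    """Tiny logistic regression trained with per-sample gradient descent."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of abuse
            err = p - yi
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

def risk_score(w, b, x):
    """Rescale the model's probability to the 0-100 range described above."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return round(100 / (1.0 + math.exp(-z)))

# Hypothetical stand-ins for two of the 91 data points:
# [visible_injury (0/1), hostile_parent_attitude (0/1)]
X = [[1, 1], [1, 0], [0, 1], [0, 0]] * 10
y = [1, 1, 0, 0] * 10          # toy labels: 1 = confirmed abuse
w, b = train_logreg(X, y)

# A case with the injury marker present scores high; one without scores low.
high = risk_score(w, b, [1, 1])
low = risk_score(w, b, [0, 0])
```

The failure mode the article describes follows directly from this design: if abuse that leaves no physical evidence is underrepresented in the 5,000 training cases, a model like this will assign such cases low scores no matter how credible the testimony is.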
Unfortunately for the government, the system didn’t pass muster. A team ran it on 100 cases where investigators had determined there was a risk of abuse – only for the system to flag 62 of them as “substantially low risk.”
Digging into the details, one of the issues is that the system seemed to discount testimony in the absence of physical evidence. For example, one child said their mom repeatedly bashed their head onto the floor, beating them “half to death.” The system gave this case a score of 2 to 3 out of 100.
Experts told the Agency that it would be difficult for the system to compute a reliable score given the different forms that abuse can take. They also said 5,000 cases weren’t nearly enough to train the system adequately. The Agency itself admitted that it wasn’t set up to properly capture certain data points, such as a drop in a child’s weight or injuries, that serve as signs of abuse.
Based on this feedback and other difficulties, the Agency announced it would abandon the project. Instead of helping kids, the system will serve as a one-billion-yen lesson.
Case highlights the reliability risks of AI and Machine Learning
Japan is wrestling with labor shortages in every industry as its population ages and declines. Child protection authorities have also faced charges from victims’ advocates that Japan too often puts children in danger by caving in to parental demands to return kids to dangerous environments.
The advent of Generative AI via Large Language Models (LLMs) such as those from OpenAI has companies scrambling to add “AI” features to their software systems – whether they need them or not. The Children & Families Agency’s use case was a potentially useful application that could have assisted investigators.
However, it ran into a problem common to Machine Learning and AI projects alike: a lack of high-volume, high-quality data. Reports from the tech industry consistently show data quality is a significant blocker to Machine Learning and GenAI initiatives.
This means that Japan, for now, will need to find some other way to supplement its workforce to support essential services. The problem isn’t limited to child services, either. The city of Fukuoka says some people are experiencing two-hour waits to talk to someone on its suicide hotline. The line is staffed by volunteers, 80% of whom are senior citizens.