It’s not easy to get depression-detecting AI through the FDA

TL;DR

California-based startup Kintsugi spent seven years building AI that detects signs of depression and anxiety from how someone speaks – not what they say, but vocal patterns.
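The article doesn't describe Kintsugi's actual pipeline, but a minimal sketch of what "vocal patterns, not what they say" can mean in practice is extracting content-independent acoustic features (pitch, loudness, spectral shape) from a recording and handing them to a classifier. The example below uses the open-source librosa library and is purely illustrative; the feature choices and the downstream model are assumptions, not Kintsugi's method.

```python
# Illustrative sketch only: summarize HOW someone speaks, ignoring the words.
# This is an assumed feature set, not Kintsugi's actual pipeline.
import numpy as np
import librosa

def acoustic_features(path: str) -> np.ndarray:
    """Return a vector of prosodic/spectral statistics for one recording."""
    y, sr = librosa.load(path, sr=16000, mono=True)

    # Pitch contour (fundamental frequency) via probabilistic YIN.
    f0, _, _ = librosa.pyin(y, fmin=65.0, fmax=500.0, sr=sr)
    f0 = f0[np.isfinite(f0)]          # drop unvoiced (NaN) frames
    if f0.size == 0:
        f0 = np.zeros(1)

    # Loudness / energy envelope.
    rms = librosa.feature.rms(y=y)[0]

    # Spectral envelope summary (timbre), independent of word content.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

    return np.concatenate([
        [f0.mean(), f0.std()],        # average pitch and pitch variability
        [rms.mean(), rms.std()],      # average loudness and its variation
        mfcc.mean(axis=1), mfcc.std(axis=1),
    ])

# A downstream classifier (e.g. logistic regression over these vectors)
# would then map each recording to a screening score.
```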

Key Points

  • After failing to secure FDA clearance in time, the company is shutting down and open-sourcing most of its technology.
  • Some components may find new life outside healthcare, including deepfake audio detection.
  • Mental health diagnosis still relies largely on questionnaires and clinical interviews rather than objective lab-style tests.

Nauti's Take

Seven years of work, a genuine clinical gap identified – and it ends on regulatory timing. This isn't a technology failure; it's a structural one.

Bringing clinical AI to market requires not just good models but deep pockets and patience for FDA processes that most startups simply can't sustain. The open-source decision deserves credit – too often these efforts vanish without a trace.

The potential pivot to deepfake detection is also a reminder that solid foundational research tends to find its uses, even when the original application doesn't make it.

Sources