Clinically Safe Matching  ·  Privacy by Architecture

Where Healing
Finds Its Match

The first therapy matching platform built around clinical safety — not engagement metrics. Your therapist match is determined by outcomes, not algorithms optimised for revenue.

3
Granted Patents
2 sec
Crisis Response
Zero
Privacy Violations
AU
First-of-Kind Innovation
The Industry Problem

Therapy platforms optimise for engagement. Not outcomes.

⚠️

5–8% of therapy outcomes are harmful

No existing platform detects harmful therapy patterns before the damage is done. They're optimised for session volume, not clinical safety.

🔓

Privacy violations are industry-wide

BetterHelp's $7.8M FTC fine confirmed what practitioners have long known: your most sensitive data is being sold. No platform had architecturally prevented this, until LUME.

🔄

Provider switching is trivially easy — by design

Competitors encourage switching because it drives bookings. Therapeutic continuity — the clinical cornerstone of recovery — is deliberately ignored.

$7.8M

The FTC fine that exposed a privacy problem built into the industry by design.

BetterHelp · FTC · 2023
Category-Defining Innovations

Built on three
granted patents

Every core feature is protected by granted patents; no competitor has filed anything comparable. LUME is the only platform that treats therapy as a safety problem, not a scheduling problem.

🛡️
Patent Protected
Clinical Safety Preamble

C-SSRS, PHQ-9, K10, GAD-7 clinical thresholds override ALL matching logic when crisis is detected. Crisis resources delivered within 2 seconds — architecturally guaranteed, not best-effort.
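
In code terms, the preamble is a gate that runs before any matching logic and short-circuits it entirely when a threshold is crossed. The sketch below is illustrative only: the cut-off values and the routing shape are assumptions for this example, not LUME's actual clinical thresholds (those come from each instrument's validated scoring manual).

```python
from dataclasses import dataclass

# Illustrative cut-offs only -- NOT clinical guidance and not LUME's
# real thresholds. Real values come from each instrument's manual.
CRISIS_THRESHOLDS = {
    "C-SSRS": 3,   # suicidal ideation severity
    "PHQ-9": 20,   # severe depression
    "GAD-7": 15,   # severe anxiety
    "K10": 30,     # very high psychological distress
}

@dataclass
class Assessment:
    scores: dict  # instrument name -> score

def route(assessment: Assessment) -> dict:
    """Safety preamble: crisis checks run BEFORE any matching logic."""
    for instrument, cutoff in CRISIS_THRESHOLDS.items():
        score = assessment.scores.get(instrument)
        if score is not None and score >= cutoff:
            # Short-circuit: crisis resources are returned immediately
            # and matching never runs for this request.
            return {"action": "crisis_resources", "trigger": instrument}
    return {"action": "match"}
```

Because the crisis check is the first branch in the request path rather than a post-hoc filter, no matching preference can ever outrank it — that is what "overrides ALL matching logic" means structurally.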

🔍
Patent Protected · Exclusive Technology
Harmful Therapy Detection

NLP detects harmful therapy outcome patterns before re-matching. A first-of-kind innovation with no comparable technology in the market.

🌱
Patent Protected
Therapeutic Continuity Engineering

Intentional, clinically-justified friction when switching providers. Counterintuitive for a marketplace — and impossible to replicate without accepting reduced engagement metrics.
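
Engineered friction can be as simple as refusing an instant re-book. The sketch below is a hypothetical illustration of the idea: the session threshold and step names are invented for this example, not LUME's actual policy.

```python
def request_switch(reason: str, sessions_completed: int) -> dict:
    """Intentional friction on provider switching.

    Thresholds and step names here are illustrative assumptions,
    not LUME's real clinical policy.
    """
    if not reason.strip():
        # A stated reason is required before anything else happens.
        return {"allowed": False, "next_step": "provide_reason"}
    if sessions_completed < 3:
        # Early switches route through a brief continuity check-in
        # rather than an instant re-book -- the opposite of
        # engagement-optimised marketplace UX.
        return {"allowed": False, "next_step": "continuity_check_in"}
    return {"allowed": True, "next_step": "rematch"}
```

The point of the sketch: each early switch costs one extra clinically justified step, which is exactly the "reduced engagement metric" a booking-driven competitor cannot afford to copy.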

🔒
Architecture · Zero Knowledge
Privacy-by-Architecture

Encrypted matching vectors. Session content is architecturally isolated from the matching system. LUME is structurally incapable of the privacy violation that cost BetterHelp $7.8M — not by policy, but by design.
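
"Structurally incapable" can be expressed at the type level: the matching service only accepts opaque ciphertext, so session content cannot even be passed in by mistake. This is a minimal sketch of that separation; the similarity placeholder stands in for a real encrypted-similarity scheme and is not LUME's actual implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EncryptedVector:
    """Opaque ciphertext of a preference vector. The matching service
    holds no decryption key; it can only compare blobs it is given."""
    blob: bytes

@dataclass(frozen=True)
class SessionRecord:
    """Session content lives in a separate store. Deliberately, no
    method here exports content toward the matching layer."""
    _content: str

class MatchingService:
    # The only input type this service accepts is EncryptedVector,
    # so a session transcript cannot flow into matching by accident.
    def score(self, a: EncryptedVector, b: EncryptedVector) -> float:
        # Placeholder byte-wise similarity over ciphertexts; a real
        # system would use a proper encrypted-similarity scheme.
        matches = sum(x == y for x, y in zip(a.blob, b.blob))
        return float(matches) / max(len(a.blob), 1)
```

Policy says "we won't look at your sessions"; architecture, as sketched here, means the matching code has no type through which it could.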

The Process

Three steps to a
clinically safe match

1

Safety-First Assessment

Validated clinical instruments (C-SSRS, PHQ-9, GAD-7) establish your baseline. Crisis detection is live throughout — not a checkbox.

2

Intelligent Matching

Encrypted preference vectors match you to therapists whose outcomes data, specialisations, and cultural competencies align with your needs.

3

Ongoing Protection

Harmful therapy detection runs continuously. Therapeutic continuity engineering keeps your relationship intact. Your progress is protected, not gamified.
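
The three steps above compose into one flow in which assessment gates matching and monitoring runs afterwards. This is a schematic sketch under assumed function names, not LUME's actual service interfaces.

```python
def clinically_safe_match(client, therapists, assess, match, monitor):
    """Three-step flow: safety-first assessment gates matching;
    ongoing protection runs after a match is made.

    `assess`, `match`, and `monitor` are assumed callables standing
    in for the real services described above.
    """
    result = assess(client)                  # Step 1: safety-first assessment
    if result.get("action") == "crisis_resources":
        return result                        # crisis path bypasses matching
    therapist = match(client, therapists)    # Step 2: intelligent matching
    monitor(client, therapist)               # Step 3: ongoing protection
    return {"action": "matched", "therapist": therapist}
```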

Connected Intelligence

LUME doesn't operate in isolation. Through the SINGULARITY engine, its therapeutic intelligence flows across connected platforms — because mental health doesn't exist in a vacuum.

LUSTRE

Beauty and wellness are interconnected

When LUME detects elevated stress signals, a self-care recommendation through LUSTRE gives you something tangible and nurturing — wellness that you can see and feel.

KINSHIP

Family therapy and relationship health

KINSHIP surfaces household conflict signals. LUME connects families with couples counsellors and family therapists who specialise in exactly those dynamics.

WELLSPRING

Financial stress triggers wellness support

Financial stress is one of the strongest predictors of relationship breakdown and anxiety. When WELLSPRING detects financial distress, LUME surfaces financial therapy specialists — not generic counsellors.

Early Access

Be first when
LUME launches

Launching in Australia first. Join the waitlist to get early access for yourself, or to list your practice as a LUME therapist partner.

No spam. No data sharing. Ever. That's not just a promise — it's the architecture.