In today’s increasingly digital health landscape, the promise of patient engagement software is clear: reduce no-shows, streamline intake, improve patient outcomes, and, perhaps most critically, retain patients. As a therapist and social worker, I’ve recently been exploring platforms like Morf, which offer elegant, user-friendly solutions for managing client flow and contact points, from scheduling to automated follow-ups.
But I’m also curious about what lies beneath the surface. Is “engagement” always as friendly and supportive as the word implies? Or can it take on more coercive or punitive dimensions—ones that subtly pressure patients to conform to systems that may or may not serve their individual needs?
When Engagement Has Teeth
We often think of engagement as a soft concept: encouragement, rapport-building, gentle support. But true engagement, especially in systems designed to drive outcomes, sometimes requires teeth. What happens when a client continuously misses appointments? When a digital nudge becomes a warning? When the language of support shades into the language of compliance?
A well-designed engagement platform may include things like:
- Automated check-ins that encourage participation.
- Reminders that feel personal but are powered by algorithms.
- Retention metrics that could be used to evaluate both patients and providers.
But these tools, while helpful, can also create an atmosphere of subtle surveillance or pressure. For some clients—especially those with trauma histories or complex relationships to authority—the “nudging” might feel more like pushing.
Holding Patients Accountable—or Enforcing Norms?
As clinicians, we walk a delicate line. We want to help clients stay engaged in treatment, but we also want to respect autonomy. Patient engagement software offers powerful tools—but how we use them matters.
Could these tools reinforce inequities, or pathologize disengagement? Are there ways to use software that honor the messiness of real life, rather than streamlining it out of the picture?
Beyond Efficiency: Building Ethical Infrastructure
I believe the future of mental health technology should involve deeper conversations about how we engage—not just how often. Can we design systems that are flexible, relational, and trauma-informed? Can software serve as an extension of therapeutic values rather than a silent enforcer of productivity metrics?
I’m looking forward to learning more about Morf, and exploring whether it can support a more just, humane, and meaningful kind of engagement—one that values both connection and consent.
Max E. Guttman is the owner of Mindful Living LCSW, PLLC, a private mental health practice in Yonkers, New York.