As part of the broadening access to wellbeing thesis, we have been searching for projects that bring down the cost of therapy and coaching. In the US, 8% of adults and 12% of teens (and as many as 20% of teen girls) suffer from depression, but traditional therapy costs $75-$150 an hour, and online therapy services like BetterHelp and Talkspace cost $150-$300/month, prices that are unaffordable for most US households.
One way to approach this challenge is building AI chatbots that act as personal therapists. This is a great approach, but I also think there is an opportunity to build a new kind of therapeutic relationship between a human and a computer, one that looks different from the traditional relationship between a human patient and a human therapist. Such a model would lean on things a computer can do well that would be difficult for a human, such as ingesting large amounts of unstructured data and deriving insights from it.
One such model that seems potentially very interesting is software that passively ingests messaging, location, activity, sleep and phone usage and offers observations and guidance in a helpful (and privacy preserving) manner.
One example could be a calendar that passively notices you haven't been attending your scheduled plans and helpfully asks if you'd like to set aside time to journal or call a friend. Another could be a calendar that has access to your messages and suggests things to talk about with a friend before you go to meet them.
Another example could be a keyboard that offers inline observations about your mood and reminds you to stop and breathe if you’re about to send an angry message.
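To make the keyboard idea concrete, here is a minimal sketch of the kind of check such a keyboard might run locally before you hit send. The word list, function name, and threshold are all illustrative assumptions, not any real product's logic:

```python
# Hypothetical "pause before sending" check for a wellbeing keyboard.
# ANGRY_WORDS and the threshold are placeholder assumptions for illustration.
ANGRY_WORDS = {"hate", "furious", "ridiculous", "stupid", "worst"}

def should_pause_before_sending(draft: str, threshold: int = 1) -> bool:
    """Return True if the draft looks heated enough to suggest a breath."""
    words = draft.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?") in ANGRY_WORDS)
    return hits >= threshold

print(should_pause_before_sending("This is the worst, I hate it!"))  # True
print(should_pause_before_sending("See you at 6 for dinner?"))       # False
```

A real product would use an on-device sentiment model rather than a word list, but the shape is the same: the draft never needs to leave the phone for the nudge to work.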
A third example could be a fitness app that notices that you haven’t left your house for a few days and offers you a nearby activity you might like to walk over to.
People are building all sorts of passive observational guidance tools that could potentially be very impactful:
- Mei and Actual are both building messaging clients that offer advice for how to better connect with the people you talk to. Mei, for example, after watching my messaging threads, suggested I take more notice when my friends get worked up about something.
- Maslo and Scribe are both journaling apps that, over time, offer insights about your mood and surface common themes across your journal entries.
- Sonic Sleep Coach is an alarm clock that suggests personalized improvements for how you can get better sleep.
- Just Not Sorry is a Gmail plugin that helps you draft emails that more strongly communicate your thoughts and ideas.
I’m sure there are others.
Two benefits of a software-only solution are that software is stigma- and judgment-free, and it can be low cost for the user. However, it's important to ensure these observations and suggestions are helpful rather than hurtful, so such systems might start with, or even permanently keep, humans in the loop.
One big question is how to build therapeutic, personal software that is helpful without being creepy. The answer may be to set up technological or business structures that guarantee the user's trust. In a recent USV meeting, we discussed an idea from Muneeb Ali: the difference between a business whose mission is Don't Be Evil and a business whose structure means it Can't Be Evil.
One example of a Can't Be Evil technical structure is running data analysis and machine learning on device, so the company never has access to users' private information. Another is encrypting data on device, a la iMessage, so that even if data is stored on the company's servers, the company cannot read it.
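The on-device structure above can be sketched in a few lines: raw data stays in local storage, and only a small derived summary could ever be shared (or nothing at all). This is a minimal illustration with a hypothetical function name and word lists, not a real analysis pipeline:

```python
import re
from collections import Counter

# Hypothetical on-device mood analysis: raw journal text never leaves
# the device. The mood word lists are illustrative assumptions.
MOOD_WORDS = {
    "positive": {"happy", "grateful", "calm", "excited"},
    "negative": {"angry", "anxious", "sad", "tired"},
}

def local_mood_summary(entries):
    """Runs entirely on device; returns only aggregate counts, never raw text."""
    counts = Counter()
    for entry in entries:
        for word in re.findall(r"[a-z']+", entry.lower()):
            for mood, words in MOOD_WORDS.items():
                if word in words:
                    counts[mood] += 1
    return dict(counts)

# The raw entries stay local; only this tiny summary could ever be synced,
# and even that could be encrypted client-side first.
entries = [
    "Felt anxious before the meeting, but happy it went well.",
    "Tired today. Grateful for a calm evening walk.",
]
print(local_mood_summary(entries))  # {'negative': 2, 'positive': 3}
```

The design point is less the analysis itself than the data flow: because the insight is computed where the data lives, the company structurally cannot mine the raw text.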
One example of a Can't Be Evil business structure is a governance system that hands some control over to users. Another is a business model that aligns the users' interests with the business's, such as users paying directly for the product via a subscription, even a low-cost one like $1/year.
I'm curious to hear others' thoughts on therapeutic products that work through passive observation, and I'm interested to see what others are building. It seems important to build a world where anyone, anywhere can access tools for wellbeing, and finding new models to deliver those tools feels like a big piece of solving that access puzzle.