In the area of mental health services for Coloradans, at least three forces are altering the landscape at once. One is the mandate that insurers must offer parity between mental and physical health coverage. Another is the severe ongoing shortage of mental health professionals to meet growing patient demand—including the increasing numbers of older Coloradans who are dealing with mental health challenges. And a third force is the emergence of various technology “substitutes” for patients to consult in lieu of seeing professional therapists in person.

One question this raises is how well, or not so well, tech tools perform as mental health “therapists.”

NAMI Colorado (National Alliance on Mental Illness) has taken steps to help people navigate what are known as mental health apps. As startling as it sounds, the Kaiser Family Foundation (KFF) reports that over the past few years, 10,000 to 20,000 apps have entered the mental health space “offering to disrupt traditional therapy.” Thanks to innovations in AI (artificial intelligence), KFF anticipates that chatbots offering mental health care will proliferate as well.

As for the already existing apps, NAMI notes that many of them promise to improve your mood, decrease your anxiety or connect you with a live therapist. “Some even promise to boost your happiness by a certain percentage in just a few days,” says NAMI, and such bold claims are common. “But how well do these tools deliver on their flashy promises? And are they readily accessible for people with mental illness seeking digital support?”

Drawbacks for Coloradans to be aware of when using apps and chatbots for mental health “therapy”

KFF states that in the mental health space, evidence of the apps’ effectiveness is lacking, and “few of the many apps on the market have independent outcomes research showing they help; there is evidence that some can do harm. Most haven’t been scrutinized by the FDA, even while being marketed to treat conditions such as anxiety, attention-deficit/hyperactivity disorder, and depression.” The FDA has, in fact, signaled that it intends to step up its oversight of mental health apps and to vet them as medical devices. So far, not one has been FDA-approved, and only a few have received the agency’s “breakthrough” device designation, which fast-tracks studies and reviews.

KFF further notes that many apps warn users, typically in small print, that the app “is not intended to be a medical, behavioral health or other health care service.”

NAMI points to research at Harvard Medical School that suggests there are issues and limitations that app users need to be aware of. Some apps may be an effective supplement to mental health treatment if users find the right program to fit their individual needs, “but before downloading mental health apps, users must consider the drawbacks of relying on digital care.” NAMI lists the following:

Accessibility. Although app searches for selected mental health problems have risen dramatically in the last year, some users face accessibility barriers. Features in the app may be hidden behind a paywall, meaning you have to pay a hefty subscription fee to access them. Apps claiming to be “free” may be offering only a limited version to entice you to sign up for the paid version.

Consistency. App creators often have difficulty retaining users over time, which makes it hard for an individual user to track symptoms and self-manage their condition. Mental health treatment very often requires consistency over time to be effective.

Specificity. Perhaps the most concerning challenge surrounding mental health apps, NAMI states, is the lack of specialized support for commonly searched mental health conditions. “Our research found that an app store search using terms like ‘schizophrenia’ or ‘bipolar disorder’ does not return a significant number of apps that offer specialized education or support for these conditions.”

Credibility. How do apps document the efficacy of their features? Users need to be aware that some mental health apps make exaggerated claims, describing themselves as “evidence based” when they are not. Apps may also claim to follow legitimate treatments that help people when, in some cases, that is not the case.

Privacy. Users need to think about data privacy. “In the absence of stringent FDA oversight,” says NAMI, “even widely downloaded apps have raised concerns around undisclosed sharing, access and use of mental health data.”

Coloradans should think of apps as a complement to care

Notwithstanding these concerns, NAMI acknowledges that apps have potential to help people, especially when used in conjunction with professional treatment. “It’s essential to think of apps as a means to boost and extend care, not replace it,” NAMI points out. But finding the right app is the challenge. “It can be overwhelming to comb through every search result to find the one with the tools and features you’re seeking.” And while reviews may indicate popularity, the question is whether an app has “clinical utility.” In other words, does it actually help?

So the question becomes: How can you find the right app?

NAMI has developed a searchable database of mental health apps it calls MIND (the Mental Health Index and Navigation Database). It allows you to sort through mental health apps using the criteria that matter most to you. To sort and focus your search, you can choose from more than 100 different filters across various areas, ranging from privacy to cost to clinical foundation.

“Ultimately, we hope this can be a tool to help individuals find the right mental health app given their unique preferences,” NAMI says. MIND does not designate any app as “good” or “bad,” but the searchable database can identify which apps meet a potential user’s criteria. Then keep in mind the cautions about drawbacks as mentioned earlier.

What about those AI chatbots KFF sees coming?

KFF notes that Joseph Weizenbaum, considered one of the fathers of artificial intelligence, predicted AI “would never make a good therapist, though it could be made to sound like one.” Weizenbaum’s initial AI program, dubbed ELIZA when created in the 1960s, was a “psychotherapist” that used word and pattern recognition combined with natural language programming to sound like a therapist. Not unlike how something like ChatGPT works today.

But even at the time, Weizenbaum himself was disturbed by people interacting with ELIZA as if she were a real therapist. In his view, “experiences a computer might gain are not human experiences. The computer will not, for example, experience loneliness in any sense that we understand it. The same goes for anxiety or ecstasy.” KFF quotes Bon Ku, a pioneer in medical innovation, echoing Weizenbaum’s perspective. Ku states, “The core tenet of medicine is that it’s a relationship between human and human, and that will never be replaced by AI.”

The temptation for insurers

KFF suggests that as apps and chatbots continue to proliferate, they will likely prove tempting for insurers as they try to meet the mental health coverage parity requirement. “After all,” says KFF, “that would be a cheap and simple solution, compared with the difficulty of offering a panel of human therapists, especially since many [of those therapists] take no insurance because they consider insurers’ payments too low.”

Where can Coloradans get mental health help?

However the evolution of mental health apps and chatbots progresses, Coloradans will do well to weigh the pros and cons of such technology. And although Colorado shares the nationwide challenge of building a truly robust mental health workforce, our state does have a substantial network of mental health professionals and services in place and available for you. You can learn all about them on our AgeWise Colorado website at