Mozilla: Mental Health Apps Fail at User Security
25 of the 32 apps examined fail to meet even minimum standards. Mozilla rates only two apps positively. Researchers call it the worst product category they’ve studied so far.
According to Mozilla’s study, mental health apps and prayer apps often pose a significant risk to users’ security and privacy. Despite the highly sensitive data these apps handle, they are said to “routinely” share data with third parties, allow weak passwords, and target vulnerable users with personalized ads.
For the study, titled “Privacy Not Included,” 32 apps focused on mental health and religion were examined. Twenty-five of those apps did not meet Mozilla’s minimum standards for security and privacy. The apps were evaluated on how they handle user data, whether they use encryption, how strict their password requirements are, whether they share or even sell data, and other security and privacy criteria.
“When it comes to privacy and security, mental health and prayer apps are worse than any other product category Mozilla researchers have studied in the last six years,” is Mozilla’s damning verdict.
Mozilla gives only two apps a positive rating
Among the apps that failed the tests was Better Stop Suicide, which is intended to prevent suicides. The developers did not respond to Mozilla’s inquiries about the “trusted partners” who have access to user data, and the app’s privacy policy was also rated “poor.”
Only two of the apps reviewed, PTSD Coach and the AI chatbot Wysa, take their users’ privacy seriously, according to Mozilla. “The vast majority of mental health and prayer apps are exceptionally creepy,” commented Jen Caltrider, who led the Mozilla study. “They track, share, and use users’ most intimate personal thoughts and feelings, such as moods, state of mind, and biometric data. It turns out that researching mental health apps is not good for mental health, as it reveals how careless and cowardly these companies can be with our most intimate personal information.”