
Meditation and mental health apps collect more data than any other category

During the pandemic, meditation and mental health apps, which help calm anxiety and even connect users with psychologists, have been in high demand. However, their users' personal data could be exposed. According to the "Privacy Not Included" guide, prepared by Mozilla researchers, 28 of the 32 apps evaluated received a warning label. Among the apps evaluated are Talkspace, Better Help, Calm and Glorify.

The report warns that even though these apps deal with incredibly sensitive issues such as depression, anxiety, suicidal thoughts, domestic violence and eating disorders, they routinely share data, allow weak passwords and target vulnerable users with personalized ads. Their privacy policies are also vague and poorly written.

The report concludes that the apps with the worst privacy and security are Better Help, Youper, Woebot, Better Stop Suicide, Pray.com and Talkspace, owing to their incredibly vague and messy privacy policies, their sharing of personal information with third parties, and, in the case of Talkspace, the collection of chat transcripts.

Main findings

The investigation reached at least four conclusions:

Companies do not respond. Mozilla says it emailed all the companies at least three times (at the privacy contact listed in their privacy policies) to try to get responses, and only one company (the Catholic app Hallow) responded in a timely manner. After several attempts, responses were received from two others (Calm and Wysa).

Mental health apps are a data collection bonanza. Nearly all of the apps reviewed gobble up more personal data than Mozilla's researchers have seen in any other connected apps and devices. Additionally, some apps collect further data from third-party platforms such as Facebook.

Security is sometimes ridiculous. At least eight applications allowed weak passwords ranging from "1" to "11111111". Moodfit required only a single letter or digit as a password, which is worrying for an application that collects data on a person's mood and symptoms. The researchers also had trouble determining whether the apps had a way to manage the security vulnerabilities they found.

Teenagers are especially vulnerable. Many prayer and mental health apps target young people, including teenagers, a demographic especially affected by mental health issues. The information teens share with these apps could be leaked, hacked, or used to target them with personalized ads and marketing for years to come.

Of all the apps analyzed, only two met basic privacy and security standards: PTSD Coach, an app created by the US Department of Veterans Affairs, and the AI chatbot Wysa.

Source: Elcomercio
