Technology in mental disorder treatment

Traditional methods of helping people with a mental health problem include approaches such as medication, counselling, cognitive behavioral therapy (CBT), exercise, and a healthy diet.[7][page needed]

Rizzo et al.[11] have used virtual reality (VR), real environments simulated through digital media, to successfully treat post-traumatic stress disorder (PTSD).

"Mobile devices like cell phones, smartphones, and tablets are giving the public, doctors, and researchers new ways to access help, monitor progress, and increase understanding of mental wellbeing. If the app detects a change in behavior, it may provide a signal that help is needed before a crisis occurs" (Technology and the Future of Mental Health Treatment, n.d.).
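The monitoring idea described above can be sketched as a simple statistical check: compare today's activity against a trailing baseline and flag large deviations. This is a minimal illustration only; real mental health apps use far richer signals and models, and the window and threshold values here are arbitrary assumptions.

```python
from statistics import mean, stdev

def detect_behavior_change(daily_counts, window=7, threshold=2.0):
    """Flag a possible behavior change when the most recent day's
    activity deviates from the trailing baseline by more than
    `threshold` standard deviations. Illustrative sketch only."""
    if len(daily_counts) <= window:
        return False  # not enough history to form a baseline
    baseline = daily_counts[-window - 1:-1]  # trailing window, excluding today
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return daily_counts[-1] != mu  # any deviation from a flat baseline
    return abs(daily_counts[-1] - mu) / sigma > threshold

# A stable week passes; a sharp drop in activity is flagged.
stable = [120, 118, 125, 119, 122, 121, 123, 120]
print(detect_behavior_change(stable))                 # → False
print(detect_behavior_change(stable[:-1] + [40]))     # → True
```

In a deployed system the flag would trigger a supportive prompt or a notification to a clinician rather than a diagnosis; the point is that intervention can be offered before a crisis occurs.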

It has been argued that "this form of surveillance is harmless since third-party companies are primarily interested in aggregate data and will use this information for the purpose of developing and marketing better products, which will benefit consumers in the long run".

Technology companies are developing mobile-based artificial intelligence chatbot applications that use evidence-based techniques, such as cognitive behavioral therapy (CBT), to provide early intervention and support for mental health and emotional well-being challenges.[17]

Artificial intelligence (AI) text-based conversational applications delivered securely and privately over mobile devices can scale globally and offer contextual, always-available support.
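At its simplest, the pattern behind such chatbots can be sketched as rules that spot common cognitive distortions and answer with a CBT-style reframing prompt. This toy example is not how Wysa or any named product works; the patterns and responses are invented for illustration.

```python
import re

# Map phrases suggestive of cognitive distortions to Socratic
# reframing prompts, a core CBT technique. Purely illustrative rules.
RULES = [
    (re.compile(r"\b(always|never)\b", re.I),
     "That sounds like all-or-nothing thinking. Can you recall one exception?"),
    (re.compile(r"\bshould\b", re.I),
     "'Should' statements can add pressure. What would you tell a friend here?"),
    (re.compile(r"\b(worthless|failure|hopeless)\b", re.I),
     "That is a harsh label. What evidence supports it, and what contradicts it?"),
]

DEFAULT = "Thank you for sharing. Could you tell me more about how that made you feel?"

def reply(message: str) -> str:
    """Return the first matching reframing prompt, or a neutral follow-up."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return DEFAULT

print(reply("I always mess things up."))   # all-or-nothing prompt
print(reply("I went for a walk today."))   # neutral follow-up
```

Production systems replace the keyword rules with natural-language understanding and add safety escalation paths, but the always-available, text-based interaction model is the same.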

A recent real world data evaluation study,[18] published in the open access journal JMIR mHealth & uHealth, that used an AI-based emotionally intelligent mobile chatbot app, Wysa, identified a significantly higher average improvement in symptoms of depression and a higher proportion of positive in-app experience among the more engaged users of the app as compared to the less engaged users.

Ethical and legal issues tend not to be explicitly addressed in empirical studies of algorithmic and data-driven technologies in mental health initiatives.[26]

Concerns have been raised about the near-complete lack of involvement of mental health service users, the scant consideration of algorithmic accountability, and the potential for overmedicalization and techno-solutionism.