List #19 - A.I. can't detect our emotions but the industry for it is booming
Things I’ve been reading around the web…
🖥 A.I. Can’t Detect Our Emotions: a conversation with the professor who just turned down a $60,000 grant from Google, Evan Selinger, OneZero.medium.com (6 April 2021)
Hope runs so high for these endeavors that the projected market value for emotional A.I. is $91.67 billion by 2024.
🤖 AI therapists are on the rise—but can they really fix us?, Boyd Farrow, WIRED (15 July 2021)
According to Aggarwal, the company’s next development phase involves cracking voice recognition as it rolls out its products globally “to create access to mental health for the next billion.”
🔘 Sensors and AI to monitor Dorset social care patients, Chris Baraniuk, BBC (25 August 2021)
Sensors installed in homes will track behaviour and electricity usage, which the AI will analyse to spot potential health problems.
📱 U.S. prisons mull AI to analyze inmate phone calls, David Sherfinski and Avi Asher-Schapiro, Reuters (10 August 2021)
The call for the Department of Justice (DOJ) to further explore the technology, to help prevent violent crime and suicide, accompanies an $81 billion-plus spending bill to fund the DOJ and other federal agencies in 2022.
In scholarship…
🥼 Technology and Psychiatry ($), James Phillips, The Oxford Handbook of Philosophy and Psychiatry (2013)
🔁 The reproducibility crisis in the age of digital medicine, Aaron Stupple, David Singerman & Leo Anthony Celi, NPJ Digital Medicine (January 2019)
💡 Can digital data diagnose mental health problems? A sociological exploration of ‘digital phenotyping’ ($), Rasmus H. Birk, Gabrielle Samuel, Sociology of Health and Illness (November 2020)
💭 Novel Readings: Mind- and Emotion-reading Devices in the Mid-twentieth Century and in Philip K. Dick's The Three Stigmata of Palmer Eldritch, Chris Rudge, Portable Prose: The Novel and the Everyday (2019).
Something I’ve written…
📚 P Gooding and T Kariotis, ‘Ethics and Law in Research on Algorithmic and Data-Driven Technology in Mental Healthcare: Scoping Review’ (2021) JMIR Mental Health
Tim and I reviewed all applied studies that used algorithmic and data-driven technologies in ‘online mental health interventions’. Among the findings:
Ethical and legal issues were not addressed in empirical studies on algorithmic and data-driven technologies in mental health initiatives;
Only 3% of the field appeared to involve people with lived experience of mental health care, mental health conditions, and disability in the design, evaluation, or implementation of the proposals in any substantive way;
The very form of peer-reviewed papers detailing applied research in this field may well preclude a substantial focus on ethics and law, and technologies could be appropriated into practice in rarely acknowledged ways, with serious consequences.