It's three in the morning and my room is bathed in the glow of my phone. Like one in three people, I check my smartphone when I wake up in the middle of the night. I can't sleep and so wander from one social-media app to another, my thumbs scrolling through what feels like miles of emptiness. "Siri, what is the meaning of life?" I ask without thinking. "I have stopped asking myself this kind of question," she answers. I ask again, because I like it better when she says "nothing Nietzsche wouldn't teach you".
I am not the only one turning to Siri for life advice. Apple is currently recruiting a Siri engineer with a background in psychology to help make its virtual assistant better at answering these sorts of questions.
"People talk to Siri about all kinds of things, including when they're having a stressful day or have something serious on their mind. They turn to Siri in emergencies or when they want guidance on living a healthier life", says the job ad.
More than half of interactions with Amazon's virtual assistant Alexa are "non-utilitarian and entertainment related", according to the company, a category that includes existential questions and confidences.
Google has a full "personality team" in the US composed of comedy writers, video-game designers and mysteriously named "empathy experts", in charge of defining answers to complicated questions asked of Google Assistant. Microsoft, meanwhile, has an "editorial team" responsible "for crafting and creating Cortana's responses to make sure that all of our responses ladder up to Cortana's core personality pillars".
Have we all gone crazy, whispering confidences into our electronic devices? I don't think so. Talking to my phone, I don't feel any different from Ross in the fifth season of Friends, asking a Magic 8 ball if he should stop seeing Rachel.
"Discussions with our phones help us introspect," says Alexandre Lacroix, a French philosopher who investigated how the internet disrupts our lives in Ce qui nous relie: Jusqu'où Internet changera nos vies (What Connects Us: How Is the Internet Changing the Way We Live?).
"We don't expect any precise answers when we ask Siri what the meaning of life is. We use it as a tool in our quest for self-knowledge," says Lacroix.
It seems like a new kind of diary. Just as we feel free to write what we really think in the pages of a book, we now tend to disclose more of our innermost feelings to an AI than to other humans, according to a study conducted by the Institute for Creative Technologies in Los Angeles.
The difference is that now the diary answers back, and records everything.
"This is a tremendous opportunity in terms of mental health care", says Eleni Linos, an assistant professor of medicine at the University of California, who has advised Apple on how to improve Siri and co-authored a paper about how conversational agents such as voice assistants could improve our health.
"Conversational agents can direct us to the right resource, when needed," Linos says.
Amazon claims to have trained Alexa to answer in a compassionate and helpful way when asked about loneliness or depression, and to provide the number of a depression hotline. Google does the same in certain regions of the world.
Alison Darcy is a former senior researcher at Stanford and the founder of Woebot, a psychology chatbot that uses cognitive behavioural therapy. She says chatting with an AI can have a very positive impact on mental health, perhaps simply by reminding us to take some time to reflect and introspect. But, she says, "for ethical reasons, the patient must be aware of the science behind the service".
Microsoft, Apple and Amazon do not disclose much about how they shape their answers to existential questions. Nor do we know much about the precise role of psychologists and the science they rely upon.
"This is critical because psychology is highly political," says Luke Stark, a sociologist at Dartmouth College. "Your conversational agent may know way before you [do that you have] a mental illness."
Research is underway at MIT, exploring how mental illness could be diagnosed just by analysing the way you speak. It seemed far-fetched to me at first, but this is only the beginning of what artificial intelligence can teach us about ourselves.
"Considering how sensitive this data is, it should be as protected as medical files," adds Stark. Amazon, Microsoft, Apple and Google all keep records of your utterances to their conversational agents for varying lengths of time, in order to customise the experience.
My conversations with Siri are far more serious and sensitive than I thought. Especially so considering that, while hiring a Siri Health engineer, Apple is also looking for a "behavioural data scientist" whose responsibilities are, among others, to "translate insights into products and programs that consumers engage with and help drive behaviour change" and to "understand how digital tools and software can influence behaviour".
Sure, this does not say that Apple analyses my Siri conversations to influence my behaviour or manipulate me without my being aware. But it does mean it is, in theory, possible.