How do you feel about workplace algorithms that recognize your feelings?
The little-known field of affective computing is a growing presence in our lives, including the workplace. Is it OK to have your feelings monitored if it’s supposed to enhance your well-being?
Pretend you’re a manager who can see how your team members are feeling, in real time. You could gauge their mood—after an influx of new employees, say, or an organizational change—and act accordingly.
That’s the goal of affective computing research being led by Pierrich Plusquellec, a professor at Université de Montréal’s School of Psychoeducation, and Pamela Lirio, a professor at its School of Industrial Relations.
We spoke to them to find out more.
What is affective computing?
Pierrich Plusquellec: It’s a scientific discipline that connects computing and feelings in two ways: by creating machines that convincingly simulate emotions, and by building systems that recognize different emotional states.
There are algorithms today that can detect our feelings in real time based on the sound of our voices, our facial expressions, our movements and even the way we type on a keyboard. This field of research was launched less than 30 years ago by Rosalind Picard and is generating a lot of buzz in the industry today. It’s got tremendous potential. Affective computing is everywhere: it can even be found in the phones we’re so addicted to.
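To make the idea of inferring affect from behavioral signals a little more concrete, here is a purely illustrative Python sketch based on one of the signals mentioned above, keystroke timing. The thresholds, labels and the mapping itself are invented for this example; real affective-computing systems learn such mappings from large labeled datasets rather than hand-coded rules.

```python
# Purely illustrative: a toy "affect" guess from keystroke timing.
# Thresholds and labels are invented for this example only.
from statistics import mean, pstdev

def affect_from_keystrokes(key_times_s: list[float]) -> str:
    """Guess a coarse affective label from inter-key intervals (in seconds)."""
    if len(key_times_s) < 3:
        return "unknown"
    intervals = [b - a for a, b in zip(key_times_s, key_times_s[1:])]
    avg, jitter = mean(intervals), pstdev(intervals)
    # Fast, erratic typing is treated here as a proxy for agitation and
    # slow, even typing as a proxy for calm; a real system would learn
    # this mapping from labeled data instead of hand-coded rules.
    if avg < 0.15 and jitter > 0.05:
        return "agitated"
    if avg > 0.40 and jitter < 0.10:
        return "calm"
    return "neutral"

if __name__ == "__main__":
    # Timestamps (seconds) of successive key presses in a short burst.
    print(affect_from_keystrokes([0.00, 0.05, 0.20, 0.26, 0.45, 0.50]))
```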
How is affective computing used by various organizations today?
Pamela Lirio: One example is in call centers, where it’s used to identify the angriest clients, who are then sent to more experienced employees. In this case, it’s used to perform an initial screening.
It’s also used by human resources. Big companies like Vodafone, Hilton, Urban Outfitters and Unilever use automatic facial expression recognition to screen candidates who are expected to submit video applications. This helps them create a shortlist from an especially large pool of candidates that recruiting teams would otherwise lack the resources to handle. They do this through recruiting platforms like HireVue that use affective computing. However, it’s still best practice for HR professionals and/or the frontline manager to make the final hiring decision after looking at the data provided.
There are many ways this technology could be misused, aren’t there?
Pierrich Plusquellec: It all depends on how we use it! It’s just a tool, like a knife, which can be used to sculpt glorious works of art or to kill someone. The risk is real: affective computing can be horribly misused. I’ve read in Le Point, the French newsmagazine, that facial or emotional recognition technology is used to torture Uyghurs in China.
Meanwhile, closer to home, the commercial real-estate company Cadillac Fairview used facial recognition on unwitting customers to boost sales at Carrefour Laval mall. It installed cameras to capture the facial expressions of shoppers and assess their mood. This is illegal, so the company had to stop.
That’s why this kind of work has to be done in a way that’s caring and ethical, and why users have to have control over their data.
How can that be done?
Pierrich Plusquellec: Well, for example, during the pandemic, many people didn’t pay enough attention to how they felt. As a result, a lot of them now have undiagnosed emotional disorders, since there weren’t enough resources to take care of them. Mood disorders in the workplace, such as anxiety, depression or burnout, are related to this loss of emotional control.
Frédéric Lenoir, Matthieu Ricard and many other researchers on wellness agree: if you want to be happy, you have to pay close attention to your feelings. But not everyone has the time to meditate or to examine their emotions in depth.
This is what inspired our interdisciplinary team, which includes Nathe François, Ted Hill, Noël Rignon and Vincent Gautrais, to create software we call EmoScienS. It’s now in the pre-commercialization phase. You launch it on your computer and it captures your emotions every five minutes, in real time. Turn it on whenever you want, and at the end of the day your feelings are displayed on a dashboard: you can see whether you were happy, angry or frightened, and when.
If, for example, you see you were angry from 10 a.m. to 11 a.m., you could check to see what you were doing at that time. This ability to link actions and feelings is the secret to well-being, an insight that Nobel Prize winner Daniel Kahneman shared with us years ago.
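EmoScienS itself is not public, so the following is only a rough Python sketch of the workflow described above: sample an emotion label at a fixed interval, log it with a timestamp, and review the log as a simple dashboard at the end of the day. The `detect_emotion` placeholder stands in for whatever recognition model the real tool uses.

```python
# Rough sketch of the "sample every five minutes, review at day's end" idea.
# detect_emotion() is a placeholder; the real tool's model is not public.
import random
import time
from collections import Counter
from datetime import datetime

def detect_emotion() -> str:
    """Placeholder: a real tool would run a recognition model on live signals."""
    return random.choice(["happy", "neutral", "angry", "frightened"])

def sample_day(samples: int, interval_s: float) -> list[tuple[str, str]]:
    """Record (time-of-day, emotion) pairs at a fixed interval."""
    log = []
    for _ in range(samples):
        log.append((datetime.now().strftime("%H:%M"), detect_emotion()))
        time.sleep(interval_s)
    return log

def show_dashboard(log: list[tuple[str, str]]) -> None:
    """End-of-day view: when each emotion occurred, plus an overall tally."""
    for stamp, emotion in log:
        print(f"{stamp}  {emotion}")
    print("Summary:", dict(Counter(emotion for _, emotion in log)))

if __name__ == "__main__":
    # A 300-second interval would match the five-minute cadence described above;
    # a 1-second interval is used here so the demo finishes quickly.
    show_dashboard(sample_day(samples=3, interval_s=1.0))
```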
How can the software be used within an organization?
Pamela Lirio: We give users the option of anonymously sharing their dashboard with their manager. If more than 10 people agree to share their data, it’s aggregated and sent to their manager. If, for example, their manager notices that the team is feeling particularly upset, they can act accordingly.
To be ethical, products like this have to anonymize data and users have to have full control over the software. That means the data belongs to users and they decide whether to share it. It also means that the data shouldn’t be used to make automated decisions. What it should do is allow everyone to better understand the feelings they’re expressing and what they’re doing while they’re in front of their monitors (social media, work, online meetings, etc.). Furthermore, the EmoScienS project adheres to the principles of the Montreal Declaration for a Responsible Development of Artificial Intelligence and advocates for making AI startups comply with principles of ethics, transparency and social responsibility.
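As a minimal illustration of the sharing rule described above, in which individual dashboards stay private and a manager only sees an aggregate once more than 10 people opt in, here is a short Python sketch. Only the 10-person threshold comes from the interview; the data shapes and function names are assumptions made for the example.

```python
# Minimal sketch of threshold-based sharing: a manager sees nothing unless
# more than 10 team members have opted in, and then only an aggregate.
from collections import Counter
from typing import Optional

MIN_PARTICIPANTS = 10  # threshold mentioned in the interview

def team_mood(shared_logs: list[list[str]]) -> Optional[dict[str, float]]:
    """Return each emotion's share across all opted-in logs, or None when
    too few people opted in for the aggregate to stay anonymous."""
    if len(shared_logs) <= MIN_PARTICIPANTS:
        return None  # not enough participants; nothing is reported
    counts = Counter(emotion for log in shared_logs for emotion in log)
    total = sum(counts.values())
    return {emotion: round(n / total, 2) for emotion, n in counts.items()}

if __name__ == "__main__":
    logs = [["happy", "neutral"]] * 8 + [["angry", "angry", "neutral"]] * 4
    print(team_mood(logs))       # 12 people opted in -> aggregated view
    print(team_mood(logs[:5]))   # only 5 opted in -> None, data stays private
```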
If an organization can assess its emotional climate in real time, it gains the power to do something about it. By working with managers and employees, we hope to enhance organizational wellness. For example, managers could set up a relaxation room, or make even more lasting changes to foster wellness among employees, and our software could show how this affects the team's mood in real time.