Sensing Workplace Stress While Keeping an Eye on Privacy
Personal sensing data could help monitor and alleviate stress among resident physicians, although privacy concerns over who sees the information and for what purposes must be addressed, according to collaborative research from Cornell Tech. The findings are published in Proceedings of the ACM on Human-Computer Interaction.
Burnout and Stress Are Everywhere
Burnout in all types of workplaces is on the rise in the U.S., where terms like ‘Great Resignation’ and ‘quiet quitting’ have entered the vocabulary in recent years. This is especially true in the healthcare industry, which has been strained beyond measure by the COVID-19 pandemic.
Stress is physical as well as mental, and evidence of it can be measured through smartphones, wearables, and personal computers. But data collection and analysis, along with the larger questions of who should have access to that information and for what purpose, raise myriad sociotechnical concerns.
Tensions Around Personal Sensing Interventions for Stress in Resident Physicians
The resident physician's work environment is a bit different from the traditional apprenticeship in that their supervisor, the attending physician, is also their mentor, which can blur the lines between the two roles.
It is not yet clear where the actual boundaries lie, or what they look like once these new technologies are introduced. Researchers therefore need to work out what those norms might be in order to determine whether a given information flow is appropriate in the first place.
Researchers addressed these issues through a study involving resident physicians at an urban hospital in New York City. After hourlong interviews with residents on Zoom, the residents and their attendings were given mockups of a Resident Wellbeing Tracker, a dashboard with behavioral data on residents' sleep, activity, and time working; self-reported data on residents' levels of burnout; and a text box where residents could characterize their well-being.
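To make the dashboard description concrete, below is a minimal sketch of the kind of record such a tool might store. The class and field names (ResidentWellbeingRecord, burnout_score, and so on) are illustrative assumptions, not details taken from the study.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ResidentWellbeingRecord:
    """One resident's entry as such a dashboard might store it (illustrative only)."""
    resident_id: str                     # pseudonymous identifier, not a real name
    hours_slept: float                   # behavioral data, e.g. from a wearable
    active_minutes: int                  # physical activity, e.g. derived from step counts
    hours_worked: float                  # time spent working, e.g. from shift logs
    burnout_score: Optional[int] = None  # self-reported burnout, e.g. on a 1-5 scale
    wellbeing_note: str = ""             # free-text description of the resident's well-being
```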
The residents were open to the idea of using technology to enhance well-being. They were also keenly interested in how such technologies could achieve those positive ends while still respecting their privacy.
The study featured two intersecting use cases: self-reflection, in which the residents view their behavioral data, and data sharing, in which the same information is shared with their attendings and program directors for purposes of intervention.
Among the key findings: residents were hesitant to share their data without assurance that supervisors would use it to enhance their well-being. Anonymity was another issue; it is easier to preserve when more residents participate, but greater anonymity would blunt the program's usefulness, since supervisors would not be able to identify which residents were struggling.
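This tension between anonymity and usefulness can be illustrated with a small, hypothetical aggregation rule: group statistics are shown only when enough residents have contributed. The MIN_GROUP_SIZE threshold below is an assumption for illustration, not a figure from the study, and the function builds on the hypothetical ResidentWellbeingRecord sketched above.

```python
from statistics import mean

# Hypothetical threshold: show aggregates only when at least this many residents contribute.
MIN_GROUP_SIZE = 5

def aggregate_burnout(records: list["ResidentWellbeingRecord"]) -> float | None:
    """Return the group's mean burnout score, or None if the group is too small
    to report without risking the anonymity of individual residents."""
    scores = [r.burnout_score for r in records if r.burnout_score is not None]
    if len(scores) < MIN_GROUP_SIZE:
        return None  # suppressed: too few participants to hide any one individual
    return mean(scores)  # supervisors see only the average, never who is struggling
```

Under a rule like this, protecting individuals and pinpointing who needs support pull in opposite directions, which is exactly the trade-off the residents described.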
Sharing personal data in this way is complicated, and the researchers are continuing related work on this question of privacy and on how people present themselves through their data in more traditional mental health care settings.
Therefore, there is an urgent need for further work to establish new norms around data-driven workplace well-being management solutions that better center workers' needs and provide protections for the workers they intend to support.
Source: Eurekalert