Could safety in the workplace be a pretext to control workers?




To date, when companies have installed monitoring devices (cameras, GPS, combined artificial intelligence systems, etc.), they have cited reasons such as better security of the facilities (theft, fire, etc.), process quality or general control of work tasks. However, certain cases have been identified in which the safety and health of the workforce is the main reason management gives for this kind of measure. It is worth noting what Spanish legislation says here: “Employers shall permanently monitor preventive actions for the continued improvement of […] existing levels of protection” (article 14.2 of Act 31/1995 of 8 November 1995 on the Prevention of Occupational Risks). Nevertheless, in the area of occupational risk prevention there are many situations in which the company cannot easily carry out control and supervision of activities directly or in person.

New practices of workplace supervision strike us as particularly innovative, much like the HAL 9000 supercomputer that Stanley Kubrick imagined in his film 2001: A Space Odyssey. Through artificial intelligence, HAL (Heuristically programmed ALgorithmic computer) could detect emotions and suffering, and it controlled all the systems in the spaceship, and even its crew members. Like HAL, with the aim of improving the atmosphere at work, some companies have installed cameras that only allow workers to enter the premises if they smile at the camera. This is the case of the Canon offices in Beijing (China), which installed AI cameras that prevent a person from carrying out any operation unless they detect a smile on his or her face.

These devices are small screens with cameras installed on doors, printers and in meeting rooms, among other places. If the camera does not detect a smile, it shows an emoji at the top left of the screen encouraging the person to be happy, without taking their real mood into account. Employees who do not smile face all kinds of restrictions: for example, they cannot print documents, schedule a meeting, enter certain rooms or change the room temperature.

Another example is the surveillance carried out by some companies to ensure safe driving. A few months ago Amazon announced that it would start to install surveillance cameras based on artificial intelligence in its vehicle fleet, saying that this would “improve the safety” of delivery drivers. The cameras (now installed in half of Amazon’s fleet in the USA) automatically record ‘events’ such as traffic offences committed by the drivers. Every time an event is recorded, the camera sends the images to the company so that the driver can be evaluated. The camera does not just record and send events: a metallic voice reprimands the driver (“distracted driver!”) every time he or she does something wrong. If a camera records more than five events per 100 trips, the worker can automatically lose the bonus on which many delivery drivers depend.

Even more surprising is the case of H&M in Nuremberg (Germany), where “Welcome Talks” were organised for workers returning to work after a short absence (due to illness, for example). After these talks, not only were the particular experiences of the workers recorded, but their symptoms and diagnoses too. As a result, some supervisors acquired detailed knowledge of employees’ private lives, from quite innocuous details to family problems and religious beliefs. Some of this information was recorded, digitally stored and made partially available to a large number of managers in the company. In addition to a detailed assessment of the individual’s performance at work, the data collected were used, among other things, to build a detailed profile of employees that informed measures and decisions relating to their jobs. This compilation of data came to light because, owing to a configuration error, it was accessible throughout the company for several hours in October 2019. In 2020 the Hamburg Commissioner for Data Protection fined the company 35.3 million euros for violating data protection regulations since 2014.

While safety at work is a duty the company must comply with, and it can justify the processing of employees’ data (articles 6.1(c) and 9.2(b) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data), the measures should be preventive in nature, i.e. aimed at avoiding or reducing risks in the workplace. In the above examples there is no risk of the person employed being replaced by the emerging technology. Nevertheless, the technology could increase the pace at which the worker has to carry out his or her tasks, which could harm his or her safety and health.

Furthermore, these measures should be justified in a risk assessment prior to their adoption, and the principle of proportionality should be applied, since the measures could affect other fundamental rights (e.g. privacy and the protection of personal data). Likewise, workers’ representatives should have participated in the adoption of these measures.

While we await the approval of the proposed Regulation on artificial intelligence, which would apply to AI systems intended to interact with individuals and to systems for emotion recognition and biometric categorisation, and while we await precautionary measures against these practices in future reforms of workplace risk prevention regulations, collective bargaining could agree some measures in the meantime. For example, a collective agreement could establish that the company will not exercise its sanctioning powers when the initial, informed use of these measures was for reasons of safety at work.

In this regard, we can cite collective bargaining clauses that justify a degree of control (e.g. GPS systems) for the safety and health of workers but whose results could be used to apply sanctions. This is the case of article 34 of the III Collective Agreement of Enercon Windenergy Spain, SL (BOE [Official Gazette of Spain] of 23.11.2020), which states: “Control of company vehicles. The company has a GPS localisation system in all the vehicles of EWS placed at the disposal of its workers. The Company sets out to guarantee a more efficient organisation of its fleet, better coordination of technical teams and the safety and health of its workers. The installation of these devices does not aim to monitor the usual activity or behaviour of the workers. However, while always respecting legal principles, the information provided by the GPS system may be used for the application of the company’s disciplinary regime, giving rise to minor, serious or very serious offences in the area of driving that are verified by the data obtained through the GPS system”.

In a similar vein, a collective agreement could prohibit the use of photographs or videos as a means of surveillance of work done off-site, as in the Remote Working Agreement of Adif dated 16 June 2021. The agreement could establish this measure taking into account the singular features of remote working and the flexibility that characterises this form of organising tasks. In this respect, it should be noted that some employment contracts provide for monitoring based on AI-equipped cameras in employees’ homes, plus voice analysis and the storage of data collected on the employee’s family members, as in the case of Teleperformance.

Finally, negotiators could limit the combination of invasive technologies. Collective agreements could pay special attention to the use of video surveillance combined with other tools: such technology allows the company to observe employees’ facial expressions automatically and to identify deviations from predefined movement patterns, among other things. This would be disproportionate in terms of the rights and freedoms of workers and, therefore, illegal. The processing of these data could involve the construction of profiles and, possibly, automated decision-making. Collective bargaining could therefore establish that video surveillance may not be used in combination with other technologies such as facial recognition, because in that case the level of surveillance is disproportionate in the light of European and national recommendations (Article 29 Working Party, 2017, and Spanish Data Protection Agency, 2021).

Written by Ana B. Muñoz Ruiz (UC3M)

Originally published in Spanish on El Foro de Labos

© Photo: Peggy und Marco Lachmann-Anke (retrieved from Pixabay)