Legal report
How employees can influence the use of algorithms in their workplace – a legal perspective
Algorithmic systems are increasingly being used in workplaces in Switzerland. Under the current legal framework, companies should involve their employees in certain decisions – yet this often does not happen in practice. This report shows which participation rights employees have and which obligations employers face, what gaps can be identified in the legal framework and how these could be closed.
Overview
- How are algorithmic systems used in the workplace today?
- What is the current legal framework?
- Which actions are needed to adapt the legal framework?
- Which ethical aspects need to be considered?
Swiss companies are increasingly relying on automated decision-making (ADM) systems or algorithmic systems in the workplace. These systems are used, for example, in recruitment, for monitoring employees or for guiding processes. However, employees are often not involved in setting up these systems – whether in planning, implementation or operation.
In this legal report, Prof. Dr. Isabelle Wildhaber, LL.M. and Dr. Isabel Ebert from the Institute for Work and Employment Research at the University of St. Gallen (FAA-HSG) have examined how employee participation in the use of algorithmic systems is currently regulated by law. They also show what action is needed to adapt the legal framework and highlight the relevant ethical aspects in addition to the legal ones. The report was commissioned by AlgorithmWatch CH and syndicom as part of the project "Analytics for the People? What algorithms at the workplace mean for worker rights and participation".
How are algorithmic systems used in the workplace today?
The report outlines how the legal framework operates in practice, drawing on two empirical surveys of Swiss companies as well as qualitative case studies at five large Swiss corporations. While the use of algorithmic systems in Swiss companies is already widespread, and has increased according to the surveys conducted from 2018 to 2020, various problem areas are emerging with regard to employee participation, such as:
- Information: Employees are not widely and clearly informed about the use of ADM systems, which leads to them feeling blindsided when the systems are introduced. Companies could, for example, launch campaigns that inform employees and raise their awareness of how their data is used.
- Consultation: Only informed employees can be involved in a consultation process. However, consultation processes are often not formalized and/or only take place during the pilot phase.
- Attribution of responsibility: The use of ADM systems can lead to a diffusion of responsibility – managers do not feel responsible for automated decisions, while employees continue to see the responsibility as sitting with the managers.
- Learning culture vs. sanctions culture: The way technology is used in the workplace influences whether it is experienced as empowering (data and analytics can be used for self-improvement) or punitive (employees are sanctioned on the basis of data and analyses).
What is the current legal framework?
The use of algorithmic systems in the workplace and employee participation are regulated by various legal provisions, in particular:
- Right of participation
- Labor law
- Data protection law
- Health protection
- Protection against discrimination
Data protection law requires consent at the individual level when employers use their employees' personal data: employers must obtain employees' consent if their personal data is to be used for a specific purpose.
The Participation Act regulates the basic forms of employee participation at company level, specifically information and consultation. In Swiss law, the “company constitution” governs the fundamentals of employee participation, including the cooperation between the employer and employee representatives. Co-determination rights can also be laid down in collective employment agreements, at company level and in special legislation.
The Participation Act defines internal employee representation, i.e. the collective body of a company's workforce. Trade unions represent the interests of employees within the company. Where there is no employee representation, employees have an individual right to participation:
- Right to information: This is the basic prerequisite for more extensive forms of participation. Employees must be informed about matters they need to know about in order to perform their duties. In addition, there are data protection information rights at the individual level.
- Right to have a say: This applies in particular to all health and safety issues. ADM systems that can be used for monitoring should fall into this category – and arguably essentially all ADM systems do. Employees would then have corresponding participation rights that help reduce negative effects on health.
- Co-decision rights: These only exist in relation to certain topics (e.g. Sunday work), where employee representatives have a right of approval or a right of veto.
An important principle is that collective participation rights cannot be waived by individual consent. Participation and health protection in the workplace should be organized collectively, so that individuals who raise objections, for example, are not exposed to the risk of dismissal. However, the existing collective participation rights are hardly known and are therefore rarely invoked, as the empirical part of the legal report shows.
Which actions are needed to adapt the legal framework?
According to the report, there are various gaps in the current legal framework. With regard to the participation of employees and employee representatives, these include:
- Lack of sanctions.
- Dismissal of employee representatives is possible for economic reasons.
- It is unclear whether all ADM systems that can be used for surveillance are health-related and therefore subject to participatory decision-making.
- Unclear right of action for employee representatives.
The report also identifies various gaps in the possibilities for lodging objections:
- Individual legal enforcement is difficult, as it is often unclear which individuals are affected by surveillance and discrimination. Even when affected individuals can be identified, the existing instruments are often not sufficient or appropriate (for example, because the benefit of litigating is too low for the individual).
- Collective legal enforcement is also rather weak: labor inspectorates only intervene once there are already harmful effects on health, the procedural barriers for legal action around participation rights are too high, and there are usually no sanctions for violations.
The analysis in the report shows various options for closing these gaps. The non-exhaustive list includes proposals for the following areas:
- Strengthening the rights of employee representatives and associations (trade unions) to oblige employers to provide collective information on the use of ADM systems.
- Improving the right to have a say for employees and their representatives in relation to ADM systems.
- Improving the structures for supervision and control of the law.
In addition to legislative revisions to guarantee minimum rights, more far-reaching solutions can also be sought within the framework of social dialogue. For example, the collective interests of employees in the use of ADM systems in the workplace could be increasingly incorporated into collective labor agreements.
Which ethical aspects need to be considered?
The use of algorithmic systems in the workplace does not have to be regulated by the legal framework alone. Employers can also be guided by ethical principles. The report details this ethical perspective in several areas:
- Data collection and data evaluation: As the case studies show, the involvement of employees is relevant to the acceptance of algorithmic systems. Employers could therefore go beyond the legal framework in this area when it comes to considering employees' concerns and criticism regarding the use of their data. This can help to ensure that the data collected is useful.
- Data protection: Employers could go beyond data protection regulations and take a more far-reaching right to privacy as a basis for their processes and involve employees to a greater extent.
- Surveillance: From an ethical point of view, it is also in the interest of employers not to burden their employees' mental health with unnecessary surveillance measures. Mentally and physically healthy employees are absent less often, perform better and are more productive.
- Protection against discrimination: Current anti-discrimination legislation is insufficient to address the risks of discrimination posed by ADM systems in the workplace. Employers should therefore ensure that data sets used are representative in order to reduce negative consequences for certain individuals and groups.
In addition, the report lists the following key ethical points of analysis that employers should consider when using ADM systems in the workplace:
- Deep intrusion into employees' privacy and the possible monitoring of their exchanges and organizing among themselves.
- Informing and consulting employees at an early stage.
- Data competence and data literacy of employees.
- Data transparency to enable participation.
- Informing employees about the scope of application of ADM systems.
- Avoidance of "black box systems".
A duty of care for employers in the context of ADM systems in the workplace, based on the UN Guiding Principles on Business and Human Rights (UNGPs), would also be a possible solution. This would involve the following four steps:
- Identifying and evaluating impacts to assess the nature and extent of human rights risks (e.g. in the form of an impact assessment), which would include a collective employee consultation process.
- Acting to prevent and mitigate human rights risks, including through integration into internal functions and processes.
- Tracking the effectiveness of risk mitigation measures over time, by measuring impact and reviewing appropriateness, and providing feedback to stakeholders such as employees.
- Appropriate communication of measures to address human rights impacts, e.g. publication of the impact assessment.