
HUMAN

People must ensure that AI serves the people

When the police try to predict the likelihood of offenders’ criminal recidivism with algorithms, when employers use AI to pre-sort job applications, or when an AI chatbot creates media content: who is affected by these algorithmically generated decisions, recommendations, and content? And how can those affected be involved in order to defend their rights and interests?

Training measures

What trade unions need to know about algorithms in the workplace

Companies in Switzerland are increasingly using algorithmic systems in the workplace. Worker participation is an important factor in ensuring that this technological transformation has positive effects on employees. This publication sums up what trade unions need to know about algorithmic systems in the workplace and how they can enable employee participation.

Legal report

How employees can influence the use of algorithms in their workplace – a legal perspective

Algorithmic systems are increasingly being used in workplaces in Switzerland. Under the current legal framework, companies should involve their employees in certain decisions, but in practice this often does not happen. This report shows which rights employees have and which obligations employers face when it comes to employee participation, which gaps can be identified in the legal framework, and how these could be closed.

The 5 Best Podcasts on Algorithms and Work

Interested in how algorithmic systems affect us at work? Here are some well-researched podcast episodes to dive into.

Help us fight injustice in hiring!

Donate your CV to fight together against automated discrimination in job application procedures!

Atlas of Automation

Detecting benefit fraud, measuring work performance, predicting a person's creditworthiness, or showing us personalised content online: algorithms and so-called "artificial intelligence" shape our everyday lives today. Where, by whom, and for what purpose these algorithmic systems are used, however, is largely a black box. With the Atlas of Automation, AlgorithmWatch CH now sheds light on it.

A dollar for your face: Meet the people behind Machine Learning models

Wolt: Couriers’ feelings don’t always match the transparency report

In August, the Finnish delivery service Wolt published its first “algorithmic transparency report”. We asked three couriers about their experiences, which don't always match the report’s contents.

FINDHR

Fair algorithms in personnel selection?

AlgorithmWatch CH is part of the Horizon Europe project “FINDHR”. In this interdisciplinary research project, we address the discriminatory effects of software in recruiting processes by developing methods, tools, and trainings designed to prevent discrimination.

Analytics for the People?

What algorithms at the workplace mean for worker rights and participation

In a joint project with the trade union syndicom, AlgorithmWatch CH investigated how employees can be empowered when algorithmic systems are used in the workplace.

Digital Bouncers: AI in Recruiting

Companies are increasingly using automated decision-making systems to decide who is best suited for a job. Applicants worry about being rejected by a machine on the basis of programmed prejudices. In Switzerland, employers are especially reluctant to talk about the hiring algorithms they use.