Report algorithmic discrimination!

AlgorithmWatch CH wants to shine a light on where and how algorithmic discrimination takes place. Do you have reason to believe that algorithmic discrimination may have occurred? Then please report it to us to help us better understand the extent of the issue and the havoc algorithmic systems can wreak on our lives. Your reports can help us make algorithmic discrimination more visible and strengthen our advocacy for appropriate guardrails.

Want to learn more about this topic? You can read more about the causes of algorithmic discrimination and our demands for better protection from it here.

FAQs

What is algorithmic discrimination?

Institutions throughout our society rely on automated decisions. Algorithmic decision-making systems process tax returns, evaluate job applications, make recommendations, and predict crimes or the chances of refugees being integrated into the labor market. Such systems are not neutral: the people who develop and use them bring their own assumptions and interests, and the systems reproduce patterns of discrimination that already exist in society. Some people are therefore discriminated against when they have to deal with an algorithmic system. You can find more information on this here.

What can I do if I suspect that I was discriminated against by an algorithmic system?

Report the incident to us! We will help investigate it further. When such incidents are reported, more and more people become aware of this important issue and learn what they can do if they are affected by algorithmic discrimination. With increased awareness, we can advocate for better safeguards and put pressure on decision-makers to address the issue.

You can also contact anti-discrimination counseling services to better assess whether discrimination has occurred. In some cases, it may be useful to contact consumer protection organizations as well.

Not all cases of discrimination can be brought before a court. The Swiss Federal Constitution guarantees the principle of equality and prohibits discrimination. The Constitution guarantees equality between men and women and the right to equal pay for work of equal value. The Federal Constitution also stipulates measures to eliminate discrimination against people with disabilities.

There is no general anti-discrimination law in Switzerland, i.e. one that protects all groups affected by discrimination. There is also no general prohibition of discrimination in the private sector, but Swiss criminal law does have a provision against discrimination on the basis of race, ethnicity, religion or sexual orientation (Art. 261bis SCC).

Where could the use of automated decision-making systems lead to unjustified or unequal treatment?

Automated decision-making systems can be found almost everywhere nowadays. For example, a creditworthiness check is usually initiated automatically when we request installment payments in an online shop. Increasingly, companies also employ algorithms to make personnel decisions and coordinate work processes, and authorities use algorithms both overtly and covertly – for instance, to generate tax assessments, decide on the allocation of daycare spots, in policing, in the justice system, and so on.

Unequal treatment often becomes evident only when comparing the experiences of two individuals engaging with the same system. If one person was offered a different price for a service than another, or if a request was declined while a similar one was accepted, this might suggest that an automated risk assessment – for example, of the likelihood of filing an insurance claim – treated the two cases differently. This can, of course, also happen when a person, not an automated system, is responsible for assessing an application. It would be our job to investigate whether this is happening on a large scale because of the use of an automated system.

Many discriminatory effects are not immediately evident. If, for example, the police use a "predictive policing" tool that automatically generates a patrol plan for police based on certain predictions and crime statistics, this can lead to certain neighborhoods being patrolled more frequently. In such a case, it is not apparent to outsiders that an automated system is the reason why the police are intensifying their presence in a specific area.

What information is necessary to be able to investigate my case?

Please share your experience with us and briefly explain in what ways you perceived the incident to be unjust or discriminatory. Feel free to also upload files that support your points, e.g., photos of letters or screenshots of websites, forms, or web applications. Alternatively, you can send us a short audio recording or video in which you explain the case.

Here is a rough example of what a message to us could look like:

Hello AlgorithmWatch CH Team,

I recently completed an online form and my insurance company X provided a quote at a higher price than someone I know has to pay. I suspect that the automated pricing system associated my personal data and the characteristics of my profile with a higher risk of claims. I assume that my personal situation is factored in...

Please note: Algorithms may discriminate differently from humans and in ways that don’t necessarily make sense to us. Your personal data might, for example, be correlated in non-obvious ways or place you in a very specific group category that is discriminated against, such as "female dog owners above the age of 40 living in postal code area 00000." It is therefore most valuable to reflect on the information you provided in a specific situation, such as when filling out a form for a particular service or purchase.

What can AlgorithmWatch CH do about my individual experience?

We operate as an advocacy and research organization. We conduct journalistic investigations, do academic research, and advocate for improved protection from algorithmic discrimination. This may include holding companies accountable, drawing the attention of the media and the public to problems, and reminding politicians that they must uphold the protection of fundamental rights. Your firsthand experience will help us raise awareness, use evidence to exert pressure on those employing discriminatory systems, and advocate for necessary regulation.

Who can I contact for personal consultations in cases of unjust treatment or discrimination in general?

AlgorithmWatch CH can advise you on incidents specifically concerning algorithmic discrimination and analyze what lies behind them. We follow up on cases when we receive indications that the use of AI systems is leading to discrimination. However, AlgorithmWatch CH is not an official counseling organization and cannot provide legal advice. We therefore recommend that you always contact equality or counseling offices that offer legal advice. Following this FAQ section, we provide a list of equality bodies that you are advised to contact additionally and that can offer you consultation in cases of unjust treatment or discrimination. For persons located in Germany or the EU, please see the information provided by our colleagues at AlgorithmWatch.

How will AlgorithmWatch CH handle my data?

Any personal data, such as your name, email address, or phone number, will be used solely to facilitate communication with you. You can report to us anonymously, use a pseudonym, or simply not provide any contact information if you do not wish to be contacted further. Please indicate your preference among the options provided in the form and let us know whether and to what extent we can use the information you share. To protect the privacy of others, we ask you not to pass on third-party data to us. For additional details, refer to our privacy policy available here.

Can I report a potential case that happened anywhere in the world?

The AlgorithmWatch team based in Switzerland can assist you if you are located in Switzerland, while the team based in Germany handles cases from across the EU. If you are not located in either of these regions or are unsure about an incident you’d like to report, you can still get in touch with us. We may be able to direct you to colleagues with expertise on your region who can assist with your case. See our contact information at the bottom of this page.

Spread the word about our campaign and share it on social media.

Would you like to stay informed about our efforts to combat algorithmic discrimination and other work by AlgorithmWatch? Then you can subscribe to our newsletter here.