
Project “FINDHR”
How to Reduce Discrimination when Using Algorithms in Recruitment
AI that pre-sorts job applications? As part of the Horizon Europe project “FINDHR,” we developed algorithms, methods, and training to reduce discrimination when AI is used in recruitment. Here are our findings.

Project completed
Project duration: November 2022 to January 2026
How can you select the right candidates from a large pool of applicants? Job application and recruitment processes are challenging and time-consuming – so the desire to automate part of the work is understandable. Algorithmic or AI-based systems, for example, enable recruiters to pre-sort resumes or to rank applicants. Such systems can, however, also reproduce discriminatory patterns – often without anyone noticing – and make it more difficult for candidates to access the labor market.
In the project “FINDHR – Fairness and Intersectional Non-Discrimination in Human Recommendation”, we worked with an interdisciplinary European consortium from academia, industry, and civil society to develop solutions that counteract this type of discrimination.
Algorithmic discrimination in recruiting
What does it mean?
The use of AI-based recruitment systems promises time savings and efficiency gains for HR managers – while still enabling them to find the best candidates. Experience shows, however, that such tools reproduce patterns of discrimination and can even exacerbate discriminatory barriers in the labor market. Intersectional discrimination, where the combination of several personal characteristics (such as gender, age, religion, origin, or sexual orientation) creates new forms of discrimination or multiplies existing ones, poses a particular challenge.
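To make the intersectional effect concrete, here is a small, entirely fabricated numeric illustration (this is not FINDHR data): selection rates can look balanced for each attribute in isolation while some intersectional subgroups are never selected at all.

```python
from itertools import product

# Fabricated screening outcomes: (gender, age_group, selected)
applicants = [
    ("f", "young", 1), ("f", "young", 1), ("f", "old", 0), ("f", "old", 0),
    ("m", "young", 0), ("m", "young", 0), ("m", "old", 1), ("m", "old", 1),
]

def selection_rate(rows):
    """Share of applicants in `rows` who passed the screening."""
    return sum(sel for _, _, sel in rows) / len(rows) if rows else 0.0

# Looking at each attribute alone, every group has the same 0.50 rate ...
for idx, name in [(0, "gender"), (1, "age")]:
    for value in sorted({row[idx] for row in applicants}):
        rate = selection_rate([r for r in applicants if r[idx] == value])
        print(f"{name}={value}: {rate:.2f}")  # all print 0.50

# ... but the intersections reveal two subgroups that are never selected.
for g, a in product(("f", "m"), ("young", "old")):
    rate = selection_rate([r for r in applicants if r[:2] == (g, a)])
    print(f"{g}/{a}: {rate:.2f}")  # f/old and m/young: 0.00
```

An audit that only checks one attribute at a time would report this system as perfectly balanced.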
One of the best-known examples of algorithmic discrimination in hiring processes is a system that Amazon is said to have developed. Even in the testing phase, reports indicated that its recommendation algorithm discriminated against women by filtering out their resumes. Amazon stated that the project was terminated before the software was put into use; attempts to mitigate the disadvantaging of women did not seem to have been successful.
How are people affected?
Algorithmic discrimination in hiring is not a theoretical construct, but a reality for many people. As part of FINDHR, we discussed this with people from underrepresented groups in seven European countries (Albania, Bulgaria, Germany, Greece, the Netherlands, Italy, Serbia). Many reported feelings of powerlessness and frustration: despite their qualifications, numerous applications, and efforts, they received only automated rejections – often outside of normal working hours – making it unlikely that their application had ever been reviewed by a human being. Some adjust their CVs, and with them a part of their identity, to better align with the algorithm: they change names or their spelling so that they sound “more Western” or are easier to read. Some retouch their photos to appear older or younger, or downplay their professional experience so they are not filtered out as overqualified or “too old.” Discrimination in the labor market – by algorithms or by human beings – pushes job seekers into precarious circumstances.
Solutions that counteract discrimination
What do you need to consider when developing and using AI hiring systems to reduce the risk of discrimination? We spent three years developing tools, guidelines and training courses within FINDHR to reduce algorithmic discrimination in hiring procedures. These are now publicly and freely available.
- FINDHR Toolkits with concrete recommendations for software developers, HR professionals, and policymakers.
- Guidelines and methods for inclusive software design and for the responsible use, auditing, and monitoring of algorithmic recruiting systems.
- Technical tools and software to reduce the risk of algorithmic discrimination in AI hiring systems.
- Training programs for professionals to build awareness of the risks of algorithmic discrimination in hiring.
- Experiences from those affected and tips for job seekers, drawing attention to the often invisible barriers in the job search.
FINDHR Toolkits – just hiring!
To effectively address algorithmic discrimination by AI hiring systems, we need an interdisciplinary approach and the active engagement of various stakeholders. That's why we condensed FINDHR’s key results and findings into three target-group-specific toolkits. They contain well-founded background information as well as concrete recommendations for taking action against algorithmic discrimination in recruitment. The toolkits are available both as a web version and as PDFs.
Guidelines and methods
FINDHR firmly believes that discrimination does not arise exclusively at the technical level and cannot solely be tackled there: the social and cultural context within which a system is developed and implemented must also be taken into account. Our legal analysis further clarifies the tensions between data protection regulations and anti-discrimination regulations in Europe – and indicates that there is also a clear need for action at the political level.
In short: Algorithmic discrimination in recruitment requires an interdisciplinary approach and must be actively addressed throughout the entire life cycle of a system. The following FINDHR guideline documents contain recommendations and methods for doing so.
The Software Development Guide helps developers design, implement, and maintain new recruitment systems that are fair and inclusive.
The Impact Assessment and Auditing Framework shows how algorithmic systems can be audited in a fair, legally sound, and ethical manner to identify and reduce discriminatory patterns at an early stage.
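As a rough illustration of the kind of metric such an audit can start from (this is a generic measure from employment-discrimination practice, not the FINDHR framework itself): the “impact ratio” compares each group’s selection rate with that of the most-selected group, and ratios below 0.8 – the “four-fifths rule” used in US employment law – are commonly treated as a red flag worth investigating.

```python
def impact_ratios(outcomes):
    """outcomes: dict mapping group label -> list of 0/1 screening decisions.
    Returns each group's selection rate divided by the highest group rate."""
    rates = {g: sum(v) / len(v) for g, v in outcomes.items()}
    reference = max(rates.values())
    return {g: rate / reference for g, rate in rates.items()}

# Fabricated decisions: group B passes the screen half as often as group A.
decisions = {"A": [1, 1, 1, 0, 1], "B": [1, 0, 0, 0, 1]}
print(impact_ratios(decisions))
# {'A': 1.0, 'B': 0.5} – B falls well below the 0.8 threshold
```

A real audit goes far beyond a single ratio – including intersectional subgroups, statistical significance, and the legal context – but this is the shape of the question being asked.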

Technical tools and software
Within FINDHR, we developed not only recommendations but also concrete technical solutions to prevent discrimination by recruitment systems.
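One family of techniques from the fairness literature that such tools can build on is fairness-aware re-ranking: re-ordering a score-sorted candidate list so that protected candidates reach a minimum share in every top-k prefix. The sketch below is a simplified illustration of that idea, not the FINDHR software itself; the function name, the tuple format, and the `min_share` parameter are invented for this example.

```python
import math

def fair_rerank(candidates, min_share=0.4):
    """candidates: list of (name, score, is_protected) tuples.
    Re-ranks by score while ensuring every top-k prefix of the result
    contains at least floor(min_share * k) protected candidates."""
    protected = sorted((c for c in candidates if c[2]), key=lambda c: -c[1])
    others = sorted((c for c in candidates if not c[2]), key=lambda c: -c[1])
    result = []
    while protected or others:
        required = math.floor(min_share * (len(result) + 1))
        have = sum(1 for c in result if c[2])
        if protected and (have < required or not others):
            result.append(protected.pop(0))   # quota forces a protected pick
        elif others and (not protected or others[0][1] >= protected[0][1]):
            result.append(others.pop(0))      # otherwise: best score wins
        else:
            result.append(protected.pop(0))
    return [name for name, _, _ in result]

# Fabricated candidate pool; only "Dana" belongs to the protected group.
pool = [("Ana", 0.9, False), ("Ben", 0.8, False),
        ("Cem", 0.75, False), ("Dana", 0.7, True), ("Eli", 0.6, False)]
print(fair_rerank(pool, min_share=0.34))
# ['Ana', 'Ben', 'Dana', 'Cem', 'Eli'] – Dana moves ahead of Cem
```

With `min_share=0.34`, the top 3 must already contain at least one protected candidate (floor(0.34 · 3) = 1), so the slightly lower-scored protected candidate moves up one slot while the rest of the score order is preserved.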


Anti-discriminatory trainings for professionals
We have developed “Anti-discrimination training for algorithmic hiring” aimed at HR managers, software developers, scientists, and other professionals who want to learn how algorithmic discrimination occurs in hiring processes and what they can do to reduce it.
Experiences from those affected and tips for job seekers
Discrimination can only be effectively reduced if those affected are actively heard and involved. Our FINDHR reports provide insights into real experiences of discrimination in algorithmic hiring.
Based on insights from the FINDHR focus groups, we developed a practical manual for job seekers. It offers resources and tips to help optimize job applications and CVs for algorithmic recruitment processes.
These partners were involved in the project "FINDHR":
- Universitat Pompeu Fabra
- Universiteit van Amsterdam
- Università di Pisa
- Max-Planck-Institut für Sicherheit und Privatsphäre
- Radboud Universiteit
- Universiteit Utrecht
- Women in Development Europe+
- Praksis Association
- Eticas Research and Consulting
- Randstad
- Adevinta
- AlgorithmWatch CH


This article is part of a project that has received funding from the European Union's Horizon Europe research and innovation program under grant agreement No 101070212. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union. Neither the European Union nor the granting authority can be held responsible for them.
This work is supported by the Swiss State Secretariat for Education, Research and Innovation (SERI) under contract number 22.00151.


