If the UN wants to help humanity, it should not fall for AI hype

What should the international governance of AI look like? This is the thorny question the UN Secretary General’s AI Advisory Body attempts to address in its first interim report. In a recent consultation process, we highlighted some concerning aspects of the report.

We at AlgorithmWatch welcome the UN’s initiative to advance recommendations for an international governance framework for AI. Why does the initiative matter? With its 193 member states, the UN is the world’s largest international organization. UN guidelines – even if not legally binding – can serve as a compass for national and international policy-making.

Our main concern regarding the interim report is that it falls for the AI hype. It suggests – without any evidence – that AI is a solution to many of humanity’s serious problems, such as climate change, poverty, or hunger. Not only does this view dismiss the factors that have contributed to these global issues and continue to feed them, such as colonialist exploitation, power asymmetries, and wars. It also subscribes to the delusion that technological innovation will inevitably lead to a universal improvement in society.

We are also concerned that the report advocates for the adoption of a new technology without emphasizing the need to ensure that it won’t worsen existing issues or create new problems. If we decide to tackle climate change with AI-enabled systems, for instance, we need to ensure that the development and use of AI systems – which often consume large amounts of energy and generate substantial carbon emissions – respect planetary boundaries.

If we want to make use of the potential of algorithms and AI – and if we want to ensure that everyone benefits from it, not just a few – then we have a responsibility to tackle challenges in a serious and evidence-based manner.
