New study: Microsoft’s Bing Chat

AI chatbot produces misinformation about elections

Bing Chat, the AI-driven chatbot on Microsoft’s search engine Bing, makes up false scandals about real politicians and invents polling numbers. Microsoft seems unable or unwilling to fix the problem. These findings are based on a joint investigation by AlgorithmWatch and AI Forensics, the final report of which has been published today. We tested whether the chatbot would provide factual answers when prompted about the Swiss, Bavarian and Hessian elections that took place in October 2023.

Clara Helming
Senior Advocacy & Policy Manager

Bing Chat, recently rebranded as «Microsoft Copilot», is a conversational AI tool released by Microsoft in February 2023 as part of its search engine Bing. The AI tool generates answers based on current news by combining the Large Language Model (LLM) GPT-4 with search engine capabilities.

In this investigation, we tested whether the generative chatbot would provide correct and informative answers to questions about the federal elections in Switzerland as well as the state elections in Bavaria and Hesse that took place in October 2023. We prompted the chatbot with questions about candidates, polling and voting information, as well as more open-ended requests for voting recommendations on specific subjects, such as the environment. We collected the chatbot’s answers from 21 August 2023 to 2 October 2023.

What we found