Microsoft’s Copilot AI search is making up fake Vladimir Putin quotes from press conferences that never happened
We wanted to know what Putin thought about Alexei Navalny’s death. Boy did the ChatGPT-powered Bing add-on give us answers.
Microsoft really wants you to use its new Copilot AI-assisted search. Not only was the service featured in an expensive Super Bowl spot, but the company has recently touted the use of its AI for newsrooms and journalistic enterprises, like the Online News Association and the Craig Newmark Graduate School of Journalism at CUNY. While doing a bit of testing, we decided to ask Copilot for some news stories on the death of Alexei Navalny, the Russian opposition leader who recently died under suspicious circumstances in a Russian penal colony. We asked the assistant — whose responses often but not always link to news coverage — about Russian President Vladimir Putin’s response to President Biden’s statement blaming Putin for Navalny's death.
On Wednesday, the search chatbot gave us several different statements attributed to Putin, depending on how we asked the question, and the responses were very specific:
Vladimir Putin, the Russian President, responded to Joe Biden's statement regarding Aleksei Navalny's death with a mix of defensiveness and dismissal. In his official address, Putin emphasized that Navalny was a 'criminal' who had violated Russian law and was rightfully imprisoned. He accused the West of using Navalny as a political pawn to undermine Russia's sovereignty and stability. Putin's tone was firm, and he avoided directly addressing the allegations of responsibility for Navalny's demise.
In another response, Copilot suggested Putin said this during a press conference:
"President Biden's accusations are baseless and politically motivated. We categorically reject any responsibility for Navalny's demise. The Russian government has consistently maintained that Navalny's health was a matter for our domestic legal system to handle. His death is a tragedy, but it is not our doing.”
“Navalny was a convicted criminal, and his actions were a threat to our national security. He was not a martyr; he was a provocateur. Let the world judge us as it will, but we will not be swayed by foreign pressure or interference.”
The problem is, there was no "official address." There was no statement. Putin has not said anything in the wake of the activist's demise. In one response, the assistant told us that Putin denied responsibility for Navalny's death and instead lobbed criticism at Biden: "the United States should focus on its own internal issues rather than meddling in Russia's affairs." Another response had Putin challenging Biden to engage in a live on-air conversation on the matter, saying, "It takes one to know one." That quote linked to a 2021 BBC story in which Putin did say that, in regard to Biden calling him a "killer," but you'd have to click through to the news story to find that out. The assistant also prompted us to see how other world leaders responded to Putin's (fake) statement.
"We have investigated this report and are making changes to refine the quality of our responses," a Microsoft spokesperson told Sherwood when asked about the fabricated stories. "As we continue to improve the experience, we encourage people to use their best judgment when viewing results, including verifying source materials and checking web links to learn more."
This is hardly the first time that Copilot — or generative-AI assistants in general — have spewed misinformation. But Copilot, which is embedded across Microsoft products, typically links to news stories, giving users the impression that the information it's sharing is credible and not another AI hallucination. While the company has warned users that its tool might give "incorrect" information and that they should check their facts, it presents no such caveats within the tool itself.
Instead, Microsoft is billing Copilot as a more up-to-date version of ChatGPT and an "everyday AI companion" to help regular people, businesses, and even news organizations. In other words, it's trying to gain our trust while also potentially contributing to a misinformation feedback loop.
We're headed into a presidential election in an online environment that's already rife with misinformation, which is difficult for experts, let alone regular readers, to parse. The question now is what companies like Microsoft are going to do to rein in the AI they've already unleashed on the world.
Updated 2/23/24 to include additional text responses from Copilot