OpenAI: The New York Times is forcing us to turn over 20 million ChatGPT conversations
A judge in The New York Times’ copyright lawsuit against OpenAI (and Microsoft) has ordered the ChatGPT maker to hand over the conversations of 20 million users to the Times’ lawyers, in an effort to find examples of copyright violations.
Today, OpenAI is lobbying the public in a last-ditch effort to prevent the release, which is due Friday:
“The New York Times is demanding that we turn over 20 million of your private ChatGPT conversations. They claim they might find examples of you using ChatGPT to try to get around their paywall. This demand disregards long-standing privacy protections, breaks with common-sense security practices, and would force us to turn over tens of millions of highly personal conversations from people who have no connection to the Times’ baseless lawsuit against OpenAI.”
If the company’s final appeals to the court do not succeed, OpenAI says it will de-identify the chat logs, scrubbing any personally identifying information, and that only technical experts hired by The New York Times’ legal team will be able to examine the data, under tightly controlled conditions.