OpenAI claims New York Times’ prompting strategy violates its terms of service


According to OpenAI, the prompts the New York Times used to generate exact copies of its content violate the terms of use of OpenAI's language models.

According to the complaint, the New York Times prompted GPT models with the beginnings of its original articles, causing the models to complete the text as closely as possible to the original.

The NYT used the beginnings of its articles as prompts to provoke completion. According to OpenAI, this violates its terms of service. | Image: Screenshot of the complaint

This prompting strategy increases the likelihood that the model will spit out original training data; it provokes the copyright infringement, so to speak. In chats as we know them from ChatGPT, such output would be much less likely, or even impossible, with ordinary prompts.
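To make the mechanism concrete, here is a minimal toy sketch of why prefix prompts can elicit memorized text. The "model" below is a hypothetical literal lookup over its training corpus, not how GPT models actually work (real LLMs memorize statistically, not via lookup), but it illustrates the effect: a prompt that is the exact beginning of a training document yields the original continuation verbatim, while an ordinary conversational question does not.

```python
# Hypothetical stand-in for a copyrighted article in the training data.
TRAINING_CORPUS = [
    "The city council voted on Tuesday to approve the new budget, "
    "which includes funding for schools, parks, and public transit.",
]

def complete(prompt: str, max_chars: int = 50) -> str:
    """Return the memorized continuation of `prompt`, if any.

    A toy model that has literally memorized its corpus: if the prompt
    matches the beginning of a training document, it emits the original
    continuation verbatim.
    """
    for doc in TRAINING_CORPUS:
        if doc.startswith(prompt):
            return doc[len(prompt):][:max_chars]
    return ""  # an ordinary chat-style prompt finds no verbatim match

# Prompting with the beginning of the "article" reproduces the original:
prefix = "The city council voted on Tuesday to approve"
print(prefix + complete(prefix))

# A regular conversational prompt does not:
print(repr(complete("What did the city council decide?")))
```

The toy captures the asymmetry at issue: the output depends heavily on how the prompt is constructed, which is precisely why OpenAI argues the prompts themselves matter.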

OpenAI says New York Times prompts violate its terms of service

According to Tom Rubin, OpenAI's head of intellectual property and content, the New York Times deliberately used these manipulative prompts to reproduce training data.

The prompts used in the complaint are not examples of intended use or normal user behavior, Rubin said in an email to the Washington Post, and they violate OpenAI's terms of service.

In addition, many of the examples are no longer reproducible. OpenAI is continually working to make its products more resistant to "this type of misuse," Rubin said.

Is it the prompt or the result that counts?

The dispute between OpenAI and the New York Times may come down to whether the memorization of individual training examples by large language models is a bug or a feature.

Does the prompt that generates an output matter, or is only the output relevant? And is it copyright infringement if that output closely resembles an existing, unlicensed work?

If courts treat output that closely resembles an original as copyright infringement in itself, then the question of potential copyright infringement through training data, or through misquotations and reproductions, becomes essentially moot.

