OpenAI, the maker of artificial intelligence (AI) chatbot market leader ChatGPT, has filed a motion seeking dismissal of some of the claims brought against it by the New York Times (NASDAQ: NYT).

Filed in the Southern District of New York, the motion took aim at some of the NYT’s allegations of copyright infringement, claiming that many don’t stand up against prevailing digital copyright laws.

OpenAI alleges that the NYT “paid someone to hack OpenAI’s products,” adding that the newspaper made tens of thousands of attempts to manipulate ChatGPT into producing results that support its allegations. The NYT also allegedly exploited a bug in ChatGPT that OpenAI has been working to address.

“Normal people do not use OpenAI’s products in this way,” OpenAI stated.

The legal battle between the Times and OpenAI started in December when the newspaper sued the AI startup over the use of its copyrighted work. The NYT alleged that the startup used millions of its articles to train its large language models (LLMs). Chatbots using these LLMs, including ChatGPT, now compete with the NYT as reliable information sources.

Moreover, ChatGPT reproduces some of the NYT articles it was trained on verbatim. With the paper’s articles sitting behind a paywall, ChatGPT gives users a way around it, denying the company a valuable revenue source (of the $2.4 billion the NYT generated last year, $1.65 billion came from subscriptions).

The NYT’s lead counsel, Ian Crosby, dismissed the hacking claims as OpenAI’s attempt to distract the court from the facts.

“What OpenAI bizarrely mischaracterizes as ‘hacking’ is simply using OpenAI’s products to look for evidence that they stole and reproduced the Times’s copyrighted works. And that is exactly what we found,” he stated.

According to OpenAI’s motion, the NYT’s lawsuit relies on regurgitation and hallucination. The first occurs when an AI model generates responses that closely resemble its training data; the second, when it generates responses that are wrong but appear plausible. The company says it’s working to eliminate both problems and that its products carry warnings about them.

OpenAI vs the world

The battle between the two hinges on American copyright law and how the Manhattan court will interpret it. OpenAI is relying on the fair use doctrine, under which copyrighted content can be used in transformative ways to create new and innovative products. The doctrine also gives leeway for copying for scholarship, teaching, and research.

Copyright experts are divided on the legal battle. Some point out that the law doesn’t prohibit the use of copyrighted material to learn, and this is precisely what LLMs have been doing.

As author Dan Jeffries summarized, “Young quarterbacks do not have to call up Tom Brady to get permission to study his throwing motion to learn to throw a football.”

Others, like Cornell University’s James Grimmelmann, point out that when an AI model can produce the NYT articles verbatim, then it’s no longer protected by the copyright leeway.

Additionally, courts weigh commerciality: this latitude is more readily extended when the use isn’t commercialized.

OpenAI’s battle against the NYT is one of several it’s facing over copyright infringement. This week, three digital outlets sued the company, including Edward Snowden-affiliated The Intercept. Elon Musk, one of the original minds behind the company and an early investor, also sued it on Thursday.

Watch: Cybersecurity fundamentals in today’s digital age with AI & Web3


New to blockchain? Check out CoinGeek’s Blockchain for Beginners section, the ultimate resource guide to learn more about blockchain technology.
