The Guardian has prevented OpenAI from using its content to power artificial intelligence products like ChatGPT. Concerns that OpenAI uses unlicensed content to create AI tools have led to writers bringing lawsuits against the company. The creative industries are calling for safeguards to protect their intellectual property.
Generative AI technology – the term for products that generate convincing text, images and audio from simple human prompts – has amazed the public since a breakthrough version of OpenAI's ChatGPT chatbot launched last year. However, fears have arisen about the potential for mass production of disinformation and about how such tools are built.
The technology behind ChatGPT and similar tools is “trained” by being fed vast amounts of data culled from the open internet, including news articles, enabling the tools to predict the likeliest word or sentence to come after the user’s prompt.
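At a greatly simplified level, that prediction step can be illustrated with a toy model that simply counts which word most often follows another in a small text sample. The sketch below is purely illustrative and uses a tiny made-up corpus; real systems such as ChatGPT learn from vast web-scale text with neural networks rather than raw word counts.

from collections import Counter, defaultdict

# Toy "training data" standing in for the vast web text real models ingest.
corpus = (
    "the guardian publishes news articles every day "
    "the guardian blocks the crawler "
    "news articles feed the model"
).split()

# Count how often each word follows another (a simple bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    # Return the word most often seen after `word` in the toy corpus.
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))   # -> "guardian" (its most frequent follower)
print(predict_next("news"))  # -> "articles"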
OpenAI, which does not disclose the data used to build the model behind ChatGPT, announced in August that it would allow website operators to block its web crawler from accessing their content. However, the move does not remove material already scraped from existing training datasets. A number of publishers and websites are now blocking the GPTBot crawler.
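The blocking mechanism works through a site's robots.txt file: OpenAI's published guidance is for operators to add a rule disallowing the GPTBot user agent. As a rough sketch, an operator could verify the effect with Python's standard robotparser module (the domain below is a placeholder, not a real example from the article):

import urllib.robotparser

# Hypothetical domain for illustration; replace with the site to check.
robots_url = "https://www.example.com/robots.txt"

parser = urllib.robotparser.RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetch and parse the live robots.txt

# A robots.txt that blocks OpenAI's crawler typically contains:
#   User-agent: GPTBot
#   Disallow: /
blocked = not parser.can_fetch("GPTBot", "https://www.example.com/")
print("GPTBot blocked:", blocked)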
According to a spokesperson for Guardian News & Media, scraping intellectual property from the Guardian’s website for commercial purposes has always been contrary to its terms of service. The Guardian’s commercial licensing team maintains mutually beneficial commercial relationships with developers around the world and looks forward to building further such relationships.
According to Originality.ai, a company that detects AI-generated content, news websites now blocking the GPTBot crawler, which takes data from webpages to feed into OpenAI’s AI models, include CNN, Reuters, the Washington Post, Bloomberg, the New York Times and its sports site the Athletic. Other sites blocking GPTBot include Lonely Planet, Amazon, the job listings site Indeed, the question-and-answer site Quora and Dictionary.com.