Thursday 11 May 2023

Artist sues AI generators for allegedly using work to train image bots


AI-generated images that mimic an artist’s style are a form of identity theft and compete with the very creatives whose work was used to train the models, a fine artist suing two artificial intelligence firms told Fox News.

AI platforms like Midjourney and Stable Diffusion use text and images from across the internet and other sources to train their machines to create images for their consumers.

San Francisco-based artist and illustrator Karla Ortiz, who claims her artwork was used to train the tech, filed a lawsuit in January against both companies for copyright infringement and right of publicity violations.

“Somebody is able to mimic my work because a company let them,” Ortiz told Fox News. “It feels like some sort of industrial-level identity theft.”

“It feels like someone has taken everything that you’ve worked for and allowed someone else to do whatever they want with it for profit,” she said. 

Ortiz said that before she filed suit, she could prompt Midjourney and Stable Diffusion to create imagery “in the style of Karla Ortiz” and the AI platforms would follow suit.

Stability AI, Stable Diffusion’s creator, filed a motion in April to dismiss Ortiz’s case, claiming the artist failed “to identify a single allegedly infringing output image, let alone one that is substantially similar to any of their copyrighted works.” Midjourney filed a similar motion the same day.


This photo illustration shows an artificial intelligence manga artist wearing gloves to protect their identity.
AFP via Getty Images

“For these models to generate the imagery that you see today, or anything for that matter, they have to be first trained on massive amounts of data, data that includes image and text,” Ortiz told Fox News. “That data, it includes everything.”

“It includes people’s medical records, it includes people’s businesses, housing, in some cases people’s likenesses, and in our case as well, pretty much all of my entire artwork and specifically my fine art,” Ortiz continued.

Other artists have similarly scrutinized tech companies’ methods for training their models and the potential for data exploitation in creating machine learning.

Still, artists, including musicians, illustrators and writers, can’t copyright their style, an attorney told Fox News last month.

After she filed her suit, Ortiz said Midjourney and Stable Diffusion stopped using data pulled from her art to create images.


The website of Midjourney, an artificial intelligence tool capable of creating AI art, is seen on a smartphone.
Photothek via Getty Images

But her concerns remain.

“It generates imagery that is meant to look like yours and potentially even compete in your own market, utilizing your own name and your own work,” Ortiz said. “You are competing with a digital copy of yourself, with a machine that does not sleep, does not rest and does not get paid.” 

AI could impact up to 300 million jobs worldwide, according to a March report from Goldman Sachs. And IBM recently announced it would pause hiring for roles that AI could replace, with CEO Arvind Krishna predicting that up to 30% of non-customer-facing roles — or nearly 8,000 jobs — could be eliminated in the next five years.

“It’s not going to be just painters and illustrators and voice actors and musicians,” Ortiz told Fox News. “This is coming for almost every white-collar job you can imagine.”

“And again, it’s all done with our data,” she said. “It’s all done with our work.”

Neither Midjourney nor Stability AI returned requests for comment.
