
This tool can mask digital artwork so the AI can’t mimic its style


Thanks to the power of AI, it is now possible to reproduce distinctive artistic styles in minutes – an innovation that leaves traditional artists in the lurch as their art is used to train AI models that then steal job opportunities.

What if you could stop AI models from replicating your art style?

Researchers at the University of Chicago have created a tool they believe will do just that: a filter which, when applied to an image, prevents the image from being read and reproduced by AI tools.

Called “Glaze,” a beta version of the free tool was released for download last week.

AI art can be produced instantly, but only because AI pulls data from thousands of artworks on the internet that have taken human artists weeks or even months to create.

The creators claim that Glaze will allow artists to protect their distinct artistic style from being absorbed into the pool of data that AI art tools rely on.

“Artists really need this tool; the emotional and financial impact of this technology on them is very, very real,” said Ben Zhao, Neubauer Professor of Computer Science at the University of Chicago, in a February press release. “We spoke to teachers who saw students dropping out of their classrooms because they thought there was no hope for the industry, and to professional artists who saw their style ripped off left and right.”

The project involved surveying more than 1,100 professional artists, according to the release. The tool was tested on 195 historical artists, as well as four currently active artists, before a focus group assessed Glaze’s accuracy in disrupting AI imitation.

Over 90% of artists surveyed said they were willing to use the tool when posting their art.

Glaze is the second project from the University of Chicago’s SAND Lab to protect images published online. In 2020, SAND Lab created a tool that protects personal photos so they can’t be used to train facial recognition software. But when the researchers started applying the same concept to art, a few issues immediately arose.

Photos of human faces may boil down to a few distinct characteristics, but art is much more complex, with an artistic style defined by many things, including brushstrokes, color palettes, light and shadow, as well as texture and positioning.

To confuse AI tools and ensure that they cannot read and reproduce an art style, the researchers had to isolate which parts of an artwork AI art tools treat as key indicators of style.

“We don’t need to change all of the image information to protect the artists, we only need to change the style features,” said Shawn Shan, a UChicago computer science graduate student who co-wrote the study, in the press release. “So we had to devise a way to essentially separate the stylistic characteristics of the image from the object, and only try to disrupt the stylistic characteristics using the cloak.”

To do this, the researchers used a “fighting fire with fire” approach. Glaze works by using AI to identify the style characteristics that change when an image passes through a filter that transforms it into a new art style, like cubism or watercolor, then adjusting those characteristics just enough to fool other AI tools.

They target the “Achilles’ heel of AI models”: “a phenomenon called adversarial examples, small adjustments in inputs that can produce massive differences in how AI models classify inputs,” according to the website.

Basically, Glaze very slightly alters these key elements of an artwork, leaving the original nearly identical to the naked eye, so that other AI tools cannot recognize, and therefore replicate, its individual style.
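
To make the idea concrete, here is a minimal, hypothetical sketch of this kind of style cloaking. It is not the Glaze implementation: the `style_encoder` stands in for whatever feature extractor an AI art model might use, `decoy` is assumed to be the same artwork already re-rendered in an unrelated style, and the budget values are made up for illustration. The optimization simply nudges the image’s style features toward the decoy’s while capping how much any pixel can change.

```python
# Illustrative sketch only (not the actual Glaze code): a gradient-based
# "cloak" that pushes an artwork's style features toward a decoy style
# while keeping the pixel-level change small and hard to see.
import torch
import torch.nn.functional as F

def cloak(image, decoy, style_encoder, budget=0.05, steps=200, lr=0.01):
    """Return `image` plus a small perturbation whose style features
    resemble those of `decoy` (e.g. the same piece redone as watercolor)."""
    delta = torch.zeros_like(image, requires_grad=True)   # the perturbation
    target = style_encoder(decoy).detach()                 # decoy style features
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        cloaked = (image + delta).clamp(0, 1)
        # Pull the cloaked image's style features toward the decoy's features...
        loss = F.mse_loss(style_encoder(cloaked), target)
        loss.backward()
        opt.step()
        # ...while capping the perturbation so the visible change stays small.
        delta.data.clamp_(-budget, budget)
    return (image + delta.detach()).clamp(0, 1)
```

A downstream model that reads style from those features then sees the decoy style rather than the artist’s own, even though a human viewer sees an almost unchanged image.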

“We let the model teach us which parts of an image most relate to the style, then we use that information to come back and attack the model and mislead it into recognizing a different style than what the art actually uses,” Zhao said.

If an AI tool designed to mimic artistic styles tries to replicate an artwork with Glaze applied, it will read that artwork as having a different style, such as Vincent van Gogh’s, and will produce an imitation in that style instead.

Although many AI art tools have already had the chance to learn from thousands of unmasked images online, introducing more Glaze-masked images will make those tools less effective at imitation, according to the researchers.

To use it, artists can download Glaze to their computer and run it over images they want to conceal from AI. They can also customize how much modification Glaze introduces: small changes are nearly invisible but offer less protection, while larger changes can be more visible but offer much more protection.
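
In terms of the hypothetical sketch above, that intensity setting corresponds roughly to the perturbation budget; a made-up usage might look like the following, trading a more visible change for stronger disruption of style mimicry.

```python
# Hypothetical usage of the cloak() sketch above (names are assumptions):
subtle   = cloak(art, decoy, style_encoder, budget=0.02)  # nearly invisible, weaker
stronger = cloak(art, decoy, style_encoder, budget=0.10)  # more visible, stronger
```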

“The majority of artists we spoke to had already taken action against these models,” Shan said. “They started deleting their art or only uploading low-resolution images, and those measures are bad for their careers because that’s how they get jobs. With Glaze, the more you perturb the image, the better the protection. And when we asked artists what they were comfortable with, a number chose the highest level. They are willing to tolerate large perturbations because of the devastating consequences if their styles are stolen.”

Source: CTV News Canada
