It’s a “fake PR stunt”: Artists hate Meta’s AI data deletion process

Key Takeaways:

– Concerns have arisen about the use of artists’ and writers’ work in training AI models without their consent.
– Some companies, such as OpenAI, have implemented opt-out programs to give individuals the choice to remove their work from future models; Meta’s data deletion request form has been widely mistaken for one.
– However, Meta’s data deletion request form has been found to be ineffective by artists who have tried to use it.
– Meta requires evidence that personal information appears in responses from its generative AI in order to process deletion requests.
– Meta has not disclosed the specifics of the data it has trained its models on, making it difficult for artists to determine which prompts to use for evidence.
– Meta has clarified that the request form is not an opt-out tool and has no intention of deleting information from its own platforms.
– Depending on local laws, individuals may be able to exercise data subject rights to object to third-party information being used in training models.
– It is uncertain if the data deletion request form will be successful in helping individuals gain control over the use of their data by AI companies.
– Meta has not provided information on how many deletion requests it has fulfilled and has no plans for an opt-out program in the future.

Ars Technica:


As the generative artificial intelligence gold rush intensifies, concerns about the data used to train machine learning tools have grown. Artists and writers are fighting for a say in how AI companies use their work, filing lawsuits and publicly agitating against the way these models scrape the internet and incorporate their art without consent.

Some companies have responded to this pushback with “opt-out” programs that give people a choice to remove their work from future models. OpenAI, for example, debuted an opt-out feature with its latest version of the text-to-image generator Dall-E. This August, when Meta began allowing people to submit requests to delete personal data from third parties used to train Meta’s generative AI models, many artists and journalists interpreted this new process as Meta’s very limited version of an opt-out program. CNBC explicitly referred to the request form as an “opt-out tool.”

This is a misconception. In reality, there is no functional way to opt out of Meta’s generative AI training.

Artists who have tried to use Meta’s data deletion request form have learned this the hard way and have been deeply frustrated with the process. “It was horrible,” illustrator Mignon Zakuga says. Over a dozen artists shared with WIRED an identical form letter they received from Meta in response to their queries. In it, Meta says it is “unable to process the request” until the requester submits evidence that their personal information appears in responses from Meta’s generative AI.

Mihaela Voicu, a Romanian digital artist and photographer who has tried to request data deletion twice using Meta’s form, says the process feels like “a bad joke.” She’s received the “unable to process request” boilerplate language, too. “It’s not actually intended to help people,” she believes.

Bethany Berg, a Colorado-based conceptual artist, has received the “unable to process request” response to numerous attempts to delete her data. “I started to feel like it was just a fake PR stunt to make it look like they were actually trying to do something,” she says.

As artists are quick to point out, Meta’s insistence that people provide evidence that its models have trained on their work or other personal data puts them in a bind. Meta has not disclosed the specifics about which data it has trained its models on, so this set-up requires people who want to remove their information to first figure out which prompts might elicit responses that include details about themselves or their work.

Even if they do submit evidence, it may not matter. When asked about mounting frustration with this process, Meta responded that the data deletion request form is not an opt-out tool, emphasizing that it has no intention of deleting information found within its own platforms. “I think there is some confusion about what that form is and the controls we offer,” Meta spokesperson Thomas Richards told WIRED via email. “We don’t currently offer a feature for people to opt-out of their information from our products and services being used to train our AI models.”

But what about information from across the internet—from, for example, data sets containing millions of images? “For slightly more context on the request form, depending on where people live, they may be able to exercise their data subject rights and object to certain third-party information being used to train our AI models,” Richards says. “Submitting a request doesn’t mean that your third-party information will be automatically removed from our AI training models. We’re reviewing requests in accordance with local laws, as different jurisdictions have different requirements. I don’t have more details though on the process.” Richards cited the European Union’s General Data Protection Regulation as an example of a law under which one might exercise data subject rights.

In other words: The data deletion request form gives some people the ability to request—not demand, not insist, but request—that some of their data from third-party sources be removed from AI training models. So don’t call it an opt-out tool.

WIRED has been unable to locate anyone who has successfully had their data deleted using this request form. (It’s far easier to find people who have unsuccessfully petitioned for their data to be left out of future training models.) Meta did not provide numbers on how many requests it has fulfilled. Richards did note that Meta does not have plans for an opt-out program in the future.

It’s unclear whether this form will end up helping anyone gain control over the way AI companies use their data. It does, however, provide a new example of how inadequate this type of tool is.

This story originally appeared on wired.com.

