Meta now admits that all content posted publicly on Facebook and Instagram since 2007 has been used to train the company’s AI models. This applies to users in all countries except Brazil and the EU.
During an Australian Senate inquiry, Senator Tony Sheldon of the Australian Labor Party questioned Meta’s Global Privacy Director Melinda Claybaugh about whether the company used user data to build its AI. Initially, Claybaugh denied it, but Australian Greens Senator David Shoebridge challenged Meta’s claim.
– The truth of the matter is that unless you have consciously set those posts to private since 2007, Meta has just decided that you will scrape all of the photos and all of the texts from every public post on Instagram or Facebook since 2007, unless there was a conscious decision to set them on private. That’s the reality, isn’t it? he said, according to ABC News.
– Correct, Claybaugh admitted, adding that no content from children under the age of 18 was used. She could not say, however, whether the company used data posted in earlier years by users who are now adults but were under 18 when they created their accounts.
This means that all images, posts and comments published publicly on Meta’s platforms since 2007 have been used to train its AI models.
Tougher laws in the EU
In June, Meta announced that user data would be used to train AI models, but that users in the EU could opt out due to uncertainty around EU privacy laws.
– In Europe there is an ongoing legal question around what is the interpretation of existing privacy law with respect to AI training, Claybaugh said, and continued:
– We have paused launching our AI products in Europe while there is a lack of certainty.
However, users in other parts of the world do not have that choice, except in Brazil, which recently banned Meta from using user data to train AI, according to The Verge.
Claybaugh justified Meta’s use of the data by saying that the company wants to provide the most “flexible and powerful” AI tool, and that large amounts of data are needed to deliver a safer product with less bias.