Meta Is Using Your Data to Train Its AI Chatbot, But You Can Stop It

Meta had planned to start collecting publicly available user data and information shared with its artificial intelligence (AI) chatbot, Meta AI, to train its AI models in the UK from June 26. However, the Irish Data Protection Commission (DPC) told the social media giant to postpone the plan, giving the regulator time to address users' privacy concerns. While such data protection measures exist at an institutional level in the UK and Europe, most of the world does not have them.

For instance, Meta has been collecting its US users' data across all of its platforms, and beyond, since last year. The tech giant also collects such data from users in a large number of countries, including India. This means any data a user has made publicly available or shared with Meta AI has likely already been used to personalise ads and to train its Llama AI models.

According to Meta’s privacy policy, the platform collects a large amount of user data based on activity on its apps and products. This includes publicly available posts, comments, audio, photos and their captions, ads viewed or interacted with, apps and features used, hashtags used, purchases made on the app and transaction history, as well as the time, frequency, and duration spent on a Meta-owned app. These apps include WhatsApp, Facebook, Instagram, and the Meta AI assistant within them.

Some of this data, such as ads viewed, features used, and purchases made on the app, is commonly collected by most apps to improve the product, create a more convenient experience, or personalise ads. However, some of what Meta collects quickly veers into a morally grey area. For example, the social media giant also collects off-product data about the user.

In its privacy policy, the company states, “We collect and receive information from partners, measurement vendors, marketing vendors and other third parties about a variety of your information and activities on and off our Products.” This includes device information, websites visited, apps used, purchases and transactions made off-product, demographic data, and more. In an older blog post, Meta also explained how it uses user data to train its AI models.

Following the DPC's intervention, Meta will likely delay collecting data to train its AI in the UK. In Europe, the European Union’s General Data Protection Regulation (GDPR) and other privacy-focused laws have already caused indefinite delays to the launch of Meta AI in the region. However, the social media giant will continue to collect data for its usual purposes.

For users living outside the UK or the EU, there is not much that can be done, as the platform will collect their public data and interactions with Meta AI. However, users can still take a couple of steps to minimise the data the company has access to.

First, users can ask Meta to delete any third-party and off-product data it has collected about them. To do this, they have to fill out a form titled “Data subject rights for third-party information used for AI at Meta”, available on Meta's website; it takes only a couple of minutes to complete.

Second, those who use Instagram and Facebook can make their accounts private to protect any future posts. On Facebook, users can go to Profile > Settings > Tools and resources > Privacy Checkup > Who can see what you share. From there, users can manually change the privacy settings for different activities.

On Instagram, users can go to Profile > Settings and activity (the three horizontal lines in the top-right corner) > Who can see your content > Account privacy > Private account. Toggle this on to make the account private.

Despite these measures, Meta can still access older public posts, and there is nothing users can do about them. Likewise, if a private profile comments on a public post, the company can still access that comment.
