Meta’s AI messages on Instagram don’t seem to be encrypted


Before you go pouring your heart out to Billie, “your ride-or-die older sister” played by Kendall Jenner, or an AI grandpa named Brian on Instagram, know that your messages might not be private.

Meta’s AI personas, now live in beta, are a collection of characters — some played by celebrities and creators — that users can chat with on Messenger, Instagram, and WhatsApp. However, it appears that messages with these characters on Instagram are not end-to-end encrypted.

With end-to-end encryption off, the option to start an AI chat appears.
Credit: Screenshot: Mashable / Meta

With end-to-end encryption turned off, the option is no longer there.
Credit: Screenshot: Mashable / Meta

In the messages tab on Instagram, there’s a toggle at the top that lets you turn on end-to-end encryption, which protects your messages from unwanted eyes, including Meta and the government. But when this feature is toggled on, the option to start an AI chat disappears. If you open the info button (the circled “i” icon) within an AI chat, the “Use end-to-end encryption” option is grayed out, and tapping it brings up a window saying, “Some people can’t use end-to-end encryption yet.” It then states that you “can’t add them” — meaning the AI persona — to the chat. In other words, there is simply no way to have an end-to-end encrypted conversation with one of these personas on Instagram.

This window seems to confirm that Meta’s AI messages are not end-to-end encrypted.
Credit: Screenshot: Mashable / Meta


One of the major privacy concerns with the rise of generative AI is the massive amount of data that is collected — both to train the model and to give companies granular insights about their users. Meta already has a bad reputation with regard to personal data use. There was the Cambridge Analytica scandal, instances of Facebook turning over private conversations to law enforcement, and the way its algorithms leveraged personal data and behaviors to make its platforms addictive (and in some cases harmful), just to name a few. That track record suggests that Meta — or any social media company, to be fair — shouldn’t be trusted with your data.

When first trying out the AI messages feature in WhatsApp, you’re immediately given a pop-up disclaimer saying, “Meta may use your AI messages to improve AI quality. But your personal messages are never sent to Meta. They can’t be read and remain end-to-end encrypted.”

The disclaimer on WhatsApp says messages are end-to-end encrypted, but this has not been confirmed yet.
Credit: Screenshot: Mashable / Meta

This suggests that, while certain information about your messages can be accessed by AI (still not great for privacy), the content of the messages is private. But this is unconfirmed, especially given Meta’s vague generative AI privacy policy, which says, “When you chat with AI, Meta may use the messages you send to it to train the AI model, helping make the AIs better.”

Mashable has reached out to Meta to confirm that AI messages on Instagram are not end-to-end encrypted, and also to clarify whether the ones on WhatsApp and Messenger are. While we did not hear back before publication time, we’ll update this story if Meta responds.


Last spring, OpenAI launched an opt-out feature for ChatGPT, which gives users the option of blocking their data from being used to train the model. Other AI chatbots like Google Bard and Microsoft Bing don’t offer such opt-out features, although they do let you delete your activity. Meta’s generative AI privacy policy page describes a similar deletion option: typing /reset-ai in a conversation removes your data from that individual AI chat, while typing /reset-all-ais deletes your data from all AI chats across Meta’s apps.
