Meta’s removal of end-to-end encryption from Instagram direct messages, effective May 8, 2026, raises fundamental questions about the responsibility that platforms bear toward their users. The change, disclosed only through a quiet update to a help page, makes it essential to understand the nature and limits of that responsibility.
Encryption on Instagram was introduced in 2023 as an opt-in feature following Zuckerberg’s 2019 commitment. The feature represented one dimension of a platform’s responsibility to its users: to provide tools that enable private communication. Its removal eliminates that dimension without replacing it with anything equivalent.
After May 8, Meta will have access to all Instagram DMs. The company’s responsibility now shifts: it must demonstrate that this data is handled with the care that users’ private communications deserve. Meta has not yet made clear commitments about how it will exercise this responsibility.
Law enforcement agencies, including the FBI, Interpol, and national bodies in Australia and the UK, had pushed for the change, and child safety advocates backed their position. Australia’s eSafety commissioner noted that platforms bear responsibility for safety regardless of their encryption choices; in Australia, the feature was reportedly deactivated ahead of the global deadline.
Digital Rights Watch argued that platform responsibility must extend beyond safety to include privacy, transparency, and user consent. Tom Sulston maintained that Meta’s responsibility to its users does not end with the removal of encryption: the company must now be held accountable for how it handles the private data to which it has gained access.