Microsoft Copilot Chat exposes confidential emails of users: Report
Microsoft 365 Copilot Chat reportedly allowed surfacing of confidential emails from users' Drafts and Sent folders due to a configuration error
Microsoft has reportedly admitted to a flaw in its AI-powered workplace assistant, Copilot Chat, that led to some users’ confidential emails being accessed and summarised unintentionally. According to a report by the BBC, Copilot Chat was able to pull content from emails stored in a user’s Drafts and Sent Items folders, including messages that were labelled as confidential. For the uninitiated, Microsoft 365 Copilot Chat is the company’s generative AI tool integrated into apps such as Outlook and Teams.
Microsoft typically positions Copilot as a secure, enterprise-ready assistant designed to help employees summarise emails, draft responses and retrieve information from within their organisation’s systems.
What went down
The problem was initially highlighted by tech publication Bleeping Computer, which reported seeing a service alert referencing the issue. According to details cited in that report, Copilot Chat had been incorrectly processing emails marked with sensitivity labels, despite data loss prevention policies being configured to restrict such content.
Reports suggest Microsoft became aware of the issue in January. A related notice also appeared on an NHS England IT support dashboard, attributing the root cause to a code-related error. While the notice implied NHS systems were affected, the BBC reported that the organisation said that any processed draft or sent emails remained accessible only to their original authors and that patient data had not been exposed.
Microsoft says it has identified and addressed the issue
The BBC cited Microsoft as saying it had “identified and addressed” the problem. As per the report, the company clarified that while its underlying access controls and data protection policies remained in place, the behaviour did not align with how Copilot is supposed to function. It added that the assistant is designed to exclude protected content from its responses, even if the user technically has permission to view it.
Microsoft has since reportedly deployed a configuration update for enterprise customers worldwide. The company also stressed that the bug did not grant users access to information beyond what they were already authorised to see.
First Published: Feb 20 2026 | 4:38 PM IST