Microsoft’s Latest Bug Exposes Confidential Emails to Copilot AI
In a significant privacy lapse, Microsoft has disclosed a software flaw that allowed its Copilot AI to access and summarize confidential customer emails without authorization for several weeks. The issue, first reported by Bleeping Computer, had been active since January, raising concerns about the company's data protection measures.
Copilot Chat, an AI assistant built into Microsoft 365 applications such as Word, Excel, and PowerPoint, was affected by the flaw, tracked as CW1226324. Even for customers who had implemented data loss prevention (DLP) policies to safeguard their information, draft and sent emails carrying confidentiality labels were incorrectly processed by the AI.
Microsoft began rolling out a fix earlier in February. However, the company has yet to say how many customers were affected, and a representative did not respond to requests for clarification.
As a precaution, the European Parliament's IT department announced that it had disabled built-in AI features on employee devices, citing concerns that these tools could inadvertently upload sensitive correspondence to cloud servers.
Key Highlights:
– The flaw allowed Copilot to access confidential customer emails without authorization starting in January.
– Copilot Chat is built into Microsoft 365 applications, including Word, Excel, and PowerPoint.
– Microsoft is actively distributing a fix but has not disclosed the extent of the issue’s impact.
– European Parliament has disabled AI features due to security concerns.
