Microsoft has confirmed that a bug allowed its Copilot AI to summarize customers' confidential emails for weeks without permission.
The bug, first reported by BleepingComputer, allowed Copilot Chat to read and summarize the contents of emails since January, even when customers had data loss prevention policies in place to prevent their sensitive information from being ingested into Microsoft's large language model.
Copilot Chat lets paying Microsoft 365 customers use the AI-powered chat feature in its Office software products, including Word, Excel, and PowerPoint.
Microsoft said the bug, trackable by admins as CW1226324, means that draft and sent email messages "with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat."
The tech giant said it began rolling out a fix for the bug earlier in February. A spokesperson for Microsoft did not respond to a request for comment, including a question about how many customers were affected by the bug.
Earlier this week, the European Parliament's IT department told lawmakers that it had blocked built-in AI features on their work-issued devices, citing concerns that the AI tools could upload potentially confidential correspondence to the cloud.