Reprompt attack enabled one-click data theft from Microsoft Copilot
Varonis researchers disclosed the Reprompt attack, a chained prompt injection technique that exfiltrated sensitive data from Microsoft Copilot Personal with a single click on a legitimate Copilot URL. The attack exploited the "q" URL parameter to inject instructions, bypassed data-leak guardrails by asking Copilot to repeat actions twice (safeguards only applied to initial requests), and used Copilot's Markdown rendering to silently send stolen data to an attacker-controlled server. No plugins or further user interaction were required, and the attacker maintained control even after the chat was closed. Microsoft patched the issue in its January 2026 security updates.
Incident Details
Perpetrator:AI assistant
Severity:Facepalm
Blast Radius:Microsoft Copilot Personal users exposed to profile data, conversation history, and file summary exfiltration via a single malicious link
Tech Stack
Microsoft Copilot Personal
References
The Hacker News: Researchers Reveal Reprompt Attack Allowing Single-Click Data Exfiltration From Microsoft Copilot ↗BleepingComputer: Reprompt attack hijacked Microsoft Copilot sessions for data theft ↗SecurityWeek: New Reprompt Attack Silently Siphons Microsoft Copilot Data ↗Malwarebytes: Reprompt attack lets attackers steal data from Microsoft Copilot ↗