Microsoft Copilot Studio had security issues that could have allowed threat actors to extract sensitive data from vulnerable endpoints, experts have warned.
Cybersecurity researcher Evan Grant of Tenable found and reported the vulnerability, which is described as an information disclosure flaw resulting from a server-side request forgery (SSRF) attack and is tracked as CVE-2024-38206 with a severity score of 8.5.
Copilot Studio is an end-to-end conversational AI platform that allows users to create and customize copilots using natural language or a graphical interface.
Microsoft fixes the bug
Describing the flaw, Grant said it abuses a Copilot Studio feature that allows the tool to make external web requests.
“Combined with a useful SSRF protection bypass, we used this flaw to gain access to Microsoft's internal infrastructure for Copilot Studio, including the Instance Metadata Service (IMDS) and internal Cosmos DB instances,” Grant said.
In simple terms, Grant extracted instance metadata through Copilot chat messages and used it to obtain access tokens for managed identities. These, in turn, allowed him to reach other internal resources, including read and write access to a Cosmos DB instance.
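To give a sense of what that step typically involves, the sketch below shows the documented way an Azure managed-identity token is requested from the Instance Metadata Service. The endpoint, header, and parameters are standard Azure IMDS conventions; routing such a request through Copilot Studio's web-request feature to reach the link-local address is the part specific to the attack and is not reproduced here, and the target resource shown is purely illustrative.

```python
# Illustrative sketch only: the general shape of an Azure IMDS request for a
# managed-identity access token. This is not the researcher's actual payload.
import requests

IMDS_TOKEN_URL = "http://169.254.169.254/metadata/identity/oauth2/token"

resp = requests.get(
    IMDS_TOKEN_URL,
    params={
        "api-version": "2018-02-01",
        # Resource the token should be scoped to (illustrative choice).
        "resource": "https://management.azure.com/",
    },
    # IMDS only answers requests that carry this header, a basic safeguard
    # against naive SSRF attempts.
    headers={"Metadata": "true"},
    timeout=5,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
print(access_token[:20] + "...")
```

Only code running inside Azure (or, as in this case, a request smuggled into Microsoft's own infrastructure) can reach that link-local address, which is why a successful SSRF against it is so valuable to an attacker.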
“An authenticated attacker may be able to bypass server-side request forgery (SSRF) protection in Microsoft Copilot Studio to exfiltrate sensitive information over a network,” Microsoft said in an advisory acknowledging the bug. Users don't need to take any action, as Microsoft has addressed the issue on its end.
While the flaw allowed attackers to access sensitive data, it did not grant them access to information across multiple tenants, Grant concluded. Still, because Copilot Studio's infrastructure is shared among tenants, elevated access to that infrastructure could in theory affect multiple customers.
Microsoft Copilot Studio is part of Microsoft’s Copilot initiative, which integrates AI-powered tools into its software suite. Announced in 2023, Copilot Studio enables organizations and developers to tailor Copilot’s behavior to their specific needs.
Via The Hacker News