Microsoft has taken legal action against a group it claims deliberately developed and used tools to bypass the safety guardrails of the company’s cloud AI products.
In a complaint filed by the company in December in the U.S. District Court for the Eastern District of Virginia, Microsoft alleges that a group of 10 unnamed defendants used stolen customer credentials and specially designed software to break into the Azure OpenAI Service, Microsoft’s fully managed service powered by technologies from ChatGPT maker OpenAI.
In the complaint, Microsoft accuses the defendants, whom it refers to only by legal pseudonyms, of violating the Computer Fraud and Abuse Act, the Digital Millennium Copyright Act, and federal racketeering statutes by illegally accessing and using Microsoft’s software and servers for the purpose of creating “offensive” and “harmful and illegal content.” Microsoft did not provide specific details about the offending content that was created.
The company seeks injunctive and “other equitable” relief and damages.
In the complaint, Microsoft says that in July 2024 it discovered that Azure OpenAI Service credentials belonging to customers, specifically API keys, the unique character strings used to authenticate an application or user, were being used to generate content that violated the service’s acceptable use policy. Upon further investigation, Microsoft found that the API keys had been stolen from paying customers, according to the complaint.
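For context, an Azure OpenAI Service API key is simply a secret string presented with each request: whoever holds it can call the service, and run up usage, as that customer. The sketch below is a minimal illustration of how such a key authenticates a call; the resource name, deployment name, and environment variable are hypothetical placeholders, not details from the complaint.

```python
# Minimal sketch of how an Azure OpenAI Service API key authenticates a request.
# The resource name, deployment name, and API version are hypothetical placeholders.
import os

import requests

endpoint = "https://example-resource.openai.azure.com"  # a customer's Azure OpenAI resource
url = f"{endpoint}/openai/deployments/example-deployment/chat/completions"

response = requests.post(
    url,
    params={"api-version": "2024-02-01"},
    # The key is the entire credential: whoever presents it is treated as that customer.
    headers={"api-key": os.environ["AZURE_OPENAI_API_KEY"]},
    json={"messages": [{"role": "user", "content": "Hello"}]},
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```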
“The exact method by which Defendants obtained all of the API Keys used to commit the misconduct described in this Complaint is unknown,” Microsoft’s complaint states, “but it appears that Defendants engaged in a pattern of systematic API Key theft that enabled them to steal Microsoft API Keys from multiple Microsoft customers.”
Microsoft alleges that the defendants used stolen Azure OpenAI Service API keys belonging to U.S.-based customers to create a “hacking-as-a-service” scheme. To pull off the scheme, the defendants created a client-side tool called de3u, as well as software to process and route communications from de3u to Microsoft systems, according to the complaint.
De3u allowed users to leverage stolen API keys to generate images with DALL-E, one of the OpenAI models available to Azure OpenAI Service customers, without having to write their own code, Microsoft claims. De3u also attempted to prevent the Azure OpenAI Service from revising the prompts used to generate images, as can happen when a text prompt contains words that trigger Microsoft’s content filter, according to the complaint.
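For illustration, the underlying request that a tool like de3u would have wrapped is an ordinary Azure OpenAI image-generation call. The sketch below assumes the official openai Python SDK and hypothetical endpoint, key, and DALL-E deployment values; it reproduces nothing specific to the defendants’ software.

```python
# Sketch of a standard DALL-E image-generation request to Azure OpenAI Service,
# the kind of call de3u is alleged to have issued on users' behalf.
# The endpoint, deployment name, and key source are hypothetical placeholders.
import os

from openai import AzureOpenAI  # official OpenAI Python SDK (>= 1.0)

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],  # the service bills and trusts whoever holds the key
    api_version="2024-02-01",
)

result = client.images.generate(
    model="example-dalle-deployment",  # the customer's DALL-E model deployment name
    prompt="a watercolor painting of a lighthouse at dusk",
    n=1,
    size="1024x1024",
)
print(result.data[0].url)  # URL where the generated image can be retrieved
```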
The repo containing the de3u project code, hosted on GitHub, a Microsoft-owned company, was no longer available at press time.
“These features, combined with defendants’ illegal programmatic API access to the Azure OpenAI service, allowed defendants to reverse engineer Microsoft’s content and abuse measures,” the complaint states. “The defendants knowingly and intentionally gained unauthorized access to the protected computers of the Azure OpenAI Service and caused damages and losses as a result of such conduct.”
In a blog post published Friday, Microsoft said the court has authorized it to seize a website “instrumental” to the defendants’ operation, which will allow the company to gather evidence, decipher how the defendants’ alleged services are monetized, and disrupt any additional technical infrastructure it finds.
Microsoft also says it has “taken countermeasures,” which the company did not specify, and “added additional security measures” to its Azure OpenAI Service targeting the activity it observed.