In a blog post on Monday, Wiz researchers Hillai Ben-Sasson and Ronny Greenberg said a disk back-up was among the exposed data, and that it included secrets, private keys, passwords, and more than 30,000 internal Microsoft Teams messages.
Wiz, a company set up by former Microsoft engineers, recently published a deep dive into an Azure cloud breach suffered by Microsoft, revealing several problematic issues at the core of the intrusion.
The Monday post was about an incident that occurred on 22 June. Wiz said Microsoft had shut down the exposed storage account two days later.
"The access level can be limited to specific files only; however, in this case, the link was configured to share the entire storage account — including another 38TB of private files," the two Wiz researchers noted.
"This case is an example of the new risks organisations face when starting to leverage the power of AI more broadly, as more of their engineers now work with massive amounts of training data.
"As data scientists and engineers race to bring new AI solutions to production, the massive amounts of data they handle require additional security checks and safeguards."
Ben-Sasson and Greenberg said they had come across the accidentally exposed data while scanning for misconfigured storage containers.
"In this process, we found a GitHub repository under the Microsoft organisation named robust-models-transfer," they wrote. "The repository belongs to Microsoft’s AI research division, and its purpose is to provide open-source code and AI models for image recognition."
Instructions in the repository directed users to download the data from an Azure Storage URL.
But due to the misconfiguration, the URL provided access to the entire storage account, not just the open-source models.
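The scope and permissions of an Azure SAS link are encoded in the URL's query parameters, so a misconfiguration of this kind can be spotted by inspecting them. The sketch below, using only Python's standard library, parses a made-up URL in the style described; the account name, token values, and signature are invented for illustration and are not the real leaked token:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical account-level SAS URL; all values here are illustrative.
sas_url = (
    "https://exampleaccount.blob.core.windows.net/models"
    "?sv=2020-08-04&ss=b&srt=sco&sp=rwdl"
    "&se=2051-10-06T00:00:00Z&sig=REDACTED"
)

params = parse_qs(urlparse(sas_url).query)

# srt (signed resource types): "sco" = service + container + object,
# i.e. the token is valid for every container and blob in the account,
# not just the files the link was meant to share.
print("resource types:", params["srt"][0])

# sp (signed permissions): "rwdl" = read, write, delete, list --
# far more than the read-only access a download link needs.
print("permissions:", params["sp"][0])
```

A narrowly scoped link would instead carry something like `sr=b` (a single blob) or a container-level token with `sp=r`, limiting both what the token reaches and what it can do.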
"Our scan shows that this account contained 38TB of additional data – including Microsoft employees’ personal computer back-ups," the Wiz pair wrote.
"The back-ups contained sensitive personal data, including passwords to Microsoft services, secret keys, and over 30,000 internal Microsoft Teams messages from 359 Microsoft employees."
Apart from the overly permissive access scope, the token was also misconfigured to allow an attacker to delete or overwrite existing files.
The access level of an SAS token can be customised by the user who creates it, and such tokens pose a security risk because data can end up shared with external, unidentified identities, Ben-Sasson and Greenberg said.
They also pointed out that SAS tokens pose an expiry problem: there is no upper limit on a token's lifetime. In this case, the Microsoft token was set to expire in 2051.
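One way organisations mitigate this, sketched below with Python's standard library, is to check a token's `se` (signed expiry) field against an internal maximum-lifetime policy. The dates and the seven-day policy are assumptions for illustration, not values from the Wiz report:

```python
from datetime import datetime, timezone, timedelta

# "se" (signed expiry) value in the style of the token described above.
expiry = datetime.fromisoformat("2051-10-06T00:00:00+00:00")

# A reference date around the time of the disclosure (illustrative).
now = datetime(2023, 9, 18, tzinfo=timezone.utc)

# Example internal policy: no SAS token should live longer than a week.
max_lifetime = timedelta(days=7)

remaining = expiry - now
print(f"token valid for another {remaining.days} days")
if remaining > max_lifetime:
    print("policy violation: SAS token lifetime exceeds 7 days")
```

A token expiring in 2051 fails such a check by decades, which is why the researchers flag long-lived tokens as effectively permanent credentials.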
They provided advice to security practitioners on how to avoid such issues.