The Microsoft AI team accidentally exposed 38TB of the company’s private data, according to a report by cloud security firm Wiz.
The leaked data included complete backups of two employees’ computers, containing sensitive information such as passwords for Microsoft services, secret keys, and more than 30,000 internal Microsoft Teams messages from over 350 Microsoft employees.
Hillai Ben-Sasson, a cloud security researcher at Wiz, said: “We found a public AI repo on GitHub, exposing over 38TB of private files – including personal computer backups of Microsoft employees. How did it happen? A single misconfigured token in Azure Storage is all it takes”.
So, how did this happen? According to the report, the Microsoft AI team published open-source code and AI models for image recognition on GitHub, a platform for code collaboration. Visitors to the repository were given a link to Azure, Microsoft’s cloud storage service, to download the AI models.
The problem was that the link provided by Microsoft’s AI team gave visitors full access to the entire Azure storage account. This meant that visitors could not only see everything stored there but also upload, replace, or delete files.
Wiz explained that this occurred because of an Azure feature called Shared Access Signature (SAS) tokens, which are essentially signed URLs that grant access to Azure Storage data. These tokens can be configured to restrict which files can be accessed, which operations are allowed, and for how long. In this case, however, the link granted unrestricted access.
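To make the misconfiguration concrete, here is a minimal sketch of how a SAS token’s scope is visible in the URL itself. In a real SAS URL, the `sp` query parameter encodes the granted permissions (e.g. `r` for read-only, versus `racwdl` for read, add, create, write, delete, and list). The URLs and signature values below are hypothetical, for illustration only:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical SAS URLs; account names, paths, and "sig" values are made up.
# A narrowly scoped token: read-only ("sp=r") on a single blob ("sr=b"),
# with a near-term expiry ("se").
scoped_url = (
    "https://example.blob.core.windows.net/models/model.bin"
    "?sv=2020-08-04&se=2020-10-01T00%3A00%3A00Z&sr=b&sp=r&sig=FAKE"
)

# An overly broad token of the kind described in the report: full
# permissions ("sp=racwdl") on a container ("sr=c"), expiring decades out.
overly_broad_url = (
    "https://example.blob.core.windows.net/models"
    "?sv=2020-08-04&se=2051-10-01T00%3A00%3A00Z&sr=c&sp=racwdl&sig=FAKE"
)

def sas_permissions(url: str) -> str:
    """Extract the permission string ("sp") from a SAS URL's query string."""
    return parse_qs(urlparse(url).query).get("sp", [""])[0]

print(sas_permissions(scoped_url))        # prints "r"
print(sas_permissions(overly_broad_url))  # prints "racwdl"
```

Anyone holding the second kind of URL can write and delete data, not just read it, which is why auditing the `sp` and `se` fields of shared SAS links matters.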
What makes matters worse is that this data had been exposed since 2020, leaving it a potential security risk for roughly three years.
Wiz alerted Microsoft to the issue on June 22 of this year. Two days later, Microsoft invalidated the SAS token, closing the security gap. Microsoft then conducted an investigation, which concluded in August.
Microsoft has said that no customer data was exposed and that no other internal services were put at risk by the issue.