The token used by Microsoft not only accidentally allowed access to additional storage through an overly broad access scope, but it also carried misconfigurations granting "full control" permissions instead of read-only, enabling a potential attacker not just to view the private files but to delete or overwrite existing files as well.
In Azure, a SAS token is a signed URL granting customizable access to Azure Storage data, with permissions ranging from read-only to full control. It can cover a single file, a container, or an entire storage account, and the creator can set an optional expiration time, even setting it to never expire.
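To make the mechanics concrete, here is a minimal sketch, using the azure-storage-blob Python package and hypothetical account, container, and key names, of how a narrowly scoped SAS URL is minted: read-only, limited to one blob, expiring in an hour.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Hypothetical values; substitute real account details.
ACCOUNT = "examplestorageacct"
ACCOUNT_KEY = "<storage-account-key>"

# Sign a SAS covering a single blob, read-only, valid for one hour.
sas = generate_blob_sas(
    account_name=ACCOUNT,
    container_name="models",
    blob_name="model.ckpt",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# Anyone holding this URL gets exactly the signed permissions, no login required.
url = f"https://{ACCOUNT}.blob.core.windows.net/models/model.ckpt?{sas}"
print(url)
```

Every knob in that call, the permission set, the scope, and the expiry, is chosen by the token's creator, which is precisely the flexibility that went wrong in Microsoft's case.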
The full-access configuration "is particularly interesting considering the repository's original purpose: providing AI models for use in training code," Wiz said. The format of the model data file meant for downloading is ckpt, a format produced by the TensorFlow library. "It's formatted using Python's Pickle formatter, which is prone to arbitrary code execution by design. Meaning, an attacker could have (also) injected malicious code into all the AI models in this storage account," Wiz added.
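A minimal illustration of why Pickle-backed formats are dangerous by design: unpickling runs whatever callable the payload names, no exploit required. The command below is a harmless stand-in.

```python
import os
import pickle


class MaliciousCheckpoint:
    # pickle consults __reduce__ to learn how to rebuild an object;
    # returning (callable, args) makes pickle.loads() invoke that callable.
    def __reduce__(self):
        return (os.system, ("echo code executed at load time",))


payload = pickle.dumps(MaliciousCheckpoint())

# Simply loading the "model file" runs the embedded command.
pickle.loads(payload)
```

A tampered ckpt file in the exposed storage account would behave the same way the moment a victim loaded it.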
SAS tokens are difficult to manage
The granularity of SAS tokens opens up the risk of granting too much access. In the Microsoft GitHub case, the token allowed full control permissions, on the entire account, forever.
Microsoft's repository used an Account SAS token, one of three types of SAS tokens; the other two, Service SAS and User Delegation SAS, allow service (application) and user access, respectively.
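For comparison, a User Delegation SAS is signed with an Azure AD credential instead of the storage account key, so it can be revoked and audited through the identity layer. A rough sketch, assuming the azure-identity and azure-storage-blob packages and the same hypothetical names as above:

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobSasPermissions,
    BlobServiceClient,
    generate_blob_sas,
)

ACCOUNT = "examplestorageacct"  # hypothetical
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT}.blob.core.windows.net",
    credential=DefaultAzureCredential(),  # Azure AD identity, not the account key
)

# The delegation key itself is time-boxed by Azure AD.
now = datetime.now(timezone.utc)
delegation_key = service.get_user_delegation_key(
    key_start_time=now,
    key_expiry_time=now + timedelta(hours=1),
)

sas = generate_blob_sas(
    account_name=ACCOUNT,
    container_name="models",
    blob_name="model.ckpt",
    user_delegation_key=delegation_key,  # signed with the AD key
    permission=BlobSasPermissions(read=True),
    expiry=now + timedelta(hours=1),
)
```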
Account SAS tokens are extremely risky, as they are vulnerable in terms of permissions, hygiene, management, and monitoring, Wiz noted. Permissions on a SAS token can grant high-level access to a storage account, either through excessive permissions or through wide access scopes.
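To see how little friction stands in the way, here is a sketch of the risky pattern itself: an Account SAS with sweeping permissions, account-wide scope, and a decade-long expiry. Names are hypothetical again, and this is the shape of token to avoid, not a recommendation.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# The anti-pattern: full control, whole account, effectively no expiry.
risky_sas = generate_account_sas(
    account_name="examplestorageacct",   # hypothetical
    account_key="<storage-account-key>",
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True,
        add=True, create=True, update=True,
    ),
    expiry=datetime.now(timezone.utc) + timedelta(days=3650),
)
```

Nothing in the signing step warns that the scope is too broad, and because an Account SAS is computed client-side from the account key, Azure keeps no record that the token was ever issued, which is why hygiene and monitoring are weak points.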