How data breaches forced Amazon to update S3 bucket security

Amazon launched its Simple Storage Service (better known as S3) back in 2006 as a platform for storing just about any type of data under the sun. Since then, S3 buckets have become one of the most commonly used cloud storage tools for everything from server logs to customer data, with prominent users including Netflix, Reddit, and GE Healthcare. While S3 rolled out of the gate with good security principles in mind, it hasn’t all been smooth sailing.

The issue of S3 bucket security has come to a head in recent years with prominent data breaches affecting companies like Uber, Accenture and even the United States Department of Defense. Nearly all of these breaches shared one common factor: the administrator in charge of managing cloud storage misconfigured security settings, leaving the buckets open to the public. You might be wondering how this keeps happening time and time again. Shouldn’t there be security defaults in place to stop these breaches?

To Amazon’s credit, S3 has always used a model where new buckets are private by default. Administrators control what level of access they allow to the public and to other authenticated AWS accounts. The problem is that it hasn’t always been easy to tell what level of access the public actually has to any given bucket. You can end up with a bucket configured with restricted permissions that nevertheless contains objects whose own access control lists override those restrictions with public access.
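If you want to see this ambiguity for yourself, the AWS SDK exposes both halves of the picture. Below is a minimal sketch in Python with boto3, using a hypothetical bucket name and object key, that asks S3 whether the bucket’s policy makes it public and then checks a single object’s ACL for grants to the AllUsers group, the combination that can quietly override an otherwise private bucket:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "example-logs-bucket"  # hypothetical bucket name

# 1. Does the bucket policy itself make the bucket public?
try:
    status = s3.get_bucket_policy_status(Bucket=BUCKET)
    print("Bucket policy public:", status["PolicyStatus"]["IsPublic"])
except ClientError as err:
    # "NoSuchBucketPolicy" simply means no policy is attached.
    print("No bucket policy attached:", err.response["Error"]["Code"])

# 2. Even under a private policy, an object's own ACL can grant public read.
acl = s3.get_object_acl(Bucket=BUCKET, Key="reports/summary.csv")  # hypothetical key
for grant in acl["Grants"]:
    if grant["Grantee"].get("URI", "").endswith("/global/AllUsers"):
        print("Object is publicly readable:", grant["Permission"])
```

(Note that GetBucketPolicyStatus only arrived alongside the 2018 changes discussed below, so for years administrators had no such one-call answer.)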

Back in 2017, Amazon introduced additional security features to help combat the growing wave of incidents caused by misconfigured S3 buckets. To start, they tagged all public buckets with a big orange notification in the S3 console. They also added settings like default encryption for all data uploaded to a bucket and detailed reporting to help identify misconfigurations. This clearly wasn’t enough, though: in the months after these changes, companies were still experiencing major data breaches from wide-open storage.
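Default encryption, for instance, is a single API call to enable. Here is a minimal boto3 sketch, with a hypothetical bucket name, that tells S3 to encrypt every new upload to the bucket with Amazon-managed keys (SSE-S3):

```python
import boto3

s3 = boto3.client("s3")

# Encrypt every new object uploaded to the bucket with SSE-S3 (AES-256).
s3.put_bucket_encryption(
    Bucket="example-logs-bucket",  # hypothetical bucket name
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```

Encryption at rest, though, protects against a different threat than the breaches above: it does nothing for a bucket that is readable by the public through its ACLs or policy.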

One of the major issues S3 users face involves all the overriding rules and access control lists that they add over time. While Amazon still blocks all public S3 access by default, users may occasionally need to temporarily allow public access to some bit of data. To facilitate this, an administrator might update their access control list to allow read access to the data with the intention of removing that rule later. Unfortunately, data access requirements change, and people are forgetful, which means there is a chance that rule might stick around for longer than intended, leaving the data accessible when it shouldn’t be.
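To make the failure mode concrete, here is roughly what that temporary grant looks like through boto3 (bucket and key hypothetical). Nothing in S3 expires the grant automatically; the second call is a manual cleanup step that someone has to remember to run:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-logs-bucket"       # hypothetical bucket name
KEY = "exports/partner-report.csv"   # hypothetical object key

# Step 1: open the object to the world, "just for now".
s3.put_object_acl(Bucket=BUCKET, Key=KEY, ACL="public-read")

# Step 2: the cleanup everyone intends to run later, and often forgets.
s3.put_object_acl(Bucket=BUCKET, Key=KEY, ACL="private")
```

A less error-prone alternative for one-off sharing is a presigned URL, which expires on its own instead of relying on anyone’s memory.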

It’s also possible for nested directories (in S3, key prefixes whose objects each carry their own permissions) to add further complexity to bucket access and security. Eventually, you might lose track of the fact that a subdirectory of a subdirectory storing sensitive logs is actually publicly accessible.
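One way to regain that visibility is a periodic sweep of every object’s ACL. The sketch below, again boto3 with a hypothetical bucket name, walks the entire bucket and flags any object readable by the AllUsers group, no matter how deeply nested its key is:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-logs-bucket"  # hypothetical bucket name
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

# List every object in the bucket, paginating past the 1,000-key limit.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        acl = s3.get_object_acl(Bucket=BUCKET, Key=obj["Key"])
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") == ALL_USERS:
                print(f"PUBLIC ({grant['Permission']}): {obj['Key']}")
```

Since this makes one ACL request per object, it’s better suited to a scheduled audit job than an interactive check on a large bucket.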

Amazon took this issue head-on in November 2018, when they introduced Block Public Access, an option to block all public access to every S3 bucket in an account. This effectively gives administrators a reset button to wipe the slate clean, overriding any and all custom rules with a single click.
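The reset button is also available programmatically, through the S3 Control API. Here is a minimal boto3 sketch (the account ID below is a placeholder) that flips all four Block Public Access switches for every bucket in the account:

```python
import boto3

s3control = boto3.client("s3control")

# Apply S3 Block Public Access account-wide, overriding any public ACLs
# or bucket policies that have accumulated over time.
s3control.put_public_access_block(
    AccountId="123456789012",  # placeholder: your 12-digit AWS account ID
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject requests that add public ACLs
        "IgnorePublicAcls": True,       # treat existing public ACLs as private
        "BlockPublicPolicy": True,      # reject new public bucket policies
        "RestrictPublicBuckets": True,  # cut off access via existing public policies
    },
)
```

The same configuration can be applied per bucket with the regular S3 API’s PutPublicAccessBlock, for teams that genuinely need a handful of public buckets.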


Sadly, these security updates only help if people know they exist and implement them properly. Thanks to user error and incomplete adoption, we will likely continue to see breaches involving misconfigured S3 buckets for as long as the service remains operational.

That said, if your company uses Amazon S3 (or really any cloud storage service), there are a few things you can do to make sure you don’t end up on that list of breaches:

  • First, take some time to thoroughly review your current storage permissions. Go through your existing storage buckets and make sure you don’t have any outdated rules that might allow unintended access to your data (a minimal sketch of such a sweep follows this list).
  • Second, follow best practices when setting up new cloud storage or managing your current setup. Amazon has a great guide for securing S3 specifically in their help system, as does Microsoft for Azure Storage.
  • And finally, take a step back and consider the type of data you are uploading to the cloud. Just because you can store something doesn’t mean you have a business need to keep it in the cloud. Reducing your data footprint not only lowers complexity, making it easier to spot misconfigurations, but also reduces the potential damage if you do become the victim of a breach.
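As a starting point for that review, here is a minimal boto3 sketch that sweeps every bucket in the account and reports whether Block Public Access is fully enabled and whether the bucket policy makes it public. (A missing Block Public Access configuration raises an error, which is itself a useful finding.)

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]

    # Is Block Public Access fully enabled on this bucket?
    try:
        resp = s3.get_public_access_block(Bucket=name)
        config = resp["PublicAccessBlockConfiguration"]
        fully_blocked = all(config.values())
    except ClientError:
        fully_blocked = False  # no Block Public Access configuration set

    # Does the bucket policy make the bucket public?
    try:
        policy = s3.get_bucket_policy_status(Bucket=name)["PolicyStatus"]
        policy_public = policy["IsPublic"]
    except ClientError:
        policy_public = False  # no bucket policy attached

    print(f"{name}: block_public_access={fully_blocked}, policy_public={policy_public}")
```

Anything that prints block_public_access=False or policy_public=True deserves a closer look with the object-level sweep shown earlier.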

Cloud storage is a great and usually cost-effective tool for storing company data, but it can cause major security problems for your business if you aren’t careful. Start with the three best practices above to minimize the risks, and whenever you’re in doubt, consult an expert in S3 (or whichever platform you use) before moving sensitive information into the cloud.
