
How to Use AWS S3 Securely: Setup Guide

source link: https://www.varonis.com/blog/how-to-use-aws-s3/

Amazon led the way into Infrastructure-as-a-Service (IaaS) with the introduction of Amazon Web Services (AWS). Netflix, NASA, and the U.S. Navy all use Amazon as their backend. AWS S3 is the object storage service in AWS, and it has been the culprit in more than a few major data breaches.

In this blog we will explain the basics of AWS S3 and discuss how to secure the system to prevent cybersecurity incidents.

What is AWS S3 (Simple Storage Service)?

AWS S3 is one of the core services in the AWS infrastructure. Conceptually, it’s equivalent to an infinitely large remote file server or FTP server.

Amazon S3 stores uploaded data as objects within buckets. S3 is organized around buckets and objects rather than file servers and directories.

An object can be a document or a video, along with some metadata describing it. Buckets are the containers for objects. Admins can set up and manage access to each bucket (who can create, delete, and list items in the bucket), view access logs for it and its objects, and choose the geographic region where Amazon S3 will store the bucket and its contents.

Amazon S3 Setup Storage Types

Amazon has developed Amazon S3 as a highly durable and flexible solution that offers numerous storage options to meet individual customer requirements. These include:

  • Standard: Used to store performance-sensitive data that needs millisecond retrieval times.
  • Standard-Infrequent Access: Used to store data that is accessed infrequently.
  • One Zone-Infrequent Access: Used for objects that are rarely accessed and require less durability. It costs less than the other storage classes.
  • Amazon Glacier: Used for archived data storage.

Where to use AWS S3?

There are a variety of use cases for AWS S3; see some examples below.

Internet Storage

Amazon S3 is perfect for storing application images and videos and serving them with fast performance. Amazon’s own services (including Amazon Prime and Amazon.com), as well as Netflix and Airbnb, use Amazon S3 for this purpose.

Disaster Recovery and Backup

Amazon S3 is well suited for storing and archiving highly sensitive data and backups. Data can be distributed across regions automatically, offering the highest possible availability and durability. Amazon S3 Versioning makes it easier to restore files or older copies. With Amazon S3, you are unlikely to lose data, and you can keep your recovery point objective (RPO) and recovery time objective (RTO) low.

Analytics

Amazon S3 offers advanced in-place querying capabilities for highly efficient analytics of S3 data. It removes the need to move or duplicate data, since most analytics services and third-party tools integrate with S3 directly.

Data Archiving

Users can store data in Amazon S3 and transfer it to Amazon Glacier, a very cheap and durable compliance archiving solution. They can also automate archiving with lifecycle policies, which manage data with less effort.

Static Website Hosting

Amazon S3 stores different static objects. One important use case is hosting static websites. More and more web apps are becoming single-page and static (Angular, ReactJS, etc.), where running a web server is expensive. S3 provides a static website hosting feature that allows you to use your domain without incurring massive hosting costs.
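
As a rough illustration, the snippet below uses boto3 (the AWS SDK for Python) to turn an existing bucket into a static website host; the bucket name and document names are hypothetical placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket that already holds the built static site (index.html, error.html).
BUCKET = "example-static-site-bucket"

# Serve index.html as the default document and error.html for missing keys.
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# The site is then reachable at the bucket's website endpoint, e.g.
# http://example-static-site-bucket.s3-website-<region>.amazonaws.com
```

The site objects themselves still need to be readable by visitors, for example through a public-read bucket policy or a CloudFront distribution in front of the bucket.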

Security and Compliance

Amazon S3 offers several standard encryption and compliance features for PCI-DSS, HIPAA/HITECH, FedRAMP, FISMA, and others. These features help customers meet the compliance requirements of almost every regulatory agency worldwide. S3 also makes it easy to restrict access to sensitive data with bucket policies.

How to use AWS S3?

All S3 data resides in buckets, whose names live in a global namespace; within a bucket, key prefixes act like directories and subfolders. Pick a region when creating a bucket to optimize latency and reduce data access costs. To set up Amazon S3, follow the instructions below:

Create S3 Bucket

Follow the steps below to create an S3 bucket.

1. Sign Up

Sign up for the AWS Management Console (or sign in if you already have an account). After you sign in, the console home page appears.

2. Search for “S3”

Search for “S3” in the search bar and click on it. You will land on the AWS S3 dashboard.

3. Click on “Create bucket”

Click on the “Create bucket” button. The bucket creation form opens.

4. Name the Bucket

Enter a name for the bucket.

There are many ways to set up S3 bucket permissions. Permissions are private by default but can be modified using the AWS Management Console or a bucket policy. As a security best practice, be selective when granting access to the S3 buckets you create. Only add essential permissions and avoid opening buckets to the public.

5. Configure Options (Optional)

You can pick the features that you want to activate on a specific bucket, such as:

  • Tags: You can tag a bucket with a key and a value, making it easier to search for resources by tag.
  • Versioning: Keeps track of all versions of a file, making it easy to recover the file after accidental deletion.
  • Object-level logging (advanced setting): Activate this feature if you want to record every operation on every item in your bucket.
  • Default encryption: When enabled, AWS encrypts objects with AES-256 using S3-managed keys by default, but you can use your own managed key to encrypt objects instead.
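
These options can also be set programmatically after the bucket exists. A minimal boto3 sketch, assuming a hypothetical bucket name and tag values, that turns on tagging, versioning, and default encryption:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-secure-bucket-123"  # hypothetical, already-created bucket

# Tag the bucket so it is easy to find by key/value.
s3.put_bucket_tagging(
    Bucket=BUCKET,
    Tagging={"TagSet": [{"Key": "project", "Value": "demo"}]},
)

# Keep every version of each object so accidental deletions can be undone.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Default encryption: AES-256 with S3-managed keys (swap in a KMS key to use your own).
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)
```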

6. Create Bucket

Finally, click on “Create bucket”

The AWS bucket is created.

Note: The newly created bucket and its objects aren’t public.
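
For teams that prefer scripting, the same bucket creation can be done with boto3. A minimal sketch, assuming a hypothetical, globally unique bucket name and the eu-west-1 region:

```python
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

# Bucket names are global, so this placeholder must be unique across all of AWS.
s3.create_bucket(
    Bucket="example-secure-bucket-123",
    # Required for every region except us-east-1.
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)

# Newly created buckets and their objects are private by default.
```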

How to Upload Files in the Created S3 Bucket?

To learn how to upload files in the created S3 bucket, see the steps below:

1. Click on Bucket Name


2. Click “Upload”


3. Click “Add Files”

Add the desired files from your drive.

4. Click on the “Upload” Button


The upload status is shown on the screen.

The document is now uploaded to the newly created bucket.
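
The console upload also has a simple scripted equivalent. A minimal boto3 sketch, with hypothetical local file, bucket, and key names:

```python
import boto3

s3 = boto3.client("s3")

# Upload a local file into the bucket under the key "docs/report.pdf".
s3.upload_file(
    Filename="report.pdf",
    Bucket="example-secure-bucket-123",
    Key="docs/report.pdf",
)

# Confirm the object landed where we expect it.
head = s3.head_object(Bucket="example-secure-bucket-123", Key="docs/report.pdf")
print(head["ContentLength"], "bytes uploaded")
```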

How to Access AWS S3 Bucket Data?

In order to access AWS S3 bucket data, you’ll need to follow each of these steps.

1. Click on File


2. Access the Object URL

The request is denied: by default, we are not allowed to access the bucket objects.

To solve this, we need to adjust the permissions on the bucket.

3. Head to “Bucket Permission”


4. Click “Edit” and Remove the Check From “Block All Public Access”


5. Click “Save”

6. Make Uploaded File Public


7. Now, the Object URL is Accessible
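
Making objects public, as in the walkthrough above, is only appropriate for genuinely public content. A safer pattern for everything else is a presigned URL, which grants temporary access to a single object without loosening any bucket or object permissions. A minimal boto3 sketch with hypothetical names:

```python
import boto3

s3 = boto3.client("s3")

# Generate a URL that lets anyone holding it read one object for 15 minutes,
# without making the bucket or the object public.
url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": "example-secure-bucket-123", "Key": "docs/report.pdf"},
    ExpiresIn=900,  # seconds
)
print(url)
```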

Important Concepts:

  • Bucket names share a universal namespace, i.e., each bucket name must be globally unique.
  • If an object uploads successfully to the S3 bucket, an HTTP 200 code is returned.
  • S3, S3-IA, and S3 Reduced Redundancy Storage are examples of storage classes.
  • Encryption is of two types: Server-Side Encryption and Client-Side Encryption.
  • Access to bucket management is either through ACL (Access Control List) or bucket policies.
  • By default, buckets are private, and all objects stored in a bucket are private.

How to Configure AWS S3 Securely?

With regard to AWS security concerns, the most vulnerable service is undeniably S3. Misconfigured S3 buckets have led to massive data breaches involving major organizations such as FedEx, Verizon, Dow Jones, and even WWE. Such breaches were avoidable, because AWS is highly secure when properly configured.

Let’s discuss the following AWS S3 security best practices to improve the protection of your S3 buckets:

Tip 1: Securing Data Through S3 Security Encryption

Encryption is an essential step in securing your data. S3 provides two ways to secure your data at rest:

  • Server-Side Encryption (SSE): With this form of encryption, AWS encrypts your data and stores it on disk in its data centers. When you retrieve a file, AWS reads it from disk, decrypts it, and sends it back to you.
  • Client-Side Encryption (CSE): With this form of encryption, you (rather than AWS) encrypt the data before sending it to AWS, and you decrypt it yourself after AWS sends the data back.

You may select a choice that matches your needs based on your security and compliance requirements. You can opt for Server-Side Encryption when you are fine with AWS handling the encryption process. If your data is sensitive and you’d like to encrypt it yourself, opt for Client-Side Encryption.

The example below shows how SSE-S3 can protect data in an S3 bucket:

  1. Create an S3 bucket from the AWS S3 dashboard.
  2. After the bucket is created, upload your data to the bucket.
  3. Next, click on the uploaded object to see its encryption properties. Encryption is disabled by default.
  4. Click on the Edit button beside it.
  5. Select “Enable” and choose “Amazon S3 key (SSE-S3)”. This sets server-side encryption with AES-256. Hit Save changes.

Within seconds, the new encryption type is reflected on the screen, and files in the bucket are now securely encrypted.
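
The same protection can be requested programmatically for individual objects at upload time. A minimal boto3 sketch, using a hypothetical bucket and key:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-secure-bucket-123"  # hypothetical

# Ask S3 to encrypt this object at rest with SSE-S3 (AES-256) at upload time.
s3.put_object(
    Bucket=BUCKET,
    Key="docs/secret.txt",
    Body=b"sensitive payload",
    ServerSideEncryption="AES256",
)

# Verify which encryption was applied.
head = s3.head_object(Bucket=BUCKET, Key="docs/secret.txt")
print(head["ServerSideEncryption"])  # -> "AES256"
```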

Tip 2: Access Control Manager

Access control is the most critical pillar for further enhancing data protection. We have listed five options for managing access to S3 buckets and resources. Let us analyze each method of access control to help you build an effective AWS S3 security mechanism:

1. IAM User Permit Restriction

Identity and Access Management (IAM) allows fine-grained access control. By implementing the principle of least privilege, admins can grant users only the minimal access and resources needed to manage buckets or read and write data. This minimizes the risk of human error, one of the top causes of misconfigured S3 buckets and data leakage.

As a rule, start with a minimum number of permissions required and gradually add permissions as needed.
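
To make least privilege concrete, the sketch below creates a read-only IAM policy scoped to a single bucket using boto3; the bucket name, policy name, and ARNs are hypothetical:

```python
import json
import boto3

iam = boto3.client("iam")

# Least-privilege policy: read-only access to one bucket and its objects.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::example-secure-bucket-123",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-secure-bucket-123/*",
        },
    ],
}

# Create the managed policy; attach it only to the users or roles that need it.
iam.create_policy(
    PolicyName="ExampleBucketReadOnly",
    PolicyDocument=json.dumps(policy),
)
```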

2. Limiting S3 Security Access Using Bucket Policies

Bucket policies are similar to IAM user policies, except that they are attached directly to S3 resources. Bucket policies are versatile and allow fine-grained control over bucket access.

There are several typical scenarios where a bucket policy is the right tool; one of them, granting another AWS account read-only access to a bucket’s objects, is sketched below.
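
A minimal boto3 sketch, with a hypothetical bucket name, account ID, and role ARN:

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-secure-bucket-123"  # hypothetical

# Typical scenario: let a specific (hypothetical) role in another AWS account
# read objects from this bucket, and nothing else.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCrossAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:role/partner-read-role"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }
    ],
}

# Note: put_bucket_policy replaces any existing bucket policy.
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(bucket_policy))
```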

3. Using S3 Access Points to Assign Access Policies

Amazon launched S3 Access Points at re:Invent 2019 in Las Vegas, a feature that improves access control for mixed-use S3 buckets and makes bucket policies easier to manage.

Before S3 Access Points, a single bucket policy had to govern all data within a bucket, often with varying permissions. S3 Access Points provide a more modern way of managing data at scale.

How Do S3 Access Points Work?

S3 Access Points have unique hostnames and their own access policies that define how data is handled through that endpoint. Access point policies are similar to bucket policies, except that they apply only to the access point. An S3 Access Point can also be limited to a Virtual Private Cloud (VPC), which effectively firewalls the S3 data within that private network. Because each access point has its own DNS name, it is also easier to address your buckets.
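
As a rough sketch, an access point restricted to a VPC can be created through the S3 Control API in boto3; the account ID, access point name, bucket, and VPC ID below are hypothetical:

```python
import boto3

s3control = boto3.client("s3control")
ACCOUNT_ID = "111122223333"  # hypothetical AWS account ID

# Create an access point that only accepts requests from inside one VPC.
s3control.create_access_point(
    AccountId=ACCOUNT_ID,
    Name="analytics-ap",                                   # hypothetical access point name
    Bucket="example-secure-bucket-123",                    # hypothetical bucket
    VpcConfiguration={"VpcId": "vpc-0abc1234def567890"},   # hypothetical VPC
)

# Each access point gets its own hostname and its own access point policy,
# so teams can be granted access through "their" endpoint only.
```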

4. Use ACLs to Control Access

ACLs (Access Control Lists) were the original method of controlling access to S3, before AWS IAM became popular. Misconfigured ACLs are one of the main reasons S3 data leakage is so widespread.

ACLs can be applied either at the bucket level or at the object level. Simply put, bucket ACLs provide access control at the bucket level, while object ACLs provide access control at the object level. By default, ACLs grant access only to the account owner, but it is very easy to accidentally make buckets publicly available, which is why AWS advises against using them.

5. Using Amazon S3 Block Public Access

Lastly, Amazon provides a centralized way to limit public access to S3 resources. Through the Amazon S3 Block Public Access setting, you can override any previously set bucket policies and object permissions. Note that Block Public Access settings can be applied to buckets, access points, and entire AWS accounts.
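
A minimal boto3 sketch that applies all four Block Public Access settings to a hypothetical bucket:

```python
import boto3

s3 = boto3.client("s3")

# Turn on every Block Public Access setting for this (hypothetical) bucket.
s3.put_public_access_block(
    Bucket="example-secure-bucket-123",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```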

Tip 3: Maximize S3 Security and Reliability With Replication

Companies can improve S3 security and reliability by having a data protection policy aimed at maximizing resilience. Let’s analyze five such AWS S3 security best practice strategies:

1. Build Copies of Data

This is the most common approach since it strengthens data security. Using AWS Backup service, which supports most AWS services like Amazon EFS, DynamoDB, RDS, EBS, and Storage Gateway, you can centralize and automate backup processes.

2. Choosing Availability Levels

As S3 resources are available at various levels of availability, use IA storage for low priority workloads, and then switch towards a better service class when the IT workloads need higher availability. This ensures optimization based on workload requirements as overinvestment in storage is expensive.

3. Use S3 Versioning

Data also faces significant threats from disasters and infrastructure failures. By using S3 versioning to restore lost files, you can avoid tricky and tedious backup recovery processes. With versioning enabled, S3 preserves every version of a bucket object whenever a PUT, COPY, POST, or DELETE request is performed on it.
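
Once versioning is enabled on a bucket (see the earlier sketch), previous versions remain retrievable. The boto3 sketch below, using hypothetical names, lists an object’s versions and restores an older one by copying it back over the current version:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-secure-bucket-123"  # hypothetical
KEY = "docs/report.pdf"

# List all stored versions of one object (newest first).
versions = s3.list_object_versions(Bucket=BUCKET, Prefix=KEY)
for v in versions.get("Versions", []):
    print(v["VersionId"], v["LastModified"], "latest" if v["IsLatest"] else "")

# Restore the oldest version (as an example) by copying it back on top of the current one.
old_version_id = versions["Versions"][-1]["VersionId"]
s3.copy_object(
    Bucket=BUCKET,
    Key=KEY,
    CopySource={"Bucket": BUCKET, "Key": KEY, "VersionId": old_version_id},
)
```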

4. Use Cross-Region Replication (CRR)

Use the Cross-Region Replication (CRR) feature to eliminate a single point of failure and increase data availability. In addition to availability, CRR also helps meet regulatory requirements when you need to store data in multiple geographic locations.
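
Setting up replication requires versioning on both buckets and an IAM role that S3 can assume. A rough boto3 sketch, with hypothetical bucket names and role ARN:

```python
import boto3

s3 = boto3.client("s3")

# Both buckets must already have versioning enabled, and the role must allow
# S3 to read the source and write into the destination (all names hypothetical).
s3.put_bucket_replication(
    Bucket="example-secure-bucket-123",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-everything",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {"Prefix": ""},  # apply to all objects
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::example-secure-bucket-replica-eu"
                },
            }
        ],
    },
)
```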

5. Use Same-Region Replication (SRR)

SRR is an excellent choice if regulatory requirements mandate that data be stored locally, in the same region. AWS uses a built-in data replication feature to duplicate an S3 bucket across storage devices in three physically separate Availability Zones within the region. This protects data and maintains durability in case of infrastructure failure or disaster.

Tip 4: Enforce SSL For S3 Security

SSL is the preferred way to secure communication with S3 buckets. By default, S3 bucket data can be accessed over either HTTP or HTTPS, which means an attacker could potentially mount a man-in-the-middle (MITM) attack against your S3 requests.
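
A common way to enforce HTTPS is a bucket policy that denies any request arriving over plain HTTP, using the aws:SecureTransport condition key. A minimal boto3 sketch with a hypothetical bucket name (note that put_bucket_policy replaces any existing policy):

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-secure-bucket-123"  # hypothetical

# Deny every request to this bucket that does not arrive over HTTPS.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```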

Tip 5: Use Logging

S3 bucket access logging is a feature that gathers information on all requests made to a bucket, such as PUT, GET, and DELETE requests. This empowers the security team to detect attempted malicious activity inside your buckets.

Logging is a recommended safety best practice that can help teams comply with regulatory requirements, detect unauthorized access to their data, or initiate a data breach investigation.
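
Server access logging can be switched on with a single boto3 call, assuming a separate log bucket that already exists and accepts S3 log delivery; both bucket names below are hypothetical:

```python
import boto3

s3 = boto3.client("s3")

# Deliver access logs for the source bucket into a separate, private log bucket.
s3.put_bucket_logging(
    Bucket="example-secure-bucket-123",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "example-access-logs-bucket",
            "TargetPrefix": "s3-access-logs/example-secure-bucket-123/",
        }
    },
)
```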

Tip 6: Use S3 Object Locking

S3 Object Locking makes it very difficult to remove data from S3. Malicious actors inflict harm to organizational data primarily in two ways:

  • By deleting data assets
  • By stealing data

S3 Object Locking addresses the former by preventing an object from being deleted or overwritten. It effectively makes the S3 object immutable in one of two ways: by setting a retention period or by placing a legal hold on the object until the hold is released.

S3 Object Lock also allows you to meet write-once-read-many (WORM) regulatory requirements or to add an extra protective layer purely for compliance purposes.

Use the following steps to enable Object Lock when creating an S3 bucket:

  1. In the AWS Management Console, head to S3 under Storage and click on Create bucket.
  2. After entering the name of the bucket, you will find the Object Lock checkbox under Advanced Settings.
  3. Click the Enable checkbox, and a message confirms that Object Lock is enabled. You can now click Next and continue building your bucket.

Note: Object Lock only functions when versioning is active.
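
For a scripted setup, the sketch below creates a hypothetical bucket with Object Lock enabled (which also turns on versioning) and applies a default 30-day retention rule with boto3:

```python
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")
BUCKET = "example-worm-archive-bucket"  # hypothetical

# Object Lock can only be enabled at bucket creation time.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
    ObjectLockEnabledForBucket=True,  # also turns on versioning
)

# Default rule: every new object is locked against deletion/overwrite for 30 days.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "GOVERNANCE", "Days": 30}},
    },
)
```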

Properly managing and configuring your AWS S3 buckets is vital to maintaining a secure infrastructure, no matter where the servers are hosted. These steps in the guide are just the beginning. A comprehensive cybersecurity strategy also needs consistent monitoring for abnormal data access patterns and signs of potential compromise. Varonis acquired Polyrize to better meet this challenge of securing the new cloud frontier.

Click here to schedule a call with our cybersecurity experts to discuss how Varonis can help you manage your cloud infrastructure.

