Day 43: S3 Programmatic access with AWS-CLI 💻 📁


Amazon Simple Storage Service (Amazon S3) is a highly scalable and durable object storage service provided by Amazon Web Services (AWS). It is designed to store and retrieve any amount of data from anywhere on the web, making it a fundamental building block for a wide range of cloud applications and services.

Here are some key features and characteristics of Amazon S3:

  1. Object Storage: S3 is an object storage service, which means it stores data as objects rather than as files in a hierarchical file system. Each object consists of data, a unique key (or identifier), and metadata.

  2. Durability and Availability: Amazon S3 is known for its high durability and availability. Data stored in S3 is distributed across multiple data centers within an AWS Region, providing redundancy and resilience. S3 is designed for 99.999999999% (11 nines) of data durability.

  3. Scalability: S3 scales automatically to accommodate growing amounts of data. You can store virtually unlimited data in S3, and it is suitable for both small-scale applications and large enterprises.

  4. Security: S3 provides various security features, including access control through AWS Identity and Access Management (IAM), bucket policies, and access control lists (ACLs). You can encrypt data at rest using server-side encryption (SSE) or manage encryption keys yourself with customer-managed keys (a CLI sketch for this appears right after this list).

  5. Data Lifecycle Management: You can define data lifecycle policies to automatically transition objects to different storage classes or delete them after a specified period. This helps optimize storage costs.

  6. Versioning: S3 supports versioning, allowing you to preserve, retrieve, and restore every version of every object stored in a bucket. This is useful for data protection and compliance.

  7. Data Transfer Acceleration: Amazon S3 Transfer Acceleration allows you to upload and download objects to/from S3 faster by using Amazon CloudFront's globally distributed edge locations.

  8. Static Website Hosting: You can use S3 to host static websites, making it an affordable and scalable solution for serving web content.

  9. Data Analytics: S3 integrates with various AWS analytics and big data services, such as Amazon Athena, Amazon Redshift Spectrum, and AWS Glue, to enable powerful data processing and analysis.

  10. Content Distribution: S3 can be used in conjunction with Amazon CloudFront (AWS's Content Delivery Network) to distribute content globally, reducing latency for end users.

  11. Data Import/Export: AWS offers services like AWS Snowball and AWS DataSync for efficient and secure data transfer to and from S3, especially for large-scale data migrations.
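As a quick illustration of the security point above, default server-side encryption can be turned on per bucket from the CLI. This is a minimal sketch; the bucket name is a placeholder:

# Enable default SSE-S3 encryption on a bucket (bucket name is a placeholder)
aws s3api put-bucket-encryption \
    --bucket my-example-bucket \
    --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'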

Concepts in S3

Buckets: S3 organizes data into containers called buckets. Each bucket has a globally unique name and serves as a logical container for objects.

Objects: Objects are the fundamental entities stored in S3. They consist of the data you want to store and associated metadata.

Keys: Keys are unique identifiers for objects within a bucket. They represent the object’s path and can include prefixes and subdirectories to organize objects within a bucket.

S3 Versioning: S3 supports versioning, which enables you to store multiple versions of an object. This feature helps in tracking changes and recovering from accidental deletions or modifications.

Version ID: In Amazon S3, when versioning is enabled for a bucket, each object can have multiple versions. Each version of an object is assigned a unique identifier called a Version ID. The Version ID is a string that uniquely identifies a specific version of an object within a bucket.
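To make this concrete, here is how versioning could be enabled and inspected from the CLI. This is a sketch using the bucket name from this demo:

# Turn on versioning for the bucket
aws s3api put-bucket-versioning --bucket day43task --versioning-configuration Status=Enabled

# List every object version along with its Version ID
aws s3api list-object-versions --bucket day43task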

Bucket policy: A bucket policy in Amazon S3 is a set of rules that define the permissions and access controls for a specific S3 bucket. It allows you to manage access to your S3 bucket at a more granular level than the permissions granted by IAM (Identity and Access Management) policies.
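For example, a bucket policy is just a JSON document attached to the bucket. The policy below is purely illustrative (allowing public read of objects is rarely what you want in practice):

# Write an example policy and attach it to the demo bucket
cat > policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "PublicReadGetObject",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::day43task/*"
  }]
}
EOF
aws s3api put-bucket-policy --bucket day43task --policy file://policy.json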

S3 Access Points: S3 Access Points provide a way to easily manage access to your S3 buckets. Each access point acts as a unique hostname and entry point that applications use to interact with a specific bucket, or with a prefix within a bucket.
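As a sketch, an access point can be created from the CLI (the account ID below is a placeholder):

# Create an access point attached to the demo bucket (account ID is a placeholder)
aws s3control create-access-point --account-id 111122223333 --name demo-access-point --bucket day43task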

Access control lists (ACLs): ACLs (Access Control Lists) in Amazon S3 are a legacy method of managing access control for objects within S3 buckets. While bucket policies and IAM policies are the recommended methods for access control in S3, ACLs can still be used for fine-grained control in specific scenarios.

Regions: S3 is available in different geographic regions worldwide. When you create a bucket, you select the region where it will be stored. Each region operates independently and provides data durability and low latency within its region.

You can access Amazon S3 and its features only in the AWS Regions that are enabled for your account.
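For example, the region is chosen when the bucket is created and can be checked afterwards. A sketch, with placeholder bucket name and region:

# Create a bucket in a specific region, then confirm its location
aws s3 mb s3://my-example-bucket --region ap-south-1
aws s3api get-bucket-location --bucket my-example-bucket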

How does Amazon S3 work?

  1. You create a bucket. A bucket is like a folder that holds your data.

  2. You upload your data to the bucket. You can upload files of any size, and you can even upload folders and subfolders.

  3. You can access your data from anywhere. You can use the Amazon S3 website, the AWS Command-Line Interface (CLI), or any other application that supports Amazon S3.
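Those three steps map directly onto CLI commands. A minimal sketch, assuming the AWS CLI is installed and credentials are configured (bucket and file names are placeholders):

aws s3 mb s3://my-example-bucket            # 1. create a bucket
aws s3 cp notes.txt s3://my-example-bucket/ # 2. upload data to it
aws s3 ls s3://my-example-bucket            # 3. access it from anywhere the CLI runs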

Task-01:

  • Launch an EC2 instance using the AWS Management Console and connect to it using Secure Shell (SSH).

  • Create an S3 bucket and upload a file to it using the AWS Management Console.

  • Access the file from the EC2 instance using the AWS Command Line Interface (AWS CLI).

Launch an EC2 instance using the AWS Management Console and connect to it using Secure Shell (SSH).

Connect to it using SSH:
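A typical SSH command looks like this (the key file name and public IP are placeholders; ubuntu is the default user on Ubuntu AMIs):

ssh -i day43-key.pem ubuntu@<instance-public-ip>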

Create an S3 bucket and upload a file to it using the AWS Management Console.

Search S3 in the AWS Management Console.

Click on Create Bucket.

Bucket name: day43task

Upload some data in the bucket.

To check the S3 buckets from the EC2 instance:

  1. SSH into the Ubuntu machine so you can use the AWS CLI from there.

  2. Run the command aws s3 ls to list all buckets, and aws s3 ls s3://bucket-name to list the objects inside a bucket, as shown below.
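With the bucket from this demo, that looks like the following (this assumes the instance can reach S3, either via credentials set up with aws configure or an attached IAM role):

aws s3 ls                  # list all buckets in the account
aws s3 ls s3://day43task   # list the objects inside the bucket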

Let us sync the contents of the local folder with our bucket.

touch file{1..10}.txt
ls
# aws s3 sync <local-folder> s3://bucket-name
aws s3 sync . s3://day43task

The sync can be verified in the AWS Management Console, where the uploaded files now appear in the bucket.

To list the objects in an S3 bucket:

# aws s3 ls s3://bucket-name
aws s3 ls s3://day43task

To delete an object from an S3 bucket:

# aws s3 rm s3://bucket-name/file.txt
aws s3 rm s3://day43task/file1.txt
aws s3 ls s3://day43task

To create a new bucket:

aws s3 mb s3://bucket-name
aws s3 ls

To delete a bucket from S3:

aws s3 rb s3://bucket-name
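Note that rb only succeeds on an empty bucket. To delete a bucket along with everything in it, add the --force flag:

aws s3 rb s3://bucket-name --force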

Task-02:

Create a snapshot of the EC2 instance and use it to launch a new EC2 instance. Download a file from the S3 bucket using the AWS CLI. Verify that the contents of the file are the same on both EC2 instances.

Steps -

  1. Go to EC2 > Snapshots > Create Snapshot and create a snapshot of the instance's volume.

  2. Once the snapshot is created, create an image (AMI) from it.

  3. Select the AMI > Actions > Launch Instance from AMI to launch the new EC2 instance.

  4. Connect to both the old and new EC2 machines and validate that both can see the same data in the S3 bucket.

  5. On each machine, run the command aws s3 cp s3://day43-task-bucket/test_file.txt . to download the file from the bucket, then compare the contents (see the sketch below).
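A simple way to verify the file is identical on both instances is to compare checksums after downloading. A sketch, using the bucket and file names from this task:

# Run on each instance after downloading the file
aws s3 cp s3://day43-task-bucket/test_file.txt .
md5sum test_file.txt   # the hash should match on both instances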

And that’s how we can access and modify S3 and its contents through the CLI.

In this blog, I have discussed S3 programmatic access with the AWS CLI. If you have any questions or experiences to share, please comment below. You can also connect with me on LinkedIn; my name there is Akash Singh. I'm open to suggestions and corrections to improve my blog.

#Day43 #90daysofdevops

Thank you for reading! Stay connected ☁️👩‍💻🌈

Contact me at:

LinkedIn: linkedin.com/in/akash-singh-48689a176

