A Gentle Introduction to Cloud Computing in Python with AWS

Jan 18, 2021 · 8 min read

Featured on Hashnode


Cloud computing is among the latest buzzwords these days. So let's talk about it.

But what does it mean? What is "The Cloud"? ☁️

Time to burst some bubbles. 😬

The Cloud is simply some computer sitting across the Internet available to be used as a service. It comes with a bunch of software and tools for computing. This includes creating and managing servers, storage, networking, and pre-built software services.

That's all.

There are many Cloud Service Providers (CSPs) including Microsoft (Azure), Amazon (AWS), Google Cloud, Alibaba Cloud, IBM, and more.

Amazon Web Services (AWS) is one of the biggest players in cloud services, so we'll be using it in this guide.

They offer high-level services and tools that make software development a breeze.

In this guide, we are going to look at how to access and take advantage of these services using our favorite programming language - Python.

Let's get right into it.

Create an AWS Account

The first thing you need is an AWS account. Click the link to get a free account: aws.amazon.com/free

Follow the process to create an account. Once fully done and verified, move on to the next step.

Hint: you may need to link your credit card for verification

Done? Good. Let's continue.

Setup Credentials and Access

To test our brand new account, we are going to use Amazon S3, Amazon's object storage service. Here's how it's described by Amazon:

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance.

Log into your AWS account to access your dashboard.


Click on the "IAM" service to manage access to AWS resources

Add users to your account

Click on the "Users" tab. When it loads, click on the "Add Users" button.

Enter your "user name" and select "Programmatic Access" as your Access Type, then click Next to go to permissions.

Next, create a group.

Fill in the group form and add the policies that users within the group can access.

Enter your "group name" and search for policies to add to the group. In this case, we are adding the "EC2" and "S3" policies.

Next, select the group you just created so the user picks up the policies added above.

Adding tags is optional, so we'll skip them. Click Next to continue. You should now be on the summary page. Here's how mine looks:

Next, download your credentials file, which contains your AWS Access Key and Secret Key. Keep it safe; this is the only time AWS lets you download the secret key.

Woooh! We're done with the setup. Time to reap the fruits of our "cloud" labor.

Playing with Amazon S3 Storage

Amazon offers developers a beautiful Python SDK ~ Boto3

SDK is short for Software Development Kit. As the name implies, SDKs allow us to build apps on a platform, in a programming language we understand, without interacting with the platform's raw interface directly. They are one notch above APIs and help reduce the learning curve needed to use a platform. Learn more.

Boto3 provides both a client and a resource API. We'll be using the client API because it maps more closely to the underlying AWS service APIs.

Ok. Time for coding some stuff!

1. Create an isolated dev environment.

virtualenv venv && source venv/bin/activate

2. Install Boto3

pip install boto3

3. Install the AWS CLI

Here's a recommended guide to doing that based on your operating system: docs.aws.amazon.com/cli/latest/userguide/in..

Install the latest version (currently version 2). Confirm your installation with:

aws --version

Done? Good. Let's continue.

4. Set up your credentials on your machine.

Remember the .csv file we downloaded in the last section? Time to put it to good use. You can view your credentials with any CSV reader; for easy parsing of .csv files within my editor, I used a VSCode extension. Download here

On your terminal, run the following command and enter your keys (plus a default region and output format) when prompted:

aws configure
AWS Access Key ID [None]: YOUR_ACCESS_KEY
AWS Secret Access Key [None]: YOUR_SECRET_KEY
Default region name [None]: us-east-1
Default output format [None]: json

This stores your credentials in the ~/.aws/credentials file, where Boto3 picks them up automatically.

5. Create an S3 bucket

Fire up your editor or jupyter notebook to crunch some code.

You must have valid AWS credentials configured to authenticate your requests, or your bucket won't be created.

Note that NOT every string is acceptable as a bucket name.
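The core rules from AWS's bucket naming documentation: 3-63 characters; lowercase letters, digits, hyphens, and dots only; starting and ending with a letter or digit. Here's a rough sanity check you could run before creating a bucket (the helper name is my own, and it doesn't cover every rule, e.g. names formatted like IP addresses):

```python
import re

def looks_like_valid_bucket_name(name: str) -> bool:
    """Rough check against the core S3 bucket naming rules:
    3-63 chars, lowercase letters/digits/hyphens/dots,
    starting and ending with a letter or digit."""
    return re.fullmatch(r'[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]', name) is not None

print(looks_like_valid_bucket_name('my-demo-bucket'))  # True
print(looks_like_valid_bucket_name('My_Bucket'))       # False: uppercase and underscore
```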


In production settings, you may want to pay attention to latency. By default, buckets are created in the US East region. You should create your buckets in the region closest to your users. For example, if your users are mainly in Europe, configure your region as Europe (Ireland). This can reduce your costs too. Learn more
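As a sketch of what region-aware creation looks like: create_bucket takes a CreateBucketConfiguration argument for every region except us-east-1, where it must be omitted. A small helper to build the right keyword arguments (the helper itself is my own construction, not part of Boto3):

```python
def create_bucket_kwargs(bucket_name: str, region: str) -> dict:
    """Build keyword arguments for s3_client.create_bucket().

    us-east-1 is the default region and must NOT be passed as a
    LocationConstraint; every other region must be.
    """
    kwargs = {'Bucket': bucket_name}
    if region != 'us-east-1':
        kwargs['CreateBucketConfiguration'] = {'LocationConstraint': region}
    return kwargs

print(create_bucket_kwargs('my-demo-bucket', 'eu-west-1'))
```

You would then call, for example, s3.create_bucket(**create_bucket_kwargs('my-demo-bucket', 'eu-west-1')).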

Instantiate a Boto3 client

import boto3

s3 = boto3.client('s3')
s3_bucket = s3.create_bucket(ACL='public-read', Bucket='your-bucket-name')

print(s3_bucket)  # to confirm your bucket was created

6. Upload files to Amazon S3

Create a text file called 'sample.txt' in your current directory and fill it with random dummy text.

# continuing with our client instance from above

s3.upload_file('sample.txt', 'your-bucket-name', 'sample')
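upload_file also accepts an ExtraArgs dict, for instance to set the object's Content-Type so browsers render it correctly. A small sketch using Python's standard mimetypes module (the helper name and the fallback value are my own choices):

```python
import mimetypes

def s3_extra_args(filename: str) -> dict:
    """Build an ExtraArgs dict for upload_file with a guessed Content-Type."""
    content_type, _ = mimetypes.guess_type(filename)
    return {'ContentType': content_type or 'binary/octet-stream'}

print(s3_extra_args('sample.txt'))  # {'ContentType': 'text/plain'}
```

You would then upload with, for example, s3.upload_file('sample.txt', 'your-bucket-name', 'sample', ExtraArgs=s3_extra_args('sample.txt')).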

7. Get file in our S3 Bucket

sample_file = s3.get_object(Bucket='your-bucket-name', Key='sample')
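get_object returns a dict whose 'Body' field is a streaming, file-like object; to actually see the text, you read and decode it. A tiny helper (the name is mine), demonstrated here on a simulated response since the real one needs a live bucket:

```python
import io

def read_s3_text(response: dict) -> str:
    """Read the 'Body' stream of an s3.get_object() response as UTF-8 text."""
    return response['Body'].read().decode('utf-8')

# With a real response you would call: read_s3_text(sample_file)
# Simulated here with an in-memory stream, since 'Body' behaves
# like a binary file object:
fake_response = {'Body': io.BytesIO(b'hello from s3')}
print(read_s3_text(fake_response))  # hello from s3
```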

And lastly...

8. Download a file from our S3 Bucket

with open('sample_downloaded.txt', 'wb') as data:
    s3.download_fileobj('your-bucket-name', 'sample', data)
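To double-check the round trip, you can list what's in the bucket with the client's list_objects_v2 call. Here's a small helper (mine, not Boto3's) that pulls the key names out of the response; note that the response omits 'Contents' entirely when the bucket is empty:

```python
def object_keys(response: dict) -> list:
    """Extract object key names from an s3.list_objects_v2() response."""
    # 'Contents' is absent when the bucket has no objects
    return [obj['Key'] for obj in response.get('Contents', [])]

# With a real client: object_keys(s3.list_objects_v2(Bucket='your-bucket-name'))
# Simulated response in the shape the S3 API returns:
fake_response = {'Contents': [{'Key': 'sample', 'Size': 42}]}
print(object_keys(fake_response))  # ['sample']
print(object_keys({}))             # [] for an empty bucket
```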

That's it.

The interfaces (client and resource) offered by Boto3 make it easy to communicate with AWS.

We didn't get to take an in-depth look at AWS in this guide. The purpose of this material is to get you started as quickly as possible. To get a deeper understanding of the AWS SDK, I recommend a proper look at the Boto3 documentation.

Watch out for more guides on building complex cloud solutions with AWS in the future.

If you'd like to chat and talk about tech in general, feel free to send me a message on Twitter or in the comment section below.