
How to Write a File to AWS S3 Using Python Boto3

S3 is an object storage service provided by AWS. You may need to upload data or files to S3 when working with an AWS SageMaker notebook or a regular Jupyter notebook in Python.

The methods available to write a file to S3 covered in this post are:

  • Object.put()
  • upload_file()
  • client.put_object()

Prerequisites

  • Generate security credentials by clicking Your Profile Name -> My Security Credentials -> Access keys (access key ID and secret access key). These credentials are needed to create a session to your S3 bucket.
  • Understand the difference between the boto3 resource and the boto3 client. Object.put() and upload_file() are available from the boto3 resource, whereas put_object() is available from the boto3 client. A short sketch of the difference follows this list.
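As a quick illustration of that difference, below is a minimal sketch showing both entry points side by side. It assumes your credentials are already configured (otherwise create a boto3.Session first, as in the examples later in this post); the bucket name and key are placeholders.

import boto3

# Resource API: higher-level, object-oriented interface to S3
s3_resource = boto3.resource('s3')
obj = s3_resource.Object('<bucket_name>', 'file_name.txt')  # Object.put() and upload_file() live here

# Client API: lower-level, maps closely to the S3 REST operations
s3_client = boto3.client('s3')
# put_object() lives here, e.g.:
# s3_client.put_object(Body=b'...', Bucket='<bucket_name>', Key='file_name.txt')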

Installing Boto3

If you’ve not installed boto3 yet, you can install it by using the command below.

You can use the % symbol before pip to install packages directly from the Jupyter notebook instead of launching the Anaconda Prompt.

 

%pip install boto3

Boto3 will be installed successfully.

Now, you can use it to access AWS resources.
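To confirm the installation, you can import the package and print its version. Any reasonably recent release works for the examples in this post; the exact version number you see will vary.

import boto3

# Prints the installed boto3 version
print(boto3.__version__)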

Using Object.put()

You can use the Object.put() method available in the S3 object.

It supports two ways of writing to S3:

  • Writing text content to an S3 object
  • Writing the contents of a local file to an S3 object

Write Text Data To S3 Object

In this section, you’ll learn how to write normal text data to an S3 object.

Follow the below steps to write text data to an S3 Object.

  1. Create a Boto3 session using the security credentials
  2. With the session, create a resource object for the S3 service
  3. Create an S3 object using the s3.Object() method. It accepts two parameters: the bucket name and the File_Key. File_Key is the name you want to give to the S3 object. If you would like to create sub-folders inside the bucket, you can prefix them in this File_Key, for example subfolder/file_name.txt (see the short key-prefix sketch after this list)
  4. Create a text object that holds the text to be updated to the S3 object
  5. Use the put() action available on the S3 object and set the body to the text data, e.g. Body=txt_data. This will upload the data to the S3 bucket.
  6. The put() action returns response metadata. This metadata contains the HTTPStatusCode, which shows whether the file upload was successful. If the status code is 200, the file upload succeeded; otherwise, it failed.
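For step 3, here is a minimal sketch of using a key with a sub-folder prefix. It assumes s3 is the resource created from the session, as in the full example below, and the bucket name is a placeholder.

# The prefix before the slash shows up as a folder in the S3 console
obj = s3.Object('<bucket_name>', 'subfolder/file_name.txt')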

Note: Using this method will replace any existing S3 object with the same name. Hence, ensure you’re using a unique name for this object.

 

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

obj = s3.Object('<bucket_name>', 'file_name.txt')

txt_data = b'This is the content of the file uploaded from python boto3'

result = obj.put(Body=txt_data)

res = result.get('ResponseMetadata')

if res.get('HTTPStatusCode') == 200:
    print('File Uploaded Successfully')
else:
    print('File Not Uploaded')

Output

File Uploaded Successfully

This is how you can write text data to an S3 object using Boto3.
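If you want to double-check the upload, a minimal sketch (reusing the obj variable from the example above) is to read the object back and inspect its content:

content = obj.get()['Body'].read()
print(content)  # b'This is the content of the file uploaded from python boto3'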

Reading A File From Local And Uploading It To S3

In this section, you’ll learn how to read a file from the local system and upload it to an S3 object.

It is similar to the steps explained in the previous section, except for one step.

You just need to open the file in binary mode and pass its contents to the put() method, as shown below.

Use forward slashes in the file path. A plain backslash inside a normal Python string is treated as an escape character, so a path like 'E:\temp\testfile.txt' can break (use a raw string such as r'E:\temp\testfile.txt' if you prefer backslashes).
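If you would rather not think about separators at all, a small sketch using pathlib handles them for you (obj here is the S3 object created as in the full example below):

from pathlib import Path

# open() accepts Path objects, and pathlib normalises the separators
file_path = Path('E:/temp/testfile.txt')
result = obj.put(Body=open(file_path, 'rb'))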


Note: Using this method will replace any existing S3 object with the same name. Hence, ensure you’re using a unique name for this object.

result = obj.put(Body=open('E:/temp/testfile.txt', 'rb'))

 

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

obj = s3.Object('<bucket_name>', 'file_name.txt')

result = obj.put(Body=open('E:/temp/testfile.txt', 'rb'))

res = result.get('ResponseMetadata')

if res.get('HTTPStatusCode') == 200:
    print('File Uploaded Successfully')
else:
    print('File Not Uploaded')

You can check whether the file was uploaded successfully using the HTTPStatusCode available in the ResponseMetadata.

Output

File Uploaded Successfully

This is how you can write the data from the text file to an S3 object using Boto3.
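One caveat with open() as used above: the file handle is never closed explicitly. A minimal variant (same placeholder path, and the same obj as above) that closes the file automatically is:

with open('E:/temp/testfile.txt', 'rb') as f:
    result = obj.put(Body=f)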

Using upload_file()

In this section, you’ll learn how to use the upload_file() method to upload a file to an S3 bucket. It is available from the boto3 resource.

Follow the below steps to use the upload_file() action to upload the file to the S3 bucket.

  1. Create a boto3 session
  2. Create a resource object for S3
  3. Access the bucket in the S3 resource using the s3.Bucket() method and invoke the upload_file() method to upload the file
  4. The upload_file() method accepts two parameters.
    • Filename – Path of the file on the local system that needs to be uploaded. Use forward slashes when you specify the path
    • Key – Name for the S3 object that will be created by uploading this file

Unlike the other methods, the upload_file() method doesn’t return any response metadata to check the result; it returns None. You can use the other methods to check whether the object is available in the bucket (a small verification sketch follows the example below).

Note: Using this method will replace any existing S3 object with the same name. Hence, ensure you’re using a unique name for this object.


 

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

result = s3.Bucket('<bucket_name>').upload_file('E:/temp/testfile.txt','file_name.txt')

print(result)

The file is uploaded successfully, but you’ll only see the status as None because upload_file() doesn’t return a value.

Output

None

This is how you can use the upload_file() method to upload files to the S3 buckets.
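Since upload_file() returns nothing, one way to verify the upload (a minimal sketch reusing the s3 resource and the placeholders above) is to load the object’s metadata; load() sends a HEAD request and raises an error if the key doesn’t exist:

from botocore.exceptions import ClientError

try:
    s3.Object('<bucket_name>', 'file_name.txt').load()  # HEAD request on the object
    print('File Uploaded Successfully')
except ClientError:
    print('File Not Uploaded')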

Using client.put_object()

In this section, you’ll learn how to use the put_object method from the boto3 client.

Follow the below steps to use the client.put_object() method to upload a file as an S3 object.

  1. Create a boto3 session using your AWS security credentials
  2. Create a resource object for S3
  3. Get the client from the S3 resource using s3.meta.client
  4. Invoke the put_object() method from the client. It accepts three main parameters.
    • Body – The content for the S3 object. You can pass the text directly, or pass a file object opened with open('E:/temp/testfile.txt', 'rb')
    • Bucket – Name of the bucket where the object will be created
    • Key – Name for the new object that will be created

put_object() also returns ResponseMetadata, which contains the status code that tells you whether the upload was successful.

 

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating S3 Resource From the Session.
s3 = session.resource('s3')

result = s3.meta.client.put_object(Body='Text Contents', Bucket='<bucket_name>', Key='filename.txt')

res = result.get('ResponseMetadata')

if res.get('HTTPStatusCode') == 200:
    print('File Uploaded Successfully')
else:
    print('File Not Uploaded')

A new S3 object will be created and the contents of the file will be uploaded.

Output

File Uploaded Successfully

This is how you can use the put_object() method available in the boto3 S3 client to upload files to the S3 bucket.
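You don’t have to go through the resource to get a client. A minimal alternative sketch (same placeholder credentials, bucket, and key) creates the client directly from the session:

import boto3

#Creating Session With Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

#Creating the S3 Client directly from the Session.
s3_client = session.client('s3')

result = s3_client.put_object(Body='Text Contents', Bucket='<bucket_name>', Key='filename.txt')

print(result['ResponseMetadata']['HTTPStatusCode'])  # 200 on success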
