How to Read File Content from an S3 Bucket with Boto3?
Last Updated: 23 Jul, 2025
AWS S3 (Simple Storage Service), a scalable and secure object storage service, is often the go-to solution for storing and retrieving any amount of data, at any time, from anywhere. Boto3 is the AWS Software Development Kit (SDK) for Python, which provides an object-oriented API for AWS infrastructure services. It allows Python developers to build applications on top of Amazon services.
Prerequisites
- AWS account: You must have an AWS account before starting this tutorial.
- S3 Bucket: You should have a bucket set up in S3 (Steps 1-7 below also walk through creating one).
- AWS CLI: You should have the AWS CLI set up on your local machine with access configured.
- Python and Boto3: You must have Python installed on your system, along with the boto3 and python-dotenv packages used in this tutorial (pip install boto3 python-dotenv).
Step-by-Step Guide to Reading File Content from an S3 Bucket
Steps to Create S3 Buckets and Upload Files and Folders
Step 1: Log in to the AWS console.
Step 2: After signing in, you will land on the AWS Management Console page; search for S3 as shown below.
Step 3: From the sidebar, go to Buckets and click Create bucket.
Step 4: Enter your bucket name; bucket names must be globally unique.
Step 5: Click Create bucket; the other settings can be left at their defaults.
Step 6: Go to the Buckets page again; it lists all your buckets.
Step 7: We will upload and read files from 'gfg-s3-test-bucket'. Open your bucket.
Step 8: Click the Upload button. You can also create a folder inside the bucket. Select Add files/Add folder to add them.
Step 9: Verify that the files/folders were added properly, then click Upload. The two test files used in this tutorial contain the following text:
Test.txt:
Test.txt is running
GFG Test

Test1.txt:
Test1.txt is running
Reading contents from file using boto3
Step 10: Once all the files are uploaded successfully, we can start reading them using Boto3.
Steps to Read Files or Folders Using Boto3
Step 1: Import all the necessary libraries. We use dotenv to load credentials from a .env file into environment variables.
import os
import boto3
from dotenv import load_dotenv
Step 2: Call load_dotenv() so the .env file is read, then create an S3 client, which provides all the methods needed to work with the bucket. The access key and secret access key are read from environment variables with os.getenv().
# Load environment variables from .env file
load_dotenv()

# Create S3 client
s3 = boto3.client(
    "s3",
    aws_access_key_id=os.getenv("ac_key"),
    aws_secret_access_key=os.getenv("sac_key"),
)
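The snippet above assumes a .env file in your project directory that defines the two variable names used in this tutorial (ac_key and sac_key are arbitrary names chosen here, not AWS conventions). A minimal example of what that file might look like:

# .env (example only; replace the placeholders with your own credentials)
ac_key=YOUR_AWS_ACCESS_KEY_ID
sac_key=YOUR_AWS_SECRET_ACCESS_KEY

Alternatively, if you have already configured the AWS CLI with aws configure, Boto3 picks up those credentials automatically, and boto3.client("s3") works without passing keys explicitly.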
Step 3: Store the bucket name in a variable.
# Store bucket name
bucket_name = "gfg-s3-test-bucket"
Step 4: List the objects in the bucket using the list_objects_v2() method and extract the 'Contents' list, which holds each object's metadata.
# Store contents of bucket
objects_list = s3.list_objects_v2(Bucket=bucket_name).get("Contents")
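One edge case worth noting: if the bucket is empty, the response has no 'Contents' key, so .get("Contents") returns None and the loop below would raise a TypeError. A defensive variant supplies an empty list as the default:

# Fall back to an empty list if the bucket has no objects
objects_list = s3.list_objects_v2(Bucket=bucket_name).get("Contents", [])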
Step 5: Iterate over the list of objects; Steps 6-9 below run inside this loop.
# Iterate over every object in bucket
for obj in objects_list:
Step 6: Store the object name from the 'Key' field of the object's metadata.
    # Store object name
    obj_name = obj["Key"]
Step 7: Fetch the object using get_object(), which takes the bucket name and the key (object name) and returns a dictionary.
    # Read an object from the bucket
    response = s3.get_object(Bucket=bucket_name, Key=obj_name)
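Besides 'Body', the dictionary returned by get_object() also includes object metadata such as ContentLength and ContentType, which can be useful for a quick sanity check before reading the data:

    # Optional: inspect metadata returned alongside the data
    print(response["ContentLength"])  # object size in bytes
    print(response["ContentType"])    # MIME type, for example "text/plain"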
Step 8: Read the object's data from the 'Body' field of the response and decode it from bytes to a string.
    # Read the object's content as text
    object_content = response["Body"].read().decode("utf-8")
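Keep in mind that .decode("utf-8") assumes the object is UTF-8 text; calling it on a binary object such as an image would raise a UnicodeDecodeError. A rough sketch that could replace the line above, using the file extension as a simple (assumed) text-versus-binary heuristic:

    # Only decode objects we expect to be text; summarize the rest
    raw_bytes = response["Body"].read()
    if obj_name.endswith(".txt"):
        object_content = raw_bytes.decode("utf-8")
    else:
        object_content = f"<{len(raw_bytes)} bytes of binary data>"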
Step 9: Finally, print the contents of each file.
    # Print all the contents
    print(f"Contents of {obj_name}\n--------------")
    print(object_content, end="\n\n")
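If the credentials lack permission, or an object is deleted between listing and reading, these calls raise botocore's ClientError rather than returning an error value. A minimal sketch of guarding the read, assuming the same client and variables as above:

from botocore.exceptions import ClientError

# Inside the loop, guard the read against S3 errors
try:
    response = s3.get_object(Bucket=bucket_name, Key=obj_name)
except ClientError as err:
    # The error code (e.g. "NoSuchKey", "AccessDenied") is in the response dict
    print(f"Could not read {obj_name}: {err.response['Error']['Code']}")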
Here is the complete code to read file content from an S3 bucket with Boto3.
This Python script uses the Boto3 library to interact with AWS S3. It first loads AWS credentials from environment variables using the dotenv module, then creates an S3 client with those credentials. The script lists all objects in a specific S3 bucket, retrieves each object's content, decodes it from bytes to a readable string using UTF-8, and prints it to the console.
import os
import boto3
from dotenv import load_dotenv

# Load Environment Variables
load_dotenv()

# Create S3 client
s3 = boto3.client(
    "s3",
    aws_access_key_id=os.getenv("ac_key"),
    aws_secret_access_key=os.getenv("sac_key"),
)

# Store bucket name
bucket_name = "gfg-s3-test-bucket"

# Store contents of bucket
objects_list = s3.list_objects_v2(Bucket=bucket_name).get("Contents")

# Iterate over every object in bucket
for obj in objects_list:
    # Store object name
    obj_name = obj["Key"]

    # Read an object from the bucket
    response = s3.get_object(Bucket=bucket_name, Key=obj_name)

    # Read the object's content as text
    object_content = response["Body"].read().decode("utf-8")

    # Print all the contents
    print(f"Contents of {obj_name}\n--------------")
    print(object_content, end="\n\n")
Output:
Contents of Test.txt
--------------
Test.txt is running
GFG Test
Contents of Test1.txt
--------------
Test1.txt is running
Reading contents from file using boto3
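Note that list_objects_v2() returns at most 1,000 objects per call. For buckets that may grow beyond that, Boto3's built-in paginator handles the continuation tokens for you. A minimal sketch, reusing the s3 client and bucket_name from the script above:

# Iterate over every object, even past the 1,000-object page limit
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket_name):
    for obj in page.get("Contents", []):
        print(obj["Key"])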
Conclusion
Reading files from an AWS S3 bucket using Python and Boto3 is straightforward. With just a few lines of code, you can retrieve and work with data stored in S3, making it an invaluable tool for data scientists working with large datasets.