Create Bucket Policy in AWS S3 Bucket with Python
Last Updated : 28 Mar, 2023

A bucket policy in Amazon S3 defines the permissions and actions that apply to a particular bucket. It is an optional, resource-based policy that can be used to restrict or grant access to an S3 bucket. Bucket policies are written in JSON format. To create a bucket policy in Python, follow the steps below.

Step 1: Import boto3, the AWS SDK for Python, which provides the methods for accessing AWS resources. Also import json, which is needed to serialize the policy dictionary into a policy string.

    import json
    import boto3

Step 2: Create the policy document as a Python dictionary of key-value pairs. The first key is "Version". The second key is "Statement", which holds a list of statements. Within each statement, "Sid" is an identifier for the statement, "Effect" specifies whether access is allowed or denied, "Principal" specifies who is granted access, "Action" lists the operations that may be performed, and "Resource" identifies the bucket (or objects) the policy applies to.

Example:

    bucket_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AddPerm",
                "Effect": "Allow",
                "Principal": "*",
                "Action": ["s3:*"],
                "Resource": ["arn:aws:s3:::gfgbucket/*"]
            }
        ]
    }

Step 3: Convert the bucket policy dictionary into a JSON string.

    json.dumps(bucket_policy)

Step 4: Attach the policy to the bucket by calling the put_bucket_policy() function. It takes the bucket name as the first parameter and the policy string as the second.

    put_bucket_policy(Bucket, Policy)

Step 5: Finally, go to AWS -> S3 -> Bucket -> Permissions -> Bucket policy and verify that the policy has been applied.

Complete code:

Python3

    import json
    import boto3

    # Create an S3 client and set the target bucket name
    s3_client = boto3.client('s3')
    BUCKET_NAME = 'gfgbucket'

    def create_bucket_policy():
        bucket_policy = {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Sid": "AddPerm",
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": ["s3:*"],
                    "Resource": ["arn:aws:s3:::gfgbucket/*"]
                }
            ]
        }
        # Serialize the policy dictionary and attach it to the bucket
        policy_string = json.dumps(bucket_policy)
        s3_client.put_bucket_policy(
            Bucket=BUCKET_NAME,
            Policy=policy_string
        )
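As an alternative to the console check in Step 5, the attached policy can also be read back programmatically with the client's get_bucket_policy() call, which returns the policy as a JSON string. Below is a minimal sketch, assuming the same gfgbucket bucket from the example above and credentials that are allowed to read the bucket policy:

    import json
    import boto3

    s3_client = boto3.client('s3')

    # Fetch the policy attached to the bucket; assumes the 'gfgbucket' bucket from the example
    response = s3_client.get_bucket_policy(Bucket='gfgbucket')

    # The 'Policy' field is a JSON string, so parse it back into a dictionary
    policy = json.loads(response['Policy'])

    # Print each statement's identifier, effect, and actions as a quick sanity check
    for statement in policy['Statement']:
        print(statement['Sid'], statement['Effect'], statement['Action'])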