In this blog you will learn how to edit the storage class of objects in an AWS S3 bucket. It can be done through the AWS Management Console, the AWS CLI, or programmatically using the AWS SDKs. Here’s a guide on how to do it using each method:
Steps to edit the storage class in AWS S3
Log in to the AWS Management Console and navigate to the S3 service.
Select the bucket containing the objects you want to change. In my case, I am selecting firstbucket0072.
Select the object(s) by clicking the checkbox next to the object name(s).
Click on the “Actions” button and choose “Change storage class”.
You can select any storage class based on your use case. Here, IA stands for Infrequent Access, which means files that are accessed only occasionally.
| Storage Class | Minimum Storage Duration | Retrieval Time | Use Case | Cost | Real-world Example |
|---|---|---|---|---|---|
| Standard | None | Milliseconds | Frequently accessed data, primary storage for critical data | $0.023 per GB per month (standard regions) | Hosting a website or storing active data that’s accessed regularly |
| Intelligent-Tiering | 30 days | Milliseconds/minutes | Unknown or unpredictable access patterns, cost optimization | $0.023 per GB per month for frequent access, $0.0125 per GB per month for infrequent access (plus monitoring and automation fees) | Storing data with unknown access frequency, like a data lake with varying workloads |
| Standard-IA | 30 days | Milliseconds | Infrequently accessed data, long-lived but less critical | $0.0125 per GB per month (storage), $0.01 per GB retrieval fee (standard regions) | Backup copies or archived data that’s rarely accessed but needs to be quickly retrievable |
| One Zone-IA | 30 days | Milliseconds | Infrequently accessed data, non-critical, cost-sensitive | $0.01 per GB per month (storage), $0.01 per GB retrieval fee (standard regions) | Data that can be easily recreated, like temporary backups or secondary copies |
| Glacier | 90 days | Minutes to hours | Long-term archival, rarely accessed data | $0.004 per GB per month (storage), retrieval fees vary ($0.03 per GB for expedited, $0.01 per GB for standard) | Long-term archival of compliance data or old project files |
| Deep Archive | 180 days | Hours to days | Very long-term archival, rarely accessed data | $0.00099 per GB per month (storage), retrieval fees vary ($0.02 per GB for standard retrieval) | Archiving old financial records or historical data that’s seldom accessed |
| Reduced Redundancy | None | Milliseconds | Non-critical, reproducible data | Around $0.0125 per GB per month | Storing temporary or generated content like thumbnails of user-uploaded images or intermediate video processing files that can be regenerated |
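As a quick worked example using the approximate prices in the table above: storing 500 GB in Standard costs roughly 500 × $0.023 = $11.50 per month, while the same 500 GB in Standard-IA costs about 500 × $0.0125 = $6.25 per month. However, reading all 500 GB back from Standard-IA adds about 500 × $0.01 = $5.00 in retrieval fees, so the IA classes only save money when the data is genuinely accessed infrequently.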
Once it is done, click on “Save changes”.
Now you have successfully edited the storage class.
Commands to change the storage class in CloudShell
With the help of this command, you can edit the storage class in CloudShell. It works by copying the object onto itself while specifying the new storage class.
aws s3 cp s3://source-bucket/source-object s3://source-bucket/source-object --storage-class STANDARD_IA
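If you want to change the storage class for every object under a prefix rather than a single object, the same command also accepts the --recursive flag. The bucket and prefix names below are placeholders; replace them with your own:

aws s3 cp s3://source-bucket/prefix/ s3://source-bucket/prefix/ --recursive --storage-class STANDARD_IA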
Python code to change storage class
You can also do this using Python. You need a few things, such as the bucket name and the object key, which is the file’s path inside the bucket.
import boto3

s3 = boto3.client('s3')

bucket_name = 'your-bucket-name'
object_key = 'your-object-key'
new_storage_class = 'STANDARD_IA'

# S3 has no "edit storage class" API, so we copy the object to the same
# location with the new storage class (metadata is preserved by default).
s3.copy_object(
    Bucket=bucket_name,
    CopySource={'Bucket': bucket_name, 'Key': object_key},
    Key=object_key,
    StorageClass=new_storage_class
)
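After the copy completes, you can verify the change with a head_object call, continuing from the snippet above. As a quirk of the S3 API, the StorageClass field is omitted from the response for objects in the Standard class, which is why this sketch falls back to 'STANDARD':

# Check the storage class of the object after the copy
response = s3.head_object(Bucket=bucket_name, Key=object_key)
print(response.get('StorageClass', 'STANDARD'))

Also keep in mind that copy_object works for objects up to 5 GB; larger objects require a multipart copy instead.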
Conclusion
In this blog, we have learned what a storage class is, the various types of storage classes, and how to edit the storage class of objects in an AWS S3 bucket.
If you like this blog, you can share it with your friends or coworkers. You can also find me on social media sites like LinkedIn, Twitter, and Instagram.
- 👏 Like for this article and subscribe to our newsletter
- 📰 View more content on my DataSpoof website
- 🔔 Follow Me: LinkedIn| Youtube | Instagram | Twitter