Understanding Amazon S3 Storage Classes

Amazon S3 offers several storage classes, each designed for different use cases based on access frequency, performance, and cost requirements. Here's a breakdown of the main S3 storage classes and when to use each (a short boto3 sketch follows the list):

1. S3 Standard:
   - Use Case: Frequently accessed data.
   - Details: Offers high durability, availability, and performance. Ideal for cloud applications, dynamic websites, content distribution, and big data analytics.

2. S3 Intelligent-Tiering:
   - Use Case: Data with unknown or changing access patterns.
   - Details: Automatically moves data between access tiers based on how often it is accessed, optimizing costs without performance impact or operational overhead.

3. S3 Standard-Infrequent Access (S3 Standard-IA):
   - Use Case: Infrequently accessed data that needs rapid access when required, such as backups or disaster recovery files.
   - Details: Lower storage cost compared to S3 Standard, but with a retrieval fee.

4. S3 One Zone-Infrequent Access (S3 One Zone-IA):
   - Use Case: Infrequently accessed data that does not require multiple Availability Zone resilience.
   - Details: Lower cost than S3 Standard-IA, but data is stored in a single Availability Zone.

5. S3 Glacier Instant Retrieval:
   - Use Case: Archive data that needs immediate access, such as medical images or media assets.
   - Details: Low-cost storage with millisecond retrieval times.

6. S3 Glacier Flexible Retrieval (formerly S3 Glacier):
   - Use Case: Long-term archive data that is rarely accessed.
   - Details: Low-cost storage with retrieval times ranging from minutes to hours.

7. S3 Glacier Deep Archive:
   - Use Case: Long-term archive and digital preservation.
   - Details: Lowest-cost storage option with retrieval times of up to 12 hours.

8. S3 Outposts:
   - Use Case: Data residency requirements that can't be met by existing AWS Regions.
   - Details: Stores S3 data on-premises on AWS Outposts.
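
To make these choices concrete, here is a minimal Python (boto3) sketch that uploads objects under different storage classes and starts a restore for an object already archived in Glacier Flexible Retrieval. The bucket name, keys, and file names are placeholders for illustration, and AWS credentials are assumed to be configured; treat it as a sketch of the relevant API calls rather than a complete application.

import boto3

# Placeholder bucket, keys, and local files -- replace with your own.
BUCKET = "my-example-bucket"

s3 = boto3.client("s3")

# Frequently accessed data -> S3 Standard (the default when StorageClass is omitted).
s3.upload_file("report.csv", BUCKET, "reports/report.csv")

# Infrequently accessed backups -> S3 Standard-IA (lower storage cost, retrieval fee applies).
s3.upload_file(
    "backup.tar.gz", BUCKET, "backups/backup.tar.gz",
    ExtraArgs={"StorageClass": "STANDARD_IA"},
)

# Archive data that still needs millisecond access -> S3 Glacier Instant Retrieval.
s3.upload_file(
    "scan.dcm", BUCKET, "archive/scan.dcm",
    ExtraArgs={"StorageClass": "GLACIER_IR"},
)

# Unknown or changing access patterns -> S3 Intelligent-Tiering.
s3.upload_file(
    "events.json", BUCKET, "logs/events.json",
    ExtraArgs={"StorageClass": "INTELLIGENT_TIERING"},
)

# Objects in Glacier Flexible Retrieval or Deep Archive must be restored before they
# can be read. This requests a temporary copy for 7 days via the Standard retrieval tier.
s3.restore_object(
    Bucket=BUCKET,
    Key="archive/old-logs.tar.gz",
    RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Standard"}},
)

The StorageClass values map to the classes listed above (for example STANDARD_IA, ONEZONE_IA, GLACIER_IR, GLACIER for Flexible Retrieval, DEEP_ARCHIVE, and INTELLIGENT_TIERING).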


S3 Storage Class Comparison

You can change the storage class of an existing object/file in Amazon S3. This can be done using the AWS Management Console, the AWS CLI, or the SDKs; a minimal SDK sketch is shown below.
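
For instance, with boto3 an in-place copy that rewrites the object under a new storage class is one way to transition a single object. The bucket and key below are placeholders; this is a sketch under those assumptions, not the only method.

import boto3

s3 = boto3.client("s3")

# Placeholder bucket/key -- replace with your own.
bucket, key = "my-example-bucket", "backups/backup.tar.gz"

# Copy the object onto itself with a new storage class (existing metadata is carried over).
s3.copy_object(
    Bucket=bucket,
    Key=key,
    CopySource={"Bucket": bucket, "Key": key},
    StorageClass="GLACIER_IR",
    MetadataDirective="COPY",
)

# Confirm the change; objects in S3 Standard omit the StorageClass field in the response.
print(s3.head_object(Bucket=bucket, Key=key).get("StorageClass"))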

Reference:

https://aws.amazon.com/s3/storage-classes/

