💾 Archived View for dioskouroi.xyz › thread › 29405277 captured on 2021-12-04 at 18:04:22. Gemini links have been rewritten to link to archived content
________________________________________________________________________________
If you have unknown or unpredictable access patterns, such as data lakes, analytics, or user-generated content, the S3 Intelligent-Tiering storage class now automatically includes a new Archive Instant Access tier with the same price and millisecond retrieval as the new S3 Glacier Instant Retrieval storage class. Beginning today, customers of S3 Intelligent-Tiering automatically save up to 68% on data not accessed in the last 90 days.
Lots of people using the Intelligent-Tiering storage class just got a huge cost reduction for free. If you haven't looked into it, you really should, especially if you use S3 for objects that are usually >1MB, are stored for >30 days, and where at least some of those objects go more than 30 days without being accessed. The downside is really limited: at worst you pay a small fixed monitoring fee on top of the Standard storage class price, unlike the regular Infrequent Access or Glacier Instant Retrieval storage classes, where repeated access to objects can leave you with huge retrieval bills.
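To make the trade-off concrete, here is a rough monthly-cost sketch. The per-GB and monitoring figures below are example us-east-1 list prices at the time of the thread and should be treated as assumptions; check the current S3 pricing page before relying on them.

```python
# Rough monthly-cost comparison: S3 Standard vs Intelligent-Tiering.
# All prices are assumed example us-east-1 figures, not authoritative.
STANDARD_PER_GB = 0.023                # S3 Standard, $/GB-month
IT_FREQUENT_PER_GB = 0.023             # IT Frequent Access tier, $/GB-month
IT_ARCHIVE_INSTANT_PER_GB = 0.004      # IT Archive Instant Access tier, $/GB-month
IT_MONITORING_PER_1K_OBJECTS = 0.0025  # IT monitoring fee, $/1,000 objects/month

def standard_cost(gb):
    """Monthly cost of keeping `gb` gigabytes in S3 Standard."""
    return gb * STANDARD_PER_GB

def intelligent_tiering_cost(gb_frequent, gb_archive_instant, num_objects):
    """Monthly cost in Intelligent-Tiering, split across two tiers,
    plus the per-object monitoring fee."""
    storage = (gb_frequent * IT_FREQUENT_PER_GB
               + gb_archive_instant * IT_ARCHIVE_INSTANT_PER_GB)
    monitoring = num_objects / 1000 * IT_MONITORING_PER_1K_OBJECTS
    return storage + monitoring

# Example: 10 TB of 10 MB objects, half untouched for 90+ days.
gb = 10 * 1024
objects = gb * 1024 // 10
print(f"Standard:            ${standard_cost(gb):.2f}/month")
print(f"Intelligent-Tiering: ${intelligent_tiering_cost(gb / 2, gb / 2, objects):.2f}/month")
```

Note how the monitoring fee is why the class only makes sense for objects of a reasonable size: for millions of tiny objects the per-object fee can swamp the storage savings.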
I checked; I was even more surprised by S3 Glacier Deep Archive at $0.99/_TB_.
But ouch, retrieving a TB from it costs $120.
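Even with that retrieval charge, the arithmetic favors Deep Archive surprisingly quickly. Using the figures from this thread ($0.99/TB-month storage, ~$120/TB retrieved) and an assumed S3 Standard rate of about $23.55/TB-month, a quick sketch of the break-even point:

```python
# Back-of-envelope: after how many months does Deep Archive beat Standard,
# even if you pay for one full retrieval? Figures taken from the thread;
# the Standard rate is an assumed example price.
DEEP_ARCHIVE_PER_TB = 0.99   # $/TB-month storage
STANDARD_PER_TB = 23.55      # $/TB-month storage (assumed)
RETRIEVAL_PER_TB = 120.0     # one-time retrieval cost from the thread

def months_to_break_even():
    # After m months, Standard costs m * 23.55 per TB; Deep Archive plus
    # one full retrieval costs m * 0.99 + 120. Find the crossover month.
    m = 0
    while m * DEEP_ARCHIVE_PER_TB + RETRIEVAL_PER_TB >= m * STANDARD_PER_TB:
        m += 1
    return m

print(months_to_break_even())  # → 6
```

So for data kept around roughly half a year or longer, Deep Archive wins per terabyte even if you end up pulling all of it back once.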
I guess this is great for those having everything on AWS.
I use Deep Archive for scenarios where I will be extremely pleased to pay $120/TB to have my data back.
We use deep archive to keep every single database log segment and nightly snapshot. This means we can restore to any single point in time. In reality if we ever restored it would be a small fraction of data, but we keep all of it around because it’s cheap enough.
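Restoring "a small fraction of the data" like this means issuing a restore request per needed segment before it can be read again. A minimal sketch using the boto3 S3 API, with hypothetical bucket and key names:

```python
# Sketch of kicking off a Deep Archive restore for a single log segment.
# Bucket/key names below are hypothetical; the RestoreRequest shape
# follows the boto3 `restore_object` API.
def build_restore_request(days, tier="Bulk"):
    # Bulk is the cheapest Deep Archive retrieval tier (typically within
    # 48 hours); "Standard" is faster (within ~12 hours) but costs more.
    return {
        "Days": days,  # how long the restored copy stays readable
        "GlacierJobParameters": {"Tier": tier},
    }

# Usage (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.restore_object(
#     Bucket="db-backups",              # hypothetical bucket
#     Key="wal/segment-000000420.gz",   # hypothetical log segment key
#     RestoreRequest=build_restore_request(days=7),
# )
print(build_restore_request(7))
```

Restoring only the segments between the last nightly snapshot and the target timestamp keeps the retrieval bill to a small fraction of the archive.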
Really awesome, I have a use case that fits this nicely.