Amazon AWS Certified SysOps Administrator Associate – S3 Storage and Data Management – For SysOps Part 7


19. [SAA/DVA] S3 Event Notifications – Hands On

Okay, so let's demonstrate S3 bucket notifications. For this, I'm going to create a bucket, and I will call it demo-stefan-2020-s3-bucket-notifications. I will create it in eu-west-1, so Ireland, then scroll all the way down and create this bucket. So far, so good. Next, I will open this bucket, go into Properties, and look at the event notifications.

As you can see, we can have multiple ones. I want to create an event notification, and I'll call it DemoEvent. The prefix is optional, so I will leave it empty; this way, the notification applies to all the objects within my bucket. The suffix is optional as well, so I'll leave that empty too. Next, what is this event applying to? I want to have this event triggered any time an object is created.

That means object created: put, post, copy, and multipart upload completed. You could also, if you wanted to, listen for other kinds of events, such as delete events, restore events, and so on. For now, I'm just going to select all object create events, which corresponds to s3:ObjectCreated:*. Next, we need to choose a destination, and the destination can be a Lambda function, an SNS topic, or an SQS queue. We haven't covered those yet, but we'll be using an SQS queue, and we will create the queue ourselves. So let's go to SQS and create a queue.

So I will create a queue of type Standard, and I'm going to call it DemoS3Notification. I scroll all the way down and click on Create queue. My queue is now created, and what I need to do next is set up an access policy: one that will allow my S3 bucket to write to my Amazon SQS queue. No need to worry about what SQS is right now, just follow along with me.
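By the way, the same step can be scripted. Here is a minimal sketch using boto3 in Python, assuming the queue name from this demo and the eu-west-1 region:

```
import boto3

# Create a standard SQS queue in the same region as the bucket
sqs = boto3.client("sqs", region_name="eu-west-1")
resp = sqs.create_queue(QueueName="DemoS3Notification")
print(resp["QueueUrl"])  # keep the queue URL for later calls
```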

Back in S3, let me refresh the page just to make sure my new queue gets picked up. So I set up my DemoEvent one more time for all object create events, scroll down to the SQS queue destination, and select my DemoS3Notification queue. If I save the changes right now, it's going to give me an error, because Amazon S3 is unable to write to my SQS queue. So let's change the access policy to allow S3 to write to the queue.

I will edit the access policy, and for the policy itself, I can use the policy generator to define an SQS queue policy. I'll make it very permissive for now: I will allow anyone to perform SendMessage on my SQS queue. For that, I need the ARN of my queue, which is right here. So here we go: I paste in the ARN, add the statement, and generate the policy. This should allow anyone, including my S3 bucket, to write to my SQS queue. This is way too permissive, of course, but good enough for this demo.
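For reference, here is roughly what applying such a policy looks like with boto3; the account ID and ARN are placeholders you would replace with your own, and the wide-open Principal mirrors the deliberately permissive demo policy:

```
import json
import boto3

sqs = boto3.client("sqs", region_name="eu-west-1")
queue_url = "https://sqs.eu-west-1.amazonaws.com/123456789012/DemoS3Notification"  # placeholder

# Very permissive: anyone (including Amazon S3) may send messages to this queue
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "SQS:SendMessage",
        "Resource": "arn:aws:sqs:eu-west-1:123456789012:DemoS3Notification",  # placeholder
    }],
}

sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})
```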

So I paste the policy into the queue and save it. Now my access policy should allow my S3 bucket to write to SQS, so let's save the event notification one more time, and this time the operation completes successfully (the equivalent API call is sketched below). Finally, let's upload an object to see if that triggers an event: we'll add a file, coffee.jpg, upload it, and see what happens.
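What the console just saved corresponds to a single API call. A sketch, assuming the bucket name from this demo and a placeholder queue ARN:

```
import boto3

s3 = boto3.client("s3")

# Subscribe the queue to all object-create events (s3:ObjectCreated:*)
s3.put_bucket_notification_configuration(
    Bucket="demo-stefan-2020-s3-bucket-notifications",
    NotificationConfiguration={
        "QueueConfigurations": [{
            "Id": "DemoEvent",
            "QueueArn": "arn:aws:sqs:eu-west-1:123456789012:DemoS3Notification",  # placeholder
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)
```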

Okay, so the file has been successfully uploaded; this is good. Now let's go into Amazon SQS, open the queue, and click Send and receive messages in the top right corner. Under Receive messages, we can see that two messages are available, and clicking Poll for messages retrieves them. If we look at the body of the first message, we can see it was a test event from Amazon S3: this is Amazon S3 testing the connectivity between the S3 bucket and the SQS queue, so we can safely disregard it.
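Polling can of course also be done programmatically. A minimal sketch, with the same placeholder queue URL as above:

```
import boto3

sqs = boto3.client("sqs", region_name="eu-west-1")
queue_url = "https://sqs.eu-west-1.amazonaws.com/123456789012/DemoS3Notification"  # placeholder

# Long-poll for up to 10 messages and print each body
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=5)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    # Delete each message after processing so it is not redelivered
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```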

Now, the body of the second message contains the actual notification sent by Amazon S3 into the SQS queue. We can see the event source is aws:s3, the event time, that it happened in the eu-west-1 region, and that the event name was ObjectCreated:Put. We can also see who performed the action, the source IP, and a lot of other information about that particular event. So this demonstrates that, yes, our S3 bucket is able to send notifications into Amazon SQS, and that completes the demo. I hope you liked it, and I will see you in the next lecture.

20. S3 Analytics

Now let's look at S3 Analytics, which is used to set up storage class analysis. Basically, with the help of S3 Analytics, we can determine when we should transition an object, for example from Standard to Standard-Infrequent Access.

Note that this analysis does not work for One Zone-IA or Glacier, but it works for the other storage classes. The report is updated on a daily basis, and it takes about 24 to 48 hours for the first report to appear. The idea is that once we have that report, we know how to write efficient lifecycle rules: we know exactly after how many days we should move objects from one storage class to another. So this is the kind of thing we can do with it.
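For example, once the report suggests a good cutoff, the matching lifecycle rule could be created like this. A sketch with boto3; the 60-day threshold and the bucket name are made up for illustration:

```
import boto3

s3 = boto3.client("s3")

# Transition every object to Standard-IA 60 days after creation
s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",  # placeholder
    LifecycleConfiguration={
        "Rules": [{
            "ID": "standard-to-ia",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix = the whole bucket
            "Transitions": [{"Days": 60, "StorageClass": "STANDARD_IA"}],
        }]
    },
)
```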

Okay, let's see how we can enable it. It's pretty easy: we go to the Analytics tab, and here we can create a storage class analysis. We create one and add a filter, which I'll call DemoFilter. We could scope the filter to a specific prefix, but we'll apply it to everything. The resulting data can also be exported to a destination bucket.

That destination could be a bucket in this account or in another account, but it's optional. I click on Save, and it now says it's analyzing my data; this will take about 24 to 48 hours to complete. Because I don't have much in this bucket, we're not going to get very meaningful results, so let me show you one of my actual buckets, which hosts my Kafka tutorial website. In there, I enabled analytics maybe a month ago so I could show you what it looks like.
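The configuration saved in the console corresponds to this API call. A sketch with boto3, where both bucket names are placeholders and the export block is optional, just as in the console:

```
import boto3

s3 = boto3.client("s3")

s3.put_bucket_analytics_configuration(
    Bucket="my-example-bucket",  # placeholder
    Id="DemoFilter",
    AnalyticsConfiguration={
        "Id": "DemoFilter",
        # No Filter key: the analysis covers the whole bucket
        "StorageClassAnalysis": {
            "DataExport": {  # optional daily CSV export
                "OutputSchemaVersion": "V_1",
                "Destination": {
                    "S3BucketDestination": {
                        "Format": "CSV",
                        "Bucket": "arn:aws:s3:::my-analytics-destination",  # placeholder
                        "Prefix": "analytics/",
                    }
                },
            }
        },
    },
)
```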

Now, I still don't have any recommendations from S3 Analytics about when I should transition my objects to Infrequent Access, but we can already see some graphs showing how much storage is used and how much data is retrieved. We also get the percentage of storage retrieved, and more detail about how old my objects are. Because I don't have that many objects, there isn't much to see here, but we can tell, for example, that some objects are between 90 and 120 days old, and that a lot of them are over a year old. So it gives you some insights.

The idea is that as soon as you have an S3 bucket with a lot more activity and a lot more objects than mine, you will get meaningful insights from S3 Analytics. Right now I don't have anything meaningful, but I'm sure you get the idea. So that's it for S3 Analytics. Just remember what it's used for: basically, to recommend when to move objects to Infrequent Access, for example. And I will see you in the next lecture.

21. S3 Glacier Overview

Now let's talk about Amazon S3 Glacier. This is low-cost object storage meant for archiving and backup, where the data is kept for the long term; we're talking about tens of years. It's an alternative to running magnetic tape storage on premises: instead of storing your data on tapes in your own data center, you can do the same in the cloud. With Glacier, the durability is really good, the same as Amazon S3 Standard.

So that's eleven 9s of durability. The storage cost depends on your tier: for the standard Glacier tier it's about $0.004 per GB per month (0.4 cents per gigabyte), and for the Deep Archive tier it's about $0.00099 per GB per month, which is really, really good. Each item in Glacier is called an archive, an archive can be up to 40 terabytes, and archives are stored in vaults. So it's analogous to objects and buckets, except that in Glacier they're called archives and vaults. By default, all the data is encrypted at rest using AES-256 with keys managed by AWS, so everything in Glacier is automatically encrypted no matter what.
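To make the archive and vault vocabulary concrete, here is a minimal boto3 sketch that creates a vault and uploads one archive; the vault name and file are made up:

```
import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")

# Vaults are the Glacier equivalent of buckets
glacier.create_vault(vaultName="demo-vault")  # placeholder name

# Archives are the Glacier equivalent of objects
with open("backup.zip", "rb") as f:
    resp = glacier.upload_archive(
        vaultName="demo-vault",
        archiveDescription="nightly backup",
        body=f,
    )
print(resp["archiveId"])  # keep this ID: you need it to retrieve or delete the archive
```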

Now, an exam tip: if you need to archive data from S3 after X number of days, think Glacier plus a lifecycle policy. Next, a few Glacier operations to know. You can create and delete vaults, but you can only delete a vault when it's empty. You can retrieve metadata from your vaults: the creation date, the number of archives, the total size of all the archives, and so on. And you can download the inventory of a vault, which is a list of all the archives in it, including each archive's ID, creation date, size, and so on. The upload operations let you upload files directly into Glacier, or upload by parts (multipart upload) for larger archives.
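Retrieving vault metadata and requesting the inventory look like this in boto3; note that the inventory itself is produced asynchronously, through a job (same placeholder vault name as above):

```
import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")

# Vault metadata: creation date, number of archives, total size
meta = glacier.describe_vault(vaultName="demo-vault")
print(meta["NumberOfArchives"], meta["SizeInBytes"])

# The inventory (the list of archives) comes back via an asynchronous job
job = glacier.initiate_job(
    vaultName="demo-vault",
    jobParameters={"Type": "inventory-retrieval"},
)
print(job["jobId"])  # poll describe_job / get_job_output with this ID later
```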

You can also download files from your vault, but only after you have run a retrieval job: Glacier prepares the file for download, and you can then download it within the time frame given to you. And of course there is delete, if you want to remove a specific archive. When you restore an archive, the download link has an expiry date, and when you create the retrieval job, you have three options.

The first one is Expedited, which is the most expensive: it gives you your file within one to five minutes, so very quickly, and you pay $0.03 per gigabyte plus $10 per 1,000 requests. The second is Standard, which is slower, three to five hours, at about $0.01 per gigabyte plus $0.05 per 1,000 requests. And finally Bulk, which is the slowest but also the most cost effective: your files become available between five and twelve hours after your request, at about $0.0025 per gigabyte plus $0.025 per 1,000 requests.
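The tier is just a parameter on the retrieval job. A sketch, with a placeholder archive ID:

```
import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")

# Tier can be "Expedited" (1-5 min), "Standard" (3-5 h), or "Bulk" (5-12 h)
job = glacier.initiate_job(
    vaultName="demo-vault",
    jobParameters={
        "Type": "archive-retrieval",
        "ArchiveId": "EXAMPLE-ARCHIVE-ID",  # placeholder
        "Tier": "Bulk",
    },
)
# Once the job completes, download the bytes with glacier.get_job_output(...)
```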

Next, Glacier is really good because you can have vault access policies and a vault lock. Each vault has one vault access policy and one vault lock policy, and both are written in JSON. The vault access policy behaves like an S3 bucket policy: it allows you to restrict user and account permissions. A vault lock policy, on the other hand, lets you lock the files in your Glacier vault for regulatory and compliance requirements. That means the policy becomes immutable: it can never be changed.

So once it's locked, the policy cannot be changed, and it stays locked for its lifetime. For example, you could forbid deleting an archive if it's less than one year old, or implement a WORM policy (write once, read many), demonstrating for compliance and regulatory purposes that no one can delete the files within the vault, or the vault lock policy itself. So picture my vault with several archives in it: the vault lock policy locks down the archives as well as the lock policy itself, while the users' permissions to talk to the vault are regulated through the vault access policy, which is similar to an S3 bucket policy.
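Here is a sketch of installing such a vault lock with boto3, using the classic rule that denies deleting archives younger than one year; the account ID and vault name are placeholders, and note that completing the lock makes the policy permanent:

```
import json
import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")

# Deny DeleteArchive while an archive is less than one year old
lock_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Principal": "*",
        "Action": "glacier:DeleteArchive",
        "Resource": "arn:aws:glacier:eu-west-1:123456789012:vaults/demo-vault",  # placeholder
        "Condition": {"NumericLessThan": {"glacier:ArchiveAgeInDays": "365"}},
    }],
}

# Step 1: attach the policy; the lock stays in-progress for 24 hours so you can test it
resp = glacier.initiate_vault_lock(
    vaultName="demo-vault",
    policy={"Policy": json.dumps(lock_policy)},
)

# Step 2: complete the lock; after this call the policy is immutable
glacier.complete_vault_lock(vaultName="demo-vault", lockId=resp["lockId"])
```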

Finally, because Glacier is asynchronous (you initiate restores, and they complete some time in the future), you need notifications for restore operations. You can configure vault notifications to send a message directly to an SNS topic when a restore job completes, and you can also specify an SNS topic when you initiate a job. So the idea is that Amazon Glacier has vaults with archives in them, and the user initiates a restore archive job, for example with the Bulk tier.

With Bulk, you get the result between five and twelve hours later, so it takes a bit of time. Amazon Glacier initiates the restore of the files, and when the archives are ready to be downloaded, Glacier sends a notification to an SNS topic, which notifies us, maybe by email or whatever means you want. The user can then go ahead and download the file within the allowed time frame.
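Configuring these vault notifications is a single call. A sketch, with a placeholder SNS topic ARN:

```
import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")

# Notify this topic when retrieval or inventory jobs complete
glacier.set_vault_notifications(
    vaultName="demo-vault",
    vaultNotificationConfig={
        "SNSTopic": "arn:aws:sns:eu-west-1:123456789012:glacier-restores",  # placeholder
        "Events": ["ArchiveRetrievalCompleted", "InventoryRetrievalCompleted"],
    },
)
```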

The other option is to use S3 event notifications. This applies when you restore objects directly from the S3 Glacier storage class, in which case you can listen for two notifications: s3:ObjectRestore:Post, which notifies you when an object restoration has been initiated, and s3:ObjectRestore:Completed, which notifies you when the restoration has finished; a combined sketch of both follows below. So that's it for Glacier. I hope you liked it, and I will see you in the next lecture.
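To tie the two pieces together, here is a sketch of restoring an object stored in the S3 Glacier storage class and subscribing to both restore events; the bucket, key, and topic ARN are placeholders:

```
import boto3

s3 = boto3.client("s3")

# Kick off a Bulk restore, keeping the restored copy available for 7 days
s3.restore_object(
    Bucket="my-example-bucket",   # placeholder
    Key="archives/report.pdf",    # placeholder
    RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Bulk"}},
)

# Get notified when the restore starts and when it completes
s3.put_bucket_notification_configuration(
    Bucket="my-example-bucket",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": "arn:aws:sns:eu-west-1:123456789012:restore-events",  # placeholder
            "Events": ["s3:ObjectRestore:Post", "s3:ObjectRestore:Completed"],
        }]
    },
)
```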
