Amazon AWS Certified SysOps Administrator Associate – S3 Storage and Data Management – For SysOps Part 8

22. S3 Glacier – Hands On

So let me show you how to use S3 Glacier as a storage class from Amazon S3. So if we take, for example, this coffee.jpg file, I can go to Actions and then change its storage class. So I do Edit storage class, in which case I can go into Glacier or Glacier Deep Archive. I'm going to go into Glacier, okay? And it has to stay there for at least 90 days, okay? So I will save my changes, and now my object is being sent into the Glacier storage class. So this is good. Now, back in the bucket, as we can see, the storage class has changed for my object. So on my coffee.jpg, I can see at the bottom that the storage class is Glacier. And what I can do is initiate a restore for it.
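
If you wanted to script this transition instead of using the console, here is a minimal boto3 sketch (the bucket name is a placeholder, not from the lecture). Changing the storage class amounts to copying the object onto itself with a new StorageClass, which is effectively what the console's Edit storage class action does.

```python
import boto3

s3 = boto3.client("s3")

bucket = "my-demo-bucket"  # placeholder bucket name
key = "coffee.jpg"

# Move the object to Glacier by copying it over itself
# with the desired StorageClass.
s3.copy_object(
    Bucket=bucket,
    Key=key,
    CopySource={"Bucket": bucket, "Key": key},
    StorageClass="GLACIER",
)

# Verify the change; StorageClass is returned for non-STANDARD classes.
head = s3.head_object(Bucket=bucket, Key=key)
print(head.get("StorageClass"))  # -> "GLACIER"
```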

So right here it says the object is stored in the Glacier storage class, and to use it, I can initiate a restore. Now, I can say how many days I want the restored copy to be available, so I can say one day. Okay, so it will be available until that date. And then I have three options for retrieval: Bulk, Standard, and Expedited. But if I do Expedited, I need to purchase what's called provisioned capacity units (PCUs) to allow myself to go and retrieve this file faster. So we'll go with a Standard retrieval of the specified object, and I can initiate the restore.
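
A minimal boto3 sketch of the same restore request (bucket name is a placeholder):

```python
import boto3

s3 = boto3.client("s3")

# Ask S3 to restore a temporary copy of a Glacier object for 1 day,
# using the Standard retrieval tier (Bulk and Expedited are the others).
s3.restore_object(
    Bucket="my-demo-bucket",  # placeholder
    Key="coffee.jpg",
    RestoreRequest={
        "Days": 1,
        "GlacierJobParameters": {"Tier": "Standard"},
    },
)

# head_object shows the restore status while the job is in progress.
head = s3.head_object(Bucket="my-demo-bucket", Key="coffee.jpg")
print(head.get("Restore"))  # e.g. 'ongoing-request="true"'
```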

Okay, so you can see the restore has been initiated, and for me to be notified of this restore, well, I can use event notifications. So you can see there is a restoration in progress, shown right here on the object, and we're going to wait until this is done. But if I wanted to, I could go into my bucket, then go into Management, Properties, excuse me, scroll down and go to Event notifications, and create an event notification on my bucket called restore-notifications. And the event type I'm going to look for is going to be around the restore object events.

So we have restore initiated and restore completed. Restore initiated is when I initiate a restore request, and it triggers a notification; restore completed is when the restoration is done. And then I can send it as a destination to a Lambda function, an SNS topic, or an SQS queue, and be notified accordingly. So that's it. I will not create it, but it shows you all the options on Glacier on AWS. I hope you liked it, and I will see you in the next lecture.
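
For reference, a minimal sketch of that notification configuration in boto3, assuming an SNS topic as the destination (the topic ARN and bucket name are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Subscribe an SNS topic to the two restore-related S3 events:
# s3:ObjectRestore:Post fires when a restore is initiated,
# s3:ObjectRestore:Completed fires when the temporary copy is ready.
s3.put_bucket_notification_configuration(
    Bucket="my-demo-bucket",  # placeholder
    NotificationConfiguration={
        "TopicConfigurations": [
            {
                "Id": "restore-notifications",
                "TopicArn": "arn:aws:sns:us-east-1:123456789012:restore-topic",  # placeholder
                "Events": ["s3:ObjectRestore:Post", "s3:ObjectRestore:Completed"],
            }
        ]
    },
)
```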

23. Glacier Vault Lock – Hands On

So let's have a play with vault locks by going into the Glacier console, where I'm able to go ahead and create a vault. So as you can see, Glacier is used for creating vaults, setting data retrieval policies, and sending event notifications. So let's create a vault in the region close to me; I'll call it my-demo-vault. Click on Next Step. And here we can set up notifications in case some jobs complete. So basically, when a retrieval job is completed, we can receive an SNS notification. I will not enable it right now, we'll set it later, and then review. Everything looks good, I'll submit it. And here is my first Glacier vault. So basically I have something just like a bucket, but it's called a vault in Glacier. And if I click on this vault, I can get some information around the details of this vault, the notifications that I set, and the permissions.
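
A minimal boto3 sketch of creating the same vault (the vault name mirrors the lecture; accountId "-" means the caller's own account):

```python
import boto3

glacier = boto3.client("glacier")

# Create a vault; "-" tells Glacier to use the credentials' own account ID.
response = glacier.create_vault(accountId="-", vaultName="my-demo-vault")
print(response["location"])  # e.g. /123456789012/vaults/my-demo-vault

# Describe it to confirm.
vault = glacier.describe_vault(accountId="-", vaultName="my-demo-vault")
print(vault["VaultARN"])
```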

So we can set a Vault Access Policy document, which I just mentioned, to basically say who can do what on this vault, just like an S3 bucket policy, really. And then there is a vault lock. The vault lock basically allows you to create, edit, and view the details of a lock policy. This lock policy gives you compliance. So if we create a lock policy, and we can click here to see how to write one, we could set a specific kind of lock policy. For example, this one denies delete permissions for archives less than 365 days old. Or the second one denies permissions based on a tag.

So you have lots of different ways to do locks, but the idea is that using locks, you are able to enforce strong requirements on how your data is going to be kept in Glacier. The thing to know is that once you set a lock, you cannot change it. So I will initiate a vault lock, which says that I need to match my vault ARN, obviously. So let's go to Glacier and find my ARN. So here's my lock policy and here's my ARN; I'm going to copy this right here. So once I set my vault lock policy, it will never be able to be changed.
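
A minimal sketch of initiating that lock with boto3, using the "deny deletes for archives younger than 365 days" policy shown in the console (the account ID, region, and vault name below are placeholders):

```python
import json
import boto3

glacier = boto3.client("glacier")

# Lock policy: deny DeleteArchive on archives less than 365 days old.
lock_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "deny-delete-young-archives",
            "Principal": "*",
            "Effect": "Deny",
            "Action": "glacier:DeleteArchive",
            "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/my-demo-vault",
            "Condition": {
                "NumericLessThan": {"glacier:ArchiveAgeInDays": "365"}
            },
        }
    ],
}

# Initiating the lock puts the policy in an "InProgress" state and
# returns a lock ID that is only valid for 24 hours.
response = glacier.initiate_vault_lock(
    accountId="-",
    vaultName="my-demo-vault",
    policy={"Policy": json.dumps(lock_policy)},
)
print(response["lockId"])  # keep this: it is needed to complete the lock
```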

So I will click on Initiate Vault Lock. And here I get a lock ID, so I absolutely need to copy this; you cannot lose it. And it's saying we have 24 hours to validate this policy and complete the lock process, after which the lock ID will expire and the in-progress policy will be deleted. So now I have 24 hours to complete the lock process. So let's close this. And here I have the option to either delete my vault lock, or I have 23 hours left to complete it. So I want to complete it. I click on Complete Vault Lock and I paste the lock ID I just got from before. And if you didn't copy it, then you have to redo everything: delete the vault lock and recreate it. Then I acknowledge the fact that once my vault lock is configured, I will not be able to change it, ever. It's irreversible; that's why it's so strong from a regulatory perspective. And I click on Complete Vault Lock. And here we go, my vault lock policy is now locked. And so we will never, ever be able to delete an archive that is less than 365 days old.
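
The matching completion step in boto3 (the lock ID placeholder stands in for the one returned by initiate_vault_lock; this step is irreversible):

```python
import boto3

glacier = boto3.client("glacier")

# Completing the lock is irreversible: the policy moves from
# "InProgress" to "Locked" and can never be changed or removed.
glacier.complete_vault_lock(
    accountId="-",
    vaultName="my-demo-vault",
    lockId="EXAMPLE-LOCK-ID",  # placeholder: use the ID from initiate_vault_lock
)

# Check the final state.
state = glacier.get_vault_lock(accountId="-", vaultName="my-demo-vault")
print(state["State"])  # -> "Locked"
```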

I cannot change this, and this is why vault lock policies are so important; you need to know them coming into the exam. The last thing to know is that through this UI, you are not able to upload files directly to Glacier. You would have to use the SDK or the CLI or something like this. I won't show it here, but the idea is that you don't get a full UI like you do for S3; we would have to use the API if we wanted to upload files into this demo vault. So that's it for Glacier. Just remember how we created a vault, how we locked it using a lock policy, and how you could set another policy here for access to this vault. And I will see you in the next lecture.
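
Since the console cannot upload to a vault, here is a minimal sketch of what such an upload could look like with boto3 (the file name and description are placeholders, not from the lecture):

```python
import boto3

glacier = boto3.client("glacier")

# Vaults only accept uploads through the API/SDK/CLI, not the console.
with open("coffee.jpg", "rb") as f:  # placeholder file
    response = glacier.upload_archive(
        accountId="-",
        vaultName="my-demo-vault",
        archiveDescription="demo upload",
        body=f,
    )

# The archive ID is the only handle you get back; store it, since
# vaults have no file listing you can browse interactively.
print(response["archiveId"])
```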

24. [SAA] Athena Overview

So let's talk about Athena to conclude this section. Athena is awesome; to me, it is really, really cool. It's a serverless service, and you can perform analytics directly against S3 files. So usually, you have to load your files from S3 into a database such as Redshift and run queries there. But with Athena, you leave your files in S3 and you run queries directly against them.

For this, you can use the SQL language, which everyone knows, to query the files, and it even has a JDBC or ODBC driver if you wanted to connect your BI tools to it. You only get charged per query, for the amount of data scanned, so you can go really, really crazy and just get billed for what you are actually using. It supports many different file formats such as CSV, JSON, ORC, Avro, and Parquet. And in the back end, it basically runs Presto.

Presto, if you don't know it, is a query engine. So the use cases for Athena are many: you can do BI, analytics, and reporting; you can analyze and query VPC Flow Logs, ELB logs, CloudTrail trails, S3 access logs, CloudFront logs, all these things. So in the exam, they will ask you: how can we analyze data directly on S3? How can we analyze our ELB logs? How can we analyze our VPC Flow Logs? Well, the answer is: use Athena. That's all you need to know. We're actually going to do one hands-on just to get some practice with Athena and see how it works. But for the exam, it's really, really straightforward: anytime you need to analyze data directly on S3, usually logs such as ELB logs, et cetera, you would use Athena. That's it. I will see you in the next lecture.
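
For a taste of how that hands-on could be scripted, here is a minimal boto3 sketch of running a SQL query through Athena; the database, table, and results bucket names are placeholders I made up:

```python
import time
import boto3

athena = boto3.client("athena")

# Start a query against files sitting in S3; Athena writes the
# results to the S3 output location you give it.
query = athena.start_query_execution(
    QueryString="SELECT * FROM access_logs LIMIT 10;",  # placeholder table
    QueryExecutionContext={"Database": "s3_logs_db"},   # placeholder database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results-bucket/"},
)
query_id = query["QueryExecutionId"]

# Poll until the query finishes, then fetch the rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)
    status = state["QueryExecution"]["Status"]["State"]
    if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if status == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```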

25. [SAA] Athena Hands On

So let's demonstrate S3 access logging. I'll call the bucket demo-s3-access-logs-stefan-2020, then leave all these settings as they are and click on Create Bucket. Okay, so this is creating my bucket, and this bucket is going to be used for access logging from my other bucket. So let's take my demo-stefan-s3-bucket-2020, and I'm going to turn on Server Access Logging. To do so, I'm going to go into Properties, scroll down, and find Server Access Logging in here. Okay, next I will click on Edit and then enable server access logging. Next I need to specify a target bucket, so I can just browse Amazon S3 and look for the bucket I just created.

Choose the path, and we can do, for example, logs/, if you wanted all the server access logs to go under an s3-logs folder. It's up to you; it's optional, but add a trailing slash at the end. Save the changes and we're good to go. So now it is enabled. And the idea is that if I go and, for example, list my versions, or take this coffee.jpg file and open it, and things like this, this is going to generate some traffic on my bucket, okay? And this is going to be logged into my other bucket, called demo-s3-access-logs-stefan. Now, this can take an hour or two to appear, so I'm going to wait a little bit for it to be written. But one question you may have is: by turning on Server Access Logging, how is this logging bucket getting the right to be written to?

And it says it here: by enabling server access logging, the S3 console will automatically update your bucket access control list (ACL) to include access to the S3 Log Delivery group. So let's check this out. Let's go to the permissions of my demo-s3-access-logs bucket. And under Permissions, if I scroll down and go to Access Control List (ACL), yes indeed, the S3 Log Delivery group has the right to write objects onto my S3 bucket. So this is something that has been added automatically by Amazon S3 when I enabled server access logging. Just an interesting tidbit, but it's always good to see the full security picture when I do something. Okay, so now the only thing I have to do is wait. So I'll pause the video, and hopefully within an hour or two I should be able to see some objects being populated in here. So I will see you very soon.
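
For completeness, a minimal boto3 sketch of the same configuration (the bucket names mirror the lecture and are placeholders for your own):

```python
import boto3

s3 = boto3.client("s3")

# Enable server access logging on the source bucket, pointing at the
# target bucket and a "logs/" prefix (note the trailing slash).
s3.put_bucket_logging(
    Bucket="demo-stefan-s3-bucket-2020",  # source bucket (placeholder)
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "demo-s3-access-logs-stefan-2020",  # placeholder
            "TargetPrefix": "logs/",
        }
    },
)
```

Note that, unlike the console, calling the API directly does not update the target bucket's permissions for the log delivery service, so you would have to grant those yourself.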

Okay, so I'm in my access log bucket. I've waited an hour, so hopefully if I refresh... yes, I start seeing the s3-logs folder that has been created. So, perfect. And within that folder is contained a bunch of access logs of what has been done on my S3 bucket. So I can take any of these files, this one for example, and download it. It's a text file, so I'm going to open it with my text editor to see what's inside. And I just opened this file: it's a text file with one line, so it contains one record. And this tells me the request ID, the bucket...

...it was made on, the time and date of the request, the IP it came from, the fact that it was a GET, and the result, which was 200. So it was a successful GET on the bucket at the very top, so it was probably a listing-type request at the top of the bucket. These kinds of access logs can be analyzed at scale using something like Athena, as we'll see in this course. On their own, they're not very helpful, but if there are any problems, any authorization issues or attacks or whatever, analyzing these files with Athena and getting down to the bottom of it will allow you to get more insight into what is happening. So that's it for access logs. I hope you liked it, and I will see you in the next lecture.
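
These logs are space-delimited text, so as a rough illustration (using a hypothetical, shortened sample line, not one from the lecture), the fields just mentioned can be pulled out like this:

```python
import re

# Hypothetical sample access-log line, shortened to the fields we care about:
# owner, bucket, [time], remote IP, requester, request ID, operation, key,
# "request-URI", HTTP status.
line = ('79a5example my-demo-bucket [06/Feb/2019:00:00:38 +0000] 192.0.2.3 '
        '79a5example 3E57427F3EXAMPLE REST.GET.BUCKET - '
        '"GET /my-demo-bucket?list-type=2 HTTP/1.1" 200 ...')

pattern = re.compile(
    r'^(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] (?P<ip>\S+) '
    r'(?P<requester>\S+) (?P<request_id>\S+) (?P<operation>\S+) (?P<key>\S+) '
    r'"(?P<request_uri>[^"]*)" (?P<status>\S+)'
)

m = pattern.match(line)
if m:
    print(m.group("bucket"), m.group("time"), m.group("ip"),
          m.group("operation"), m.group("status"))
```

In practice you would not parse these by hand; defining a table over the log prefix and querying it with Athena, as above, is the scalable approach.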
