Amazon AWS Certified SysOps Administrator Associate – S3 Storage and Data Management – For SysOps Part 3
7. [SAA/DVA] S3 Replication – Hands On
Okay, so let’s have a play with S3 replication. So I’m going to create a first bucket: demo-stefan-origin-2020, in eu-west-1. And I’m going to enable bucket versioning, because for replication you need to enable bucket versioning on both the source and the target buckets. So I will create this bucket. And so this is my first bucket created, demo-stefan-origin-2020, and it is in eu-west-1. But I’m also going to create a second bucket called demo-stefan-replica-2020, and this time I’m going to choose something in the US, for example us-east-1, to replicate my data from Europe into the US. Maybe it is a disaster recovery strategy of yours. Okay, so I will scroll down, I will enable bucket versioning again, and click on Create bucket.
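The same setup can be scripted with the AWS CLI. This is just a sketch of what the console does for you; the bucket names are the ones from this demo and yours would need to be globally unique:

```shell
# Create the source bucket in eu-west-1
# (outside us-east-1, a LocationConstraint is required)
aws s3api create-bucket \
  --bucket demo-stefan-origin-2020 \
  --region eu-west-1 \
  --create-bucket-configuration LocationConstraint=eu-west-1

# Create the replica bucket in us-east-1 (no LocationConstraint needed there)
aws s3api create-bucket \
  --bucket demo-stefan-replica-2020 \
  --region us-east-1

# Versioning must be enabled on BOTH buckets for replication to work
aws s3api put-bucket-versioning --bucket demo-stefan-origin-2020 \
  --versioning-configuration Status=Enabled
aws s3api put-bucket-versioning --bucket demo-stefan-replica-2020 \
  --versioning-configuration Status=Enabled
```

These commands need an AWS account with permissions to create buckets, so treat them as a reference for the console steps above rather than something to paste blindly.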
Okay, so both my buckets have been created, my origin bucket and my replica bucket, and let’s open them in new tabs. What I want to do is start by uploading one object into my demo-stefan-origin-2020 bucket. So I will upload yet again the coffee.jpg file and upload it. Here we go. And so right now, as we can see, it is in my origin bucket. But obviously we haven’t configured anything, so it is not yet in my replica bucket. Okay, so now I want to set up replication. You need to go into Management, and under Management there are Replication rules. So we can create a replication rule here, and I’ll just call it DemoRule. And the rule is enabled. We can either limit this rule to a specific scope within the bucket, so a subset of objects, or we can specify that this rule applies to all objects in the bucket. So it’s up to you, but right now we want to keep it simple: let’s apply it to all objects in the bucket. Next, we need to choose a destination.
So it could be a bucket in this account or in another account. Let’s choose a bucket in this account. We’ll browse S3 and find the demo-stefan-replica-2020 bucket. So we’ll choose this, and as we can see, the destination region is US East (N. Virginia), us-east-1. And so this is Cross-Region Replication; if the destination region were the same, it would be Same-Region Replication. Finally, you need an IAM role to perform this action, and we can just ask Amazon S3 to create this role for us, which is very nice. In terms of advanced settings, I’m just not going to check anything; we just want to demonstrate the capability right now.
So let’s save this. Oh, and one thing I want to show you first: Delete marker replication is currently disabled, so delete markers are not replicated. But if you enabled it, delete markers would be replicated, and I will demonstrate this in a second. For now, I will leave delete marker replication off.
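The rule the console builds can also be expressed as a replication configuration document and applied with the CLI. This is a hedged sketch: the account ID and role ARN below are placeholders for whatever role Amazon S3 created for you, and the rule mirrors the demo (all objects, delete marker replication disabled):

```shell
# replication.json -- the rule the console builds; the Role ARN is a placeholder
cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
  "Rules": [
    {
      "ID": "DemoRule",
      "Status": "Enabled",
      "Priority": 1,
      "Filter": {},
      "DeleteMarkerReplication": { "Status": "Disabled" },
      "Destination": { "Bucket": "arn:aws:s3:::demo-stefan-replica-2020" }
    }
  ]
}
EOF

# Attach the replication configuration to the source bucket
aws s3api put-bucket-replication \
  --bucket demo-stefan-origin-2020 \
  --replication-configuration file://replication.json
```

An empty `Filter` applies the rule to all objects, matching what we selected in the console; flipping `DeleteMarkerReplication` to `Enabled` is the opt-in discussed above.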
Okay, I will save this, and now my replication rule has been created and correctly configured. So what you would expect is that if we go into the origin bucket, we see one object, coffee.jpg. And if we go into the replica bucket and refresh, well, we still see zero objects. It turns out that enabling replication does not replicate your pre-existing objects; it only replicates objects uploaded after the rule is in place. So how do we fix this? Well, we can upload the coffee.jpg file again. So let’s upload coffee.jpg and click on Upload. Now this has uploaded the coffee.jpg file right here again into my bucket, so if I list versions, I now have two, and you can see we have a new version ID.
Now, if I go into my replica bucket and refresh, within a few seconds I should see this object popping up. And here we go, my object has now appeared. And notice that the version ID of this coffee.jpg is exactly the same as the version ID I have in my origin bucket. So the object is replicated, including the object version ID. And if I upload, for example, a beach.jpg, it is obviously also going to be replicated. So hopefully this makes sense. Now let’s look at deletes. So if I take my coffee.jpg and delete it, I’m going to create a delete marker, and click on Delete. This delete marker that is created is not going to be replicated.
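If you want to check the same thing from the CLI, `head-object` reports a `ReplicationStatus` field on both sides; this sketch assumes the demo bucket names and key from above:

```shell
# On the source: ReplicationStatus should show COMPLETED once the copy is done
aws s3api head-object --bucket demo-stefan-origin-2020 \
  --key coffee.jpg --region eu-west-1

# On the destination: ReplicationStatus shows REPLICA,
# and the VersionId matches the one on the source object
aws s3api head-object --bucket demo-stefan-replica-2020 \
  --key coffee.jpg --region us-east-1
```

Comparing the `VersionId` in the two outputs confirms what we just saw in the console: replication preserves the object’s version ID.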
So let’s prove it to ourselves. Right now we only have the beach.jpg in here, in the origin bucket. And if I refresh here, I have the beach.jpg and the coffee.jpg. So the delete marker did not get passed on to the replica bucket, and we saw why before: this was just a setting in the replication rule. If we wanted, we could also tell Amazon S3 to replicate these delete markers, but this is something you have to opt into; it is off by default. And finally, if I list my versions and say I wanted to delete very specific versions of my files, so let’s say I delete all these versions right here, I will permanently delete them because I’m deleting specific version IDs in my bucket, and these deletes are not going to be replicated.
So again, replication rules do not replicate permanent deletes; there is currently no way to replicate those deletions between two buckets. So this is something good to know and definitely a behavior you should be aware of for your exam. Okay, and that’s it for replication rules. Obviously, you can set up different replication rules on your bucket based on different prefixes and filters, and you can edit a rule if you want to, for example to scroll down and enable delete marker replication as well. But for now, we won’t do it. We saw the basics of replication, and that should be enough for the exam. I hope you liked it, and I will see you in the next lecture.
8. [SAA/DVA] S3 Pre-signed URLs
So now we’re talking about S3 pre-signed URLs. We’ve seen them before, but now we’re going to do a hands-on. You can generate a pre-signed URL using either the SDK or the CLI. The easy case is downloads, where we can just use the CLI; for uploads it’s a bit harder and you must use the SDK. Nonetheless it’s quite easy, and we’ll do downloads in this lecture. Now, when you generate a pre-signed URL, by default it will have an expiration of 3600 seconds, which is one hour, and you can change that timeout using the --expires-in argument, where you specify the time in seconds. And the user you give a pre-signed URL to basically inherits your permissions, so the permissions of the identity that generated the URL, and they can do a GET or a PUT accordingly. So why would you do this? Well, there are many reasons.
Maybe you want to allow only logged-in users to download a premium video from your S3 bucket, so you only want to give a download link for maybe 15 minutes to a premium user that is logged in. Or maybe you have an ever-changing list of users that need to download files, and you don’t want to give them access directly to your bucket, because it could be very dangerous, or it’s not maintainable because there are so many new users all the time.
So maybe you want to generate URLs dynamically and hand out pre-signed URLs over time. And then maybe you want to temporarily allow a user to upload a file to a precise location in a bucket; for example, maybe you want to allow a user to upload a profile picture directly to your S3 bucket. For this we would generate a pre-signed URL as well. So there could be a lot of use cases, but let’s go ahead and see how we can generate a pre-signed URL for a download.
9. [SAA/DVA] S3 Pre-signed URLs – Hands On
Okay, so let’s demonstrate pre-signed URLs. To do so, let’s take a bucket, for example this demo-stefan-origin-2020 bucket, which is not public. We’ll take the beach.jpg, and we remember that if we do Object actions > Open, this generates a pre-signed URL for us, and we can see the beach from this bucket even though it’s not a public bucket. But if we use the public object URL, then we obviously get an Access Denied. Now, the question is, how do we generate this URL ourselves? Because right now it’s done through the console, and it works really great, but let’s have a play with it using the CLI. That’s pretty easy.
We have the presign-url.sh file right here that we can look at, and the command is aws s3 presign. Then you specify the S3 URI of the bucket and object, and then the region you’re in, which is needed for the command to work, so do not forget it. So let’s have a play. So here it is: presign. We need to give the object’s S3 URI, so I will copy this URI right here, and I will set the region.
So I’m in eu-west-1 for that bucket, I believe. And then finally, I will specify --expires-in 300 to say that in 300 seconds this URL will expire. So this has generated a pre-signed URL for me, and I can use this pre-signed URL to have a look at my object. So let’s copy this entirely. So I’ll copy it. Here we go. Copy. And then open a new tab, paste it, and yes, I do have access to the beach.
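Put together, the full command from this demo looks like the following; the bucket name, key, and region are the ones used above, so swap in your own:

```shell
# Generate a pre-signed GET URL valid for 300 seconds (the default is 3600)
URL=$(aws s3 presign s3://demo-stefan-origin-2020/beach.jpg \
  --region eu-west-1 \
  --expires-in 300)

# Anyone holding this URL can fetch the object until it expires,
# even though the bucket itself is not public
curl -fsS "$URL" -o beach.jpg
```

After the 300 seconds elapse, the same URL returns an error, which is exactly the temporary-access behavior pre-signed URLs are for.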
Pretty simple. And something I found is that if you do get some signature errors, then try running aws configure set default.s3.signature_version s3v4, but only if you get issues; if you don’t, the presign command alone is good to go. And that’s it. Fairly simple, but we’ve seen how to generate a pre-signed URL that allows us to view an object from a bucket that is not public. So that’s it. I hope you liked it, and I will see you in the next lecture.
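For reference, that fallback is a one-line CLI configuration change:

```shell
# Force the CLI to sign pre-signed URLs with Signature Version 4
# (only needed if the generated URLs come back with signature errors)
aws configure set default.s3.signature_version s3v4
```

This just writes a setting into your local AWS CLI config file, so it only affects the machine you run it on.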