SAP-C02 Amazon AWS Certified Solutions Architect Professional – New Domain 5 – Continuous Improvement for Existing Solutions Part 14

September 1, 2023

60. Patch Compliance with SSM

Hey everyone, and welcome back. So, continuing our journey with Systems Manager, today we'll be focusing specifically on patch compliance. Recently I was working with a startup, and one of their challenges was that they were running an open-source application on a public-facing endpoint, and they got hacked. On further investigation, we found that the reason they were compromised was that the application running on the public endpoint had a number of critical vulnerabilities which the attackers were able to exploit. They would have been safe had they updated the application with the latest security fixes.

And the same goes for servers as well. Servers run a lot of packages, and any package might have a security vulnerability that gets exposed today or tomorrow, so it is necessary to ensure that our servers are up to date. For this, there is a feature called Patch Compliance. Basically, Patch Compliance allows us to know the patch status associated with a specific instance. In an earlier lecture we installed the SSM agent on one of the EC2 instances, and now, within Patch Compliance, you can see that it shows one instance is missing updates. How did this appear? In order for all of this to appear, you need to run a document called AWS-RunPatchBaseline.

So let's try it out. Let's see if we can find it from here. It seems the search doesn't work, so let's quickly find it manually. Yeah, so this is the one: AWS-RunPatchBaseline. This is the document name, and you have to select the instances; by default, since we only have one instance, this is the instance which is selected. Once you've selected it, you can just click on Run and it will be executed. It might take a little time, and once AWS-RunPatchBaseline has executed, you can go to Patch Compliance and it will show you the missing updates. You can definitely update the packages from Systems Manager as well.
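The same scan can also be kicked off from the AWS CLI. This is a minimal sketch; the instance ID is a placeholder, and it assumes the instance is already registered with SSM:

```shell
# Trigger a patch scan (no installs) via the AWS-RunPatchBaseline document.
# The instance ID below is a placeholder -- substitute your own.
aws ssm send-command \
  --document-name "AWS-RunPatchBaseline" \
  --parameters 'Operation=Scan' \
  --targets "Key=InstanceIds,Values=i-0123456789abcdef0"

# Once the scan finishes, list the compliance items to see missing patches.
aws ssm list-compliance-items \
  --resource-ids "i-0123456789abcdef0" \
  --resource-types "ManagedInstance"
```

Running the document with `Operation=Install` instead of `Operation=Scan` would apply the missing patches rather than just reporting them.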

Maybe it can be done via Run Command itself. So in the exam you might get a question like: how can you know which instance is missing which patch? And the answer should be Patch Compliance. There can be multiple options, but just remember that Patch Compliance in Systems Manager allows you to find which instances are missing updates, and in case you want to update the system, you can again do it with Systems Manager, with Run Command. So, a small lecture. I hope this has been informative for you, and I look forward to seeing you in the next lecture.

61. EC2 Systems Manager – Parameter Store

Hey everyone and welcome back. In today's video we will be discussing the Parameter Store. Parameter Store is a great feature of Systems Manager which AWS has released, and it basically allows us to offload secrets from the development code. So let's go ahead and understand more about the Parameter Store. Going by the definition, Systems Manager Parameter Store provides a centralized store to manage configuration data, whether plain text or secrets such as tokens and passwords. Let's understand this with a simple diagram where, on the left-hand side, you have application code, and within the application code you have created a function. Within the function you have defined two values.

One is a username and the other is a password. This is something most of you will be familiar with. The problem with this approach is that, most probably, a developer will commit this code to the Git repository and it will get leaked. The second problem is that once the developer commits the code, all the team members will also be able to see the username and password, and in case someone leaves the organization, he will take this username and password with him as a parting gift. So this is not a recommended practice. The way you can improve on this scenario is to use the Parameter Store. On the right-hand side, again, you have application code, but within this function, instead of hard coding the password, you have a variable where you get the password from the Parameter Store. So you have an SSM get-parameter call.

So what you ask your developers to do is fetch the password from the Parameter Store. When the application gets built within the EC2 instance, during build time, this value will be populated. Not only is this good practice, but the developer will also not be aware of the specific secret. So this is a great thing to do in an organization. Let's go ahead and do a practical, which will give us much clearer visibility. I'm in my EC2 console, and at the bottom there is an option for Parameter Store. I'll click on it and click on Get Started.
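As a rough sketch of that build-time fetch (the parameter name is assumed for illustration), a deploy script could export the secret into the environment instead of hard-coding it:

```shell
# At deploy time, pull the secret from Parameter Store into an
# environment variable instead of committing it to the repository.
# "RDS_password" is an assumed parameter name for illustration.
RDS_PASSWORD=$(aws ssm get-parameter \
  --name "RDS_password" \
  --query 'Parameter.Value' \
  --output text)
export RDS_PASSWORD
```

The application then reads the environment variable at startup, so the secret never appears in the codebase.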

Now, this takes us to the option of creating a parameter. Let me create a parameter; I'll name it RDS password. In the description I'll say this is the RDS password, and for the value let me give a random value. All right, so this is the random value I have given, and I'll click on Create Parameter. So our first parameter is created: the parameter name is RDS password, and the value associated with it is the one we gave.
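The same parameter can be created from the CLI as well; the name and value here are stand-ins matching the walkthrough:

```shell
# Create a plain-text String parameter (name and value are placeholders).
aws ssm put-parameter \
  --name "RDS_password" \
  --description "This is the RDS password" \
  --type "String" \
  --value "some-random-value"
```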

Now, what you can do is ask a developer to fetch the value associated with this specific parameter when the application is getting deployed within the EC2 instance. As far as the CLI is concerned, let's look into how we can fetch the value from the AWS CLI. To do that, I'll search for the AWS SSM CLI in Google, and it takes us to the Systems Manager CLI documentation. We are more interested in the Parameter Store operations, so I'll look for get-parameter and click here.

So this is the CLI command that we are interested in: aws ssm get-parameter. Within this there is a mandatory value, the name, that you need to give. So let's go ahead and work on the CLI aspect. I'm in my CLI, and let's do aws ssm get-parameter. The mandatory value was the parameter name, and our parameter name was RDS password. If I press Enter, you will see it gives us the value associated with this parameter. And the same thing you can ask a developer to do: as we have already discussed, you give them the parameter name, and they will use an SDK or the CLI to fetch the value which the application will use.
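The command and its response look roughly like this (the parameter name is the one assumed above; fields such as the ARN and version vary per account):

```shell
aws ssm get-parameter --name "RDS_password"
# Returns a JSON document along these lines:
# {
#   "Parameter": {
#     "Name": "RDS_password",
#     "Type": "String",
#     "Value": "some-random-value",
#     ...
#   }
# }
```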

So basically the application will not really hold any secrets; it will fetch them from the Parameter Store. This is the basics of the Parameter Store. There is one more part that I wanted to show you. Let's go to create a parameter; this time I'll name it RDS secure. Within the type, you'll see there are three types available: String, StringList, and SecureString. SecureString basically stores your passwords or secrets in a secure manner. Whenever I click on it, it shows us the KMS key ID. So basically, whatever value you put will be encrypted with KMS, and by default it will take the default AWS SSM KMS key. For the value, let me give some random value.
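From the CLI, creating a SecureString looks much the same; this sketch passes the default AWS-managed key explicitly, though `--key-id` can be omitted:

```shell
# Create a SecureString parameter; --key-id is optional and defaults to
# the account's AWS-managed key, alias/aws/ssm. Name/value are placeholders.
aws ssm put-parameter \
  --name "RDS_secure" \
  --type "SecureString" \
  --key-id "alias/aws/ssm" \
  --value "another-random-value"
```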

Now, the difference you will see here is that whatever value you are typing is not shown as plain text; it is masked and stored encrypted. So let me go ahead and create the parameter. Now, within the RDS password parameter, which is of type String, the value is shown to us directly. However, if I select RDS secure, the value is not shown as plain text by default. But if you have the proper permission, you do have an option to get the plain-text value: if I click on Show, you see it has given us the plain-text value.

So let's look at the difference it makes when it comes to the AWS CLI, or even the AWS SDK. Within the AWS CLI, I'll run the same command, aws ssm get-parameter; the name this time is RDS secure, and the output it gives is the value. If you look, this is the value encrypted with KMS; this is not the plain-text value. When you get a parameter of type String, it directly returns the plain-text value; however, for a SecureString, it does not return the plain text.

This is the encrypted value. In order to get the decrypted value, there is an option within the AWS CLI: --with-decryption. So --with-decryption basically tells Parameter Store to give us the decrypted value. You also have --no-with-decryption, which gives us the encrypted value; this is the default choice. So let's do one thing: I'll use --with-decryption, and this time it gives us the plain-text value. So this is what the Parameter Store is all about.
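Side by side, the two calls look like this (parameter name assumed as above; decryption requires kms:Decrypt permission on the key):

```shell
# Without --with-decryption, the Value field holds KMS ciphertext.
aws ssm get-parameter --name "RDS_secure"

# With --with-decryption, Parameter Store decrypts via KMS and
# returns the plain-text value.
aws ssm get-parameter --name "RDS_secure" --with-decryption
```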

62. S3 Storage Classes

Hi everyone and welcome back to the Knowledge Portal video series. Today's topic is S3 storage classes, and this is again one of the very important topics for the exam, so I hope you look into this well. Generally, there are various storage classes which Amazon provides. One is the general purpose class, also called S3 Standard. The second is Infrequent Access, called Standard-IA or Standard Infrequent Access. The third is Reduced Redundancy Storage, or RRS. And the fourth is the archive class, which is Glacier. Depending on which one of them you choose, the amount of money you pay for storage, and the availability and durability of your data, differ.

So generally, in S3, whenever you put an object you have to select one of these storage classes, or by default Amazon will use general purpose, S3 Standard, as the default option. So let me just show you where exactly you see these things.

So this is Amazon S3. If I go to the kplabs-finance bucket, this is my object over here, and you'll see the object is associated with the Standard storage class. Now, if I upload a sample file, let me just go through a sample, and here, if you look at the details, you see it gives you an option for which storage class you want to upload the data into. By default, Standard storage is selected, but you can always select the other two as well. One thing to remember: you cannot directly upload to Glacier, so you won't have the Glacier option here. Okay, so coming back to the presentation, let's explore all of these storage classes that Amazon provides. Before we do that, let's understand two very important concepts: durability and availability.

Durability is the probability, over one year, that a file stored in S3 will not be lost. Let me give an example: if someone is designing a storage system for me as a client, I need to make sure that my data will always be there and will not get lost, so as a client I have to ask how durable the system is, because I cannot afford for the data to get lost. That is what durability measures: the chances, as a percentage over a period of time, that your data will be lost. Availability, on the other hand, measures the chances that your data will not be accessible when you want it. Generally, for servers, if you're hosting a website, availability is the key metric: your website should be available 99.99% of the time, and that is what most hosting providers guarantee.

But what happens if a component, a server, or a data center itself fails? If a data center fails, availability is lost, and if the data is destroyed with it, durability is lost as well. So durability and availability are key aspects of S3. Going back to S3 here: availability means that whenever I want to open this file, it has to be accessible to me, and durability means that the file itself is not lost. If the file is lost, then I have lost durability as well.

So I hope you got the concept of availability and durability. As I am able to access this file right now, it is both available and durable. Okay, two more important concepts covered; now let's go and explore the S3 storage classes. The first is Amazon S3 Standard. This is the default storage class for all S3 objects, and it provides a durability of 99.999999999% (eleven nines) of objects and an availability of 99.99%. For example, if you have 10,000 files stored in S3 with eleven nines of durability, you can expect to lose one file every 10 million years.
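That back-of-the-envelope number can be checked directly: eleven nines of durability means an annual loss probability of 10⁻¹¹ per object, so for 10,000 objects the expected time until one loss is 1 / (10,000 × 10⁻¹¹) years:

```shell
# Expected years until one object is lost, for 10,000 objects at
# an annual per-object loss probability of 1e-11 (eleven nines).
awk 'BEGIN { printf "%.0f\n", 1 / (10000 * 0.00000000001) }'
# prints 10000000, i.e. ten million years
```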

So this is the durability which Amazon S3 Standard provides. Now, talking about the second class, Amazon S3 Standard Infrequent Access: this is basically used for data which is accessed less frequently, but which we want rapid access to whenever it is accessed. Here, the durability is the same 99.999999999%, but the availability is lower: 99.9%, compared to the 99.99% of S3 Standard. And generally, S3 Standard is the most expensive storage among all the classes available. Talking about the third type, RRS, or Reduced Redundancy Storage.

This is for non-critical, reproducible data. Amazon says that only if your data is reproducible should you store it in RRS. Here the durability is much less than that of Standard: it is 99.99% durable, with 99.99% availability, so way lower durability than S3 Standard. Again, this is the comparison chart of durability, Standard versus Standard-IA versus RRS, and likewise for availability. One important thing: do understand the durability and availability aspects of all three classes in Amazon S3. Now, the last important thing that we have to cover is Glacier.

Glacier is meant for archiving data. Archiving data means data which we need, but which we won't really open for a long time. Let's say you have production server logs that are a year old; generally the chances of you opening those logs are very, very minimal, so you can send them directly to Glacier. One important thing: if you're storing data in Glacier and you want to open or download those files, it might take several hours for a file to be restored for you to download. The durability is the same for Glacier as for S3 Standard; however, Glacier is way, way cheaper than Amazon S3. Another important thing to remember is that at any time you can change the class of your object, so if the object is currently in S3 Standard, you can easily move it to either RRS or Infrequent Access, and vice versa. Let's see how we can do that. Currently, finance.txt is in the Standard storage class; if I go to Properties and then to Details, I can actually change the storage class to RRS, which moves it to Reduced Redundancy Storage. Now, again, as we discussed in the PowerPoint presentation, Glacier is much, much cheaper than S3.
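From the CLI, the same class change amounts to copying the object over itself with a new storage class; the bucket and key names below are assumptions based on the walkthrough:

```shell
# Change an existing object's storage class by copying it over itself.
# Bucket and key names are assumed for illustration.
aws s3 cp s3://kplabs-finance/finance.txt s3://kplabs-finance/finance.txt \
  --storage-class REDUCED_REDUNDANCY
```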

Now, the reason it is cheaper is that retrieval of an object actually takes several hours. So let's look into the S3 pricing calculator; I should actually go to Google. Okay, so this is the link. This is an amazing application which gives you a proper cost estimate for the resources you use. Let's go to S3 and assume you have 1 TB of data, which is 1,024 GB. And if you look, the monthly estimate is around $23 for just 1,024 GB. So let me remove this and make it zero.

Now, let me go to Glacier. Am I able to find Glacier here, or am I missing it? I guess here it is. Okay. So we learned that for storing 1 TB of data in S3, Amazon charges around $23 per month; let's explore how much Glacier costs. And if you look, it is just $4 a month. So there is a big difference in pricing between storing your data in S3 and storing it in Glacier. One thing you can do is automatically move data from S3 to Glacier through Amazon Lifecycle Policies. Let's say, for example, you are storing a log file: after the log file has sat in S3 for six months, you can set up a Lifecycle Policy where Amazon will automatically move it to Glacier. That kind of automation is available from the Amazon S3 console.
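A lifecycle rule along these lines would do that six-month transition automatically; the bucket name and prefix are assumptions for illustration:

```shell
# Transition objects under logs/ to Glacier after ~6 months (180 days).
# Bucket name and prefix are placeholders.
aws s3api put-bucket-lifecycle-configuration \
  --bucket kplabs-finance \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-old-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [{ "Days": 180, "StorageClass": "GLACIER" }]
    }]
  }'
```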

