SAP-C02 Amazon AWS Certified Solutions Architect Professional – New Domain 5 – Continuous Improvement for Existing Solutions Part 9

August 29, 2023

49. AWS X-Ray Practical

Hey everyone and welcome back. In today's video we will be implementing a sample application based on Elastic Beanstalk which will have X-Ray enabled, and we'll look into the entire process of how exactly things work. So I am on my Elastic Beanstalk page. There is a URL for a sample application; I'll be posting it along with the video so that you can follow along. Once you put that URL in the browser, you will automatically be presented with the sample application. For the platform, it is based on Java, and we'll use "Configure more options", because by default this is a high-availability environment. We'll try to stay under the free tier and not spend money unnecessarily, so I'll use "Single instance" over here and click on Save. For the EC2 instance, make sure it is t2.micro, because sometimes it's t2.nano, which is actually chargeable.

So this is something that you need to take care of. Now, within the Software settings, make sure that the AWS X-Ray daemon is enabled. If you remember the architecture, we had the SDK, which is integrated with the application and sends the data to the X-Ray daemon. The X-Ray daemon is something which runs on the host, on Linux and even on Windows. So this is the daemon that we need to make sure is enabled, and we'll click on Save. Everything seems to be perfect.
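Under the hood, the SDK hands segment documents to the daemon as UDP datagrams on port 2000, each prefixed with a small JSON header line. As a rough sketch of that wire format (the segment name and field values here are made up for illustration, and the datagram is only constructed, not actually sent):

```python
import json
import os
import time

def make_segment(name):
    """Build a minimal X-Ray segment document (illustrative fields only)."""
    now = time.time()
    return {
        "name": name,
        "id": os.urandom(8).hex(),  # 16-hex-character segment id
        "trace_id": "1-{:08x}-{}".format(int(now), os.urandom(12).hex()),
        "start_time": now,
        "end_time": now + 0.05,
    }

def make_datagram(segment):
    """Prefix the segment with the daemon's JSON header, newline-separated."""
    header = {"format": "json", "version": 1}
    return (json.dumps(header) + "\n" + json.dumps(segment)).encode("utf-8")

datagram = make_datagram(make_segment("scorekeep-demo"))
# To actually emit it, you would send this datagram to the daemon's
# default endpoint, UDP 127.0.0.1:2000, e.g. via socket.sendto().
```

In the Beanstalk setup, all of this is handled for you by the X-Ray SDK; the sketch is just to show what "SDK sends data to the daemon" means concretely.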

We'll go ahead and click on "Create app". So our application is being created, and it will take a few minutes for the entire process to complete. All right, it took around five minutes for the application to get deployed. Now, one thing that we need to do, and you can see this in the architecture as well: the X-Ray daemon runs on the Linux box and sends data to the X-Ray API. In order for that to happen, you need to attach the policy related to X-Ray to the IAM role of the EC2 instance. So this is one thing which is important. This is the Scorekeep environment. Now, if we go within the IAM role, we'll add certain policies over here.

Now there is an AdministratorAccess policy which is assigned. This is just for testing purposes, so that you don't really have to worry about the details of permission issues. Perfect. Now, if I go to the URL here, it basically says congratulations, and this is a sample X-Ray page. There is an option for generating sample traffic, so this is where you can generate some amount of sample traffic. Once you have done that, you can open the AWS X-Ray console, and this is where you will have all the traces. Within X-Ray you can click on Get Started and just click on Cancel so that you go to the service map directly. Your service map is getting computed; it takes a little time on the initial run. And you'll see I have the service map which is already created for the Elastic Beanstalk environment, and then it is going to Amazon S3.

So this is what it is all about. Now, if you go to the traces, you will be able to see traces related to various requests, and in case of an error you will be able to see those details within the service map itself. If you click over here, you will see the response time distribution, and you can actually search based on the response status which has been received by the application. So this is how to have an X-Ray-enabled application. Definitely, the sample application that we created already had X-Ray integrated: it already had the X-Ray SDK, it also had the X-Ray daemon, and the X-Ray daemon in turn connected with the X-Ray API.

So I hope, at a high level, you understood what it is all about. Now, the reason why I wanted to have a sample application is so that you can get a hands-on feel for how exactly the traces, the raw data, and everything really look under perfectly working X-Ray conditions. So this is it for today's video. I hope this has been informative for you, and I look forward to seeing you in the next video.

50. AWS Rekognition

Hey everyone, welcome back. In today's video we will be discussing the AWS Rekognition service. Now, AWS Rekognition is a deep-learning-based visual analysis service, and it basically allows us to integrate very powerful visual analysis features within our application. Building a visual analysis feature is extremely difficult, so what AWS has done is that they have already built it, and we can make use of the SDK or even the CLI to integrate it with our application. So that's the only thing that we have in our slide; I really wanted to jump directly into the demo to show you how amazing this service is. So I'm in my AWS management console, and if we go to Rekognition, let's click here. This is how the AWS Rekognition GUI really looks.

So let's jump directly into the demo so that you can know what this service is all about. Let's go to object and scene detection. And if you see over here, you have a photo, and this photo has multiple objects: you have a person over here, you have a car over here. Basically, when you upload a picture, in the response the Rekognition service will tell you what objects are present within this specific picture, each with a confidence score.
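The same response the console shows can be obtained programmatically. A hedged sketch of parsing such a label-detection response: the sample response and its values below are made up for illustration, not real service output; a real call would go through the SDK, e.g. boto3's `detect_labels` on a `rekognition` client.

```python
# Real call (requires AWS credentials), roughly:
#   import boto3
#   rek = boto3.client("rekognition")
#   response = rek.detect_labels(
#       Image={"S3Object": {"Bucket": "my-bucket", "Name": "skateboard.jpg"}})

def labels_above(response, min_confidence):
    """Return (name, confidence) pairs at or above the confidence threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]

sample_response = {  # hypothetical values for illustration
    "Labels": [
        {"Name": "Person", "Confidence": 98.4},
        {"Name": "Car", "Confidence": 96.1},
        {"Name": "Skateboard", "Confidence": 82.0},
    ]
}

print(labels_above(sample_response, 90))  # keeps Person and Car only
```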

So you have the car, you have the automobile, vehicle. And if you look into the request, this request is basically for the skateboard.jpg image. Let's look into a few more. Let's look into the facial analysis. This is a picture of a female, and here you see it says it looks like a face with a confidence of 99.9%, appears to be female. It shows you the age range, say 20 to 38 years. She's smiling, appears to be happy, and is not wearing glasses, though the confidence for that last attribute is a little lower. So this is how it really looks. Let's look into a few more interesting ones.
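Programmatically, those facial attributes come back as a nested structure of value/confidence pairs. A minimal sketch of flattening one face entry, assuming made-up values (a real call would be boto3's `detect_faces` with `Attributes=["ALL"]`):

```python
def summarize_face(face):
    """Flatten a few FaceDetails attributes into a readable dict."""
    return {
        "gender": face["Gender"]["Value"],
        "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
        "smiling": face["Smile"]["Value"],
        "eyeglasses": face["Eyeglasses"]["Value"],
        "eyeglasses_confidence": face["Eyeglasses"]["Confidence"],
    }

sample_face = {  # hypothetical values for illustration
    "Gender": {"Value": "Female", "Confidence": 99.9},
    "AgeRange": {"Low": 20, "High": 38},
    "Smile": {"Value": True, "Confidence": 97.2},
    "Eyeglasses": {"Value": False, "Confidence": 88.5},
}

print(summarize_face(sample_face))
```

Note how each attribute carries its own confidence, which is why some attributes (like eyeglasses here) can be reported with lower certainty than the face detection itself.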

One is celebrity recognition. So when you upload a picture, it can tell you that this picture belongs to a specific celebrity, with a specific match confidence. You also have one more; again, it will give you a specific match confidence and it will give you the name of the celebrity. There are a few more. One is face comparison, where you can compare the face in one picture to the face in another picture to see if there is a similarity or not. And you also have text in image: let's say you have an image which contains text, this service can extract the text for you. Great. So let's try it out. These are some of the demo images that AWS had. What we'll do: I have a few sample images from my phone, and we'll use those to see how accurately the service determines the objects in the images. So I'll upload the picture.

So this was from one of my recent travels. This is basically an ATR flight. And you see, it has determined that this is an airport, it has determined that this is an airplane. You have vehicle also, because, you see, it has determined the truck here, it has determined the car, and so on. So this has been accurate for us. Let's try one more. This is a random image of a laptop on a desk. Here you see it has determined with 99.9% confidence that this is furniture. Then you have a table, then you have a computer. Now, it might not really recognize whether it's a Mac or some other brand that you upload, but it will definitely tell you that this is a computer. So let's try one last image here.

So this is a unique image among the three. This used to be one of my rabbits; we had sent him to a relocation center where he had a lot of space to run around, so he was very happy there. Now you see, you have a plant here, with a confidence of 92.6%. It is also saying that there's an animal. You have food also, and if you do a "show more", you see it is also able to recognize a rabbit with a specific confidence.

So this is a great service, and it really has been able to determine with great confidence the objects and even the animals which are present within the photo. One important part: if you're integrating it with an application, your application can send a request with the photo, and within the response it will actually tell you each and every thing that you see in the GUI, along with the corresponding confidence parameters. So that's about it for the Rekognition service. I hope this has been informative for you, and I look forward to seeing you in the next video.

51. AWS SAM

Hey everyone and welcome. In today's video we'll be discussing AWS SAM. Now, SAM is also referred to as the Serverless Application Model, and it is like an extension to CloudFormation which is generally used for building serverless applications in AWS. Basically, we can make use of SAM to define a serverless application in simple and clean syntax. Here is a nice little diagram that I took from the documentation, all credits to them. So this is SAM. Basically, you can define SAM templates in JSON as well as YAML format. You then use SAM to build templates that define your serverless application; it might be API Gateway, it can also be a Lambda function, and others. And you can deploy the SAM template with the help of AWS CloudFormation.

So, very simple: you write your templates in JSON or YAML based on SAM, and you just deploy them with the help of AWS CloudFormation. Now, there are two important steps that you need to remember in the SAM process. The first one is that you create a SAM template; it can be either JSON or YAML, and it defines the Lambda function, API Gateway, or whatever else you need for your serverless application. Then, once you create your SAM template, you test, upload, and deploy your application with the help of the SAM CLI. One important part to remember is that during the deployment, SAM automatically translates the application specification into CloudFormation syntax. So these are the basics of what SAM is.

So let's go to the practical aspect and look into how exactly it works. I have created a directory where I have put all the commands that you will need to practice this lab, so it becomes easier for us to follow. Now, whenever you use SAM, remember that SAM needs to be installed; there is a separate package that you need to install for the CLI. What I'll do is take an Amazon Linux based machine, and we'll be running all the commands there. So let me copy the first command; I have a server connected, and I'll go ahead and execute the first command over here. Basically, this is installing all the relevant packages which are required for SAM to work correctly. Perfect.

So now all the packages seem to be installed perfectly. Next, I'll jump to the third command, which is the pip install; basically, this installs the AWS SAM CLI package. I'll clear the screen and install this package. Perfect. My package has been installed successfully. In case you get an error related to pip at some point, you can run the second command, which is python27-devel (depending upon the Python version that you have). This is just optional, in case it does not work, so I'll tag it as optional here so it does not confuse anyone. Perfect.

So now that we have the basic things installed, we'll look into the two files. One is index.js; this is basically a Node.js based hello-world type of function, something that you would generally use in your Lambda. And the next important part that we were referring to in SAM is the template.yml. This is a SAM template. Now, if you look, one additional parameter that you would generally not find elsewhere is the Transform; this is the transform of the serverless type. And within the CloudFormation resource type you have AWS::Serverless::Function, and you have the Handler as index.handler.

This generally refers to the handler that you define within your code, and the runtime is Node.js, as we have already specified. We have also specified the memory size, which is 128 MB, as well as the timeout. These are the two important parts that you will need to build your SAM code. So let's do one thing: I'll copy the template.yml, and within my server I'll create a directory called sam. Let me go to the sam directory; I'll create template.yml and copy the code here. Along with that, I'll copy one more file, which is index.js, my function. Let me copy it here and save it. So now we have two files. One is index.js.

The second is template.yml. Now, if you look into the PowerPoint presentation, the first step was creating a SAM template, which was basically JSON or YAML that defines the Lambda function, the API, or others. We already have a YAML template, and we have the Lambda function which it refers to. The next part is that we have to test, upload, and deploy the application using the SAM CLI. One part to remember is that whenever you deploy, the deployment package is stored in an S3 bucket, and from there it is referenced via CloudFormation. So let's do one thing: I have a commands.txt, and these are the three commands that we'll be executing.
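Before running the commands, here is roughly what a template.yml along the lines described above might look like. This is a sketch, not the video's exact file: the function's logical name is invented, and the runtime identifier depends on which Node.js versions Lambda currently supports.

```yaml
# Minimal SAM template sketch, assuming a handler exported from index.js
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloWorldFunction:          # logical name is an assumption
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs12.x      # pick a runtime Lambda currently supports
      MemorySize: 128
      Timeout: 3
```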

Now, the first command that we need to execute is the creation of an S3 bucket. This is very important, because once you do the deployment, it will be in the form of a package which gets stored in S3. So let's create a simple S3 bucket first. We can quickly do aws s3, then mb, which stands for make bucket, and I'll use kplabs-sam. So this is the S3 bucket that we want to create, and it has created the kplabs-sam bucket. The deployment package will be stored within this specific S3 bucket. The next thing that we need to do is package the application.

Basically, you do a sam package, you reference the template file, which is template.yml, you reference the output template file, which is serverless-output.yml, and you reference the S3 bucket where your package will be stored. So I'll quickly copy this and paste it. But before that, I'll just quickly show you with an ls that there are only two files available right now. Once you execute this command, you will see that it uploads a package to the S3 bucket and that it has also created a serverless-output.yml. Let's quickly look into the S3 bucket. This is my S3 bucket, kplabs-sam, and you will see that I have a package which is already uploaded; this package was uploaded through SAM. Now, the next thing that you need to do is deploy this specific package. So the first step was that you create a package where you reference template.yml, and the output template file is serverless-output.yml. If we quickly open this serverless-output.yml up, you will see it is a CloudFormation template.

Here, one thing that has really changed is the CodeUri. You have the CodeUri with the specific S3 bucket, and it also contains the name of the package which is supposed to be deployed. So basically, the conversion to a CloudFormation template is done within this specific file. Once we have this file, the next thing that we need to do is deploy the actual CloudFormation stack. The output is asking us to execute the following command to deploy the packaged template file; however, we'll be doing it the recommended way, which is with sam deploy. So you have sam deploy, you have the template file, which is serverless-output.yml, you have the stack name, and you have the capabilities, which is CAPABILITY_IAM.

So let me quickly run this specific command and press Enter. CAPABILITY_IAM is quite important to remember, because otherwise your function will fail to deploy. What it is doing is creating a CloudFormation stack, so if we go to CloudFormation, you'll see a stack called kplabs-sam-demo being created. Let's just wait for a minute. Perfect. It says it successfully created/updated the stack kplabs-sam-demo. Now, if we click refresh, it says CREATE_COMPLETE. Ideally, what should happen now is that Lambda should have the function that we defined, and you can see there is a kplabs-sam-demo hello-world function. This is the function which got created through the CloudFormation stack.

So do remember, at least for the exams, that for SAM-specific packaging there are two important steps that you'll need to do. First, you have to package the actual file; this package goes to S3. And once you have the S3 package available, you do a sam deploy based on the serverless-output.yml file which comes from the first command. This serverless-output.yml basically contains a CloudFormation-compatible YAML file which contains the link, the CodeUri, to the S3 package which needs to be deployed.
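The whole flow from the walkthrough, recapped as commands (the bucket and stack names are the demo's; substitute your own, and note these require the SAM CLI installed and AWS credentials configured):

```shell
# 1. Create the S3 bucket that will hold the deployment package
aws s3 mb s3://kplabs-sam

# 2. Package: upload the code to S3 and emit a CloudFormation-compatible template
sam package \
    --template-file template.yml \
    --output-template-file serverless-output.yml \
    --s3-bucket kplabs-sam

# 3. Deploy the generated template as a CloudFormation stack
sam deploy \
    --template-file serverless-output.yml \
    --stack-name kplabs-sam-demo \
    --capabilities CAPABILITY_IAM
```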
