MCPA MuleSoft Certified Platform Architect Level 1 – Non-Functional Requirements of APIs Part 2


11. Client ID-based API policies

Hi. In this lecture, let us discuss the API clients and the significance of the client credentials in some more detail. These client credentials actually play a role in some of the API policies. When policies are enforced on the API, some of them need the client credentials, and based on those credentials they decide how the policy behaves. So before we discuss which policies depend on the client credentials, let us first see how these client credentials work in the first place.

Okay, so like I mentioned in the previous lecture, the API consumers who get access to the APIs need to pass these client credentials as part of the API request. They can do this in several ways, and I will quickly show you with a small demonstration; you will get a better understanding by watching it. So let's take it step by step. First, we have our API running in Runtime Manager. This is the actual API implementation, our EXP sales order API.

So let us now first completely ignore API Manager and see how this Runtime Manager application can be invoked directly. This is where my application is running, and CloudHub has provided me the app URL. So I'll copy this app URL and configure it in my Postman. In Postman, I'll create a new collection, name it sales order, and create a new request. I know my create API is of type POST.

I'll put in the URL I have copied, and I know the resource name is create for creating the sales order. Once I have this, let's just try to hit it without anything; it will error out, I'm sure, but let's see. Yes, it's saying unsupported media type. Why? Because we have declared that this API accepts the application/json format; that is the supported media type. So I will change the body type to JSON, but pass an empty JSON. What happens now?

Now it's complaining that it's a bad request because the request does not match the schema in the RAML definition. You can see the actual error if you go back to Runtime Manager and check the logs of that particular API: it is complaining that the required header transaction ID is not specified. Remember, we enforced traits on the API to pass in a transaction ID, client ID and client secret. So let us now go and do that. I'll go to the headers and add a transaction ID, the same kind of value I typed in the API console, and I'll also give some dummy client ID value. It is still complaining about a bad request, probably this time because of the actual body format. Let's go and look into the RAML.

So if you look at the logs now, it is complaining about missing required fields like customer ID and so on, because remember, our request format includes a customer ID. Even if I give the customer ID next, it will complain about the shipment location, line items and all the other things it says are missing. So let me quickly put in all those details. All right, I have given a dummy request which satisfies the conditions and required elements, and I'll try to hit it again this time.
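For illustration, here is a rough Python sketch of that direct call made with the requests library instead of Postman. The app URL, header names and body fields are placeholders inferred from the lecture (client-id, client-secret and a transaction ID header enforced by the RAML traits, plus a few required body fields); adjust them to match your own API.

```python
import requests

# Hypothetical CloudHub app URL copied from Runtime Manager (placeholder).
APP_URL = "http://exp-salesorder-api.us-e2.cloudhub.io"

# Headers required by the RAML traits; at this stage the implementation only
# checks that they are PRESENT, so dummy client credentials still pass.
headers = {
    "transaction-id": "12345",        # assumed header name from the trait
    "client-id": "dummy-id",          # not validated by the implementation
    "client-secret": "dummy-secret",  # not validated by the implementation
}

# Dummy body with the fields the RAML marks as required (illustrative only).
body = {
    "customerId": "CUST-001",
    "shipmentLocation": "Hyderabad",
    "lineItems": [{"itemId": "ITEM-1", "quantity": 2}],
}

# json= sets Content-Type: application/json, satisfying the media type check.
response = requests.post(f"{APP_URL}/api/create", json=body, headers=headers)
print(response.status_code, response.text)
```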

Okay? See, now I got the proper response back. If you remember, this was a hard-coded response in my API implementation, so this is working well. The thing you have to notice here is that the request matches the required format and RAML definition, but the client ID and secret I have given are just dummy values. Still, the API is working. Why? Because we are hitting the API implementation directly, and in the API implementation and the RAML specification we have only enforced that the client ID and secret should be present. There is no validation logic; we are not validating them against any client provider (identity provider). So these are just dummy values.

As long as they are present, the trait is satisfied, the request goes ahead and gets processed, and we get the response back. So now we'll take it to the next step, where we will enforce the client credentials on this particular API instance in API Manager. Let us go to API Manager now. I'll open my API instance, go to policies, add a new policy, select Client ID enforcement, and configure the policy. Here there are several ways an API client can present their client credentials. Once their access request is approved and they get access to the API, we know they will be presented with the client credentials: the client ID and the client secret.

So there are several ways the API client can send these to API Manager in the request. One is using the HTTP basic authentication header. The way basic authentication works is that the credentials come in a username and password manner: the client ID acts as the username and the client secret acts as the password, just like how we type a username and password for basic authentication. If you want to see it in Postman: while making the request, instead of passing the details in the headers, the client can pass them as basic authentication, where the client ID goes into the username field and the client secret goes into the password field.

This is one way, and the second way is as a custom expression. This does not mean we can give just any expression; the expression describes other places from which the client ID and client secret can be retrieved. Apart from basic authentication, the two other possibilities are that the client ID and client secret can come in the query parameters of the API, or they can come in the HTTP headers. You already saw the HTTP header mechanism, so I'll change it back now.

The HTTP header way is just how we are sending them now, and it is also how we have enforced them with our traits, so it is in line with our traits and will be our final approach. But just for knowledge's sake, apart from the header way, you can also use an expression saying that instead of extracting the ID and secret from the headers, we would like to extract them from the query parameters. The same values can then be passed as query parameters so that they come as part of the URL. So it's up to your preference.

Generally these days nobody passes them through the query parameters; people prefer passing them in the headers or via basic authentication, but the option is still there. So let us go back and enable our HTTP header mechanism. Because we want the HTTP header mechanism, I am going to modify this expression to read from the headers, which means I just have to remove the query params part and the rest will fall in line. Let me do that now. The only thing is I have to change this to use a hyphen, because I have named my attribute client-id in the RAML, and I'll also change the other one to client-secret.
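As a quick client-side illustration, here is a hedged Python sketch of the three ways an API client could present the client credentials that this policy can be configured to read. The proxy URL, credential values and parameter names are placeholders; only the client-id / client-secret header names follow the RAML attributes named in the lecture.

```python
import requests

API_URL = "https://sales-order-proxy.example.com/api/create"  # placeholder proxy URL
CLIENT_ID, CLIENT_SECRET = "abc123", "s3cr3t"                  # issued via API Manager
body = {"customerId": "CUST-001"}                              # trimmed for brevity

# 1. HTTP headers (matches the client-id / client-secret traits in the RAML).
requests.post(API_URL, json=body,
              headers={"client-id": CLIENT_ID, "client-secret": CLIENT_SECRET})

# 2. HTTP basic authentication: client ID as username, client secret as password.
requests.post(API_URL, json=body, auth=(CLIENT_ID, CLIENT_SECRET))

# 3. Query parameters (still supported, but rarely preferred nowadays).
requests.post(API_URL, json=body,
              params={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET})
```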

Okay, so what happens at runtime when the API client sends the request with these attributes in the HTTP header? Because we filled in the policy parameters with these expressions, the policy will extract the client ID and the client secret from the HTTP headers using them. And here we are applying it to the entire API for now. So let me apply this particular policy. Once I apply the policy, it will be added to the API proxy which sits in front of the API implementation.

All right? Generally it is very quick, but sometimes it takes up to 30 seconds or so; let's go ahead and give it a try anyway. I'll go to Postman again and try with the same combination one more time. See, now it did not work like last time; now the API is complaining that it is an invalid client. Last time it accepted whatever we passed in as the client ID and secret, but now it's complaining. Why? Because we enforced the policy. This time, API Manager takes the responsibility and tries to validate whether those client credentials, the ID and the secret, are valid or not, and they are obviously not valid because no such credentials exist in the system.

So let us now go back to one of the consumers who requested access, which is POS. Let's go there and take the credentials; let's take the POS credentials, for example. I'll copy the client ID and put it in, then go back, copy the client secret, and replace it in Postman with this particular secret. I'll save my request, hit it again, and see if it works. See, it's working now.

So, because the right credentials were presented, they were validated, and because they are valid, the policy succeeded on my API proxy, the proxy allowed the request through to the API implementation, and the response was served back. So this is where the client credentials play a role in the request. Now, let us move on to the last part, where I was telling you that I will explain the policies for which the client credentials play a key role. Among those, the first one is client ID enforcement, which you have just seen; client ID enforcement obviously needs the client credentials in order to validate them. So you already know that is one of the policies that needs the client credentials. The second one is the OAuth policy.

Okay? So even OAuth works in a similar way, but instead of the client ID and client secret being passed from Postman or the API client side in a header, query parameter or basic authorization, generally a token is sent. To get that token in the first place, an OAuth token from the identity provider (the client provider), the client should present those client credentials to the provider, and the provider, after validating those client credentials, returns an OAuth token. That token then needs to be passed in the API client request, just like how the client credentials were passed in the previous demonstration.
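For illustration, here is a rough Python sketch of that client-side flow using a standard OAuth 2.0 client credentials grant. The identity provider token endpoint, the proxy URL and the credential values are all placeholders; your identity provider's actual endpoint and parameters may differ.

```python
import requests

TOKEN_URL = "https://identity-provider.example.com/oauth/token"  # placeholder IdP endpoint
API_URL = "https://sales-order-proxy.example.com/api/create"     # placeholder proxy URL

# Step 1: exchange the client credentials for an OAuth access token.
token_response = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "abc123",
        "client_secret": "s3cr3t",
    },
)
access_token = token_response.json()["access_token"]

# Step 2: call the API with the token instead of the raw credentials.
api_response = requests.post(
    API_URL,
    json={"customerId": "CUST-001"},
    headers={"Authorization": f"Bearer {access_token}"},
)
print(api_response.status_code)
```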

Okay? Now, when the OAuth token comes in, the API Manager layer takes the token and presents it back to the client provider to ask whether this token is valid. At that point the client provider decrypts or parses the OAuth token, validates it, and recognizes whether it is the outcome of valid client credentials and whether it has expired. If it is not expired and it is valid, then the request goes forward to the API implementation, the logic executes, and the response comes back. So this is one such policy as well. Some other policies are the SLA-based ones, because the client credentials are tied to the SLA tiers.

Remember, while requesting access, the consumers have to choose an SLA tier, which means the client credentials are linked to that tier, and depending on that tier the SLA-based policies behave differently. You know the difference, right? Like I explained, if it is a non-SLA-based policy such as plain rate limiting, the policy applies in common to all the API consumers, irrespective of who they are. But if it is SLA-based, then the policy behavior changes per tier.

The system will check: for this consumer's SLA tier, what is the quota or restriction, and only if that is breached for that particular tier will the policy act accordingly; otherwise it rejects the request. So the SLA-based policies and the client ID enforcement policy are the ones that are truly client ID based: they compulsorily require the client credentials in order to work.
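To illustrate the difference, here is a small conceptual Python sketch (not MuleSoft code) contrasting a plain rate limit with an SLA-tier-based one; the client IDs, tier names and quotas are made up.

```python
from collections import defaultdict

# Hypothetical SLA tiers chosen when access was requested, and their quotas.
TIER_OF_CLIENT = {"pos-client-id": "Gold", "web-client-id": "Silver"}
QUOTA_OF_TIER = {"Gold": 1000, "Silver": 100}   # requests per time window

GLOBAL_QUOTA = 500                  # non-SLA rate limiting: one shared limit
global_count = 0
per_client_count = defaultdict(int)

def allow_non_sla() -> bool:
    """Plain rate limiting: the same quota applies to every API consumer,
    so no client credentials are needed to evaluate it."""
    global global_count
    global_count += 1
    return global_count <= GLOBAL_QUOTA

def allow_sla_based(client_id: str) -> bool:
    """SLA-based rate limiting: the quota depends on the consumer's tier,
    so the policy must know a valid client_id on every request."""
    tier = TIER_OF_CLIENT.get(client_id)
    if tier is None:
        return False                # unknown client: rejected outright
    per_client_count[client_id] += 1
    return per_client_count[client_id] <= QUOTA_OF_TIER[tier]
```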

But policies like IP blacklisting, IP whitelisting, and threat protection do not have to depend on the client ID and secret, because they depend on other parameters: threat protection works on the incoming request content, such as JSON or XML, and IP blacklisting depends on the source IP address. So they work differently. Okay, so this is about the client ID credentials in detail. Happy learning.

12. HTTP Caching API policy

Hi. In this lecture, let us discuss another important API policy, the HTTP Caching policy. This HTTP Caching policy is a Quality of Service based API policy. You may have a doubt here: we saw all the categories of API policies in Anypoint API Manager in some of the previous lectures, and under Quality of Service we saw several other policies, but not HTTP Caching, correct? The reason is that we missed this particular policy at that time because the API instance we created was Mule 3. Remember, there was a checkbox at the time of API instance creation asking whether it is a Mule 4 based API or a Mule 3 based API, and we did not opt for it, which means we declared it as a Mule 3 (or older) application.

So this HTTP Caching policy is applicable only for APIs that are Mule 4 and above. That is the reason this particular API policy was not listed in the Quality of Service (QoS) category in API Manager in our previous lecture. This is the only one we missed at that time; do not worry, the other policies are common to both the Mule 4 and Mule 3 runtimes. So what is this HTTP Caching API policy about? It is a policy that performs server-side caching. When I say server-side caching, it is not on the API implementation side, meaning it does not go and alter any of the API implementation code to achieve the caching.

Okay, whatever the policy enforcement site is, be it a basic endpoint or an endpoint with a proxy, in any of these cases the API policy kicks in right before the API implementation, and the cache is maintained at that point of enforcement. So the cache is maintained at the site of the API policy enforcement, whether it is a proxy or the directly embedded one; it doesn't matter, but it is not on the API implementation, meaning it doesn't touch or change anything in the API implementation. Now you may have a doubt: we can implement caching in the API implementation logic itself, right? Yes, that is also possible.

But suppose caching was not considered during the development phase of the API, in the way the code was designed, and later, at a late stage of the project or even in production (of course, nothing goes straight to production; it has to be tested first in the lower environments), we identify that there is scope for applying a cache to an already existing interface, and we do not want to change anything at the code level, or we are hesitant to do so. Then this HTTP Caching API policy can help wherever applicable.

Okay, so let us see what this policy can do. This API policy caches entire HTTP responses, but it depends on the key that is set. When we say cache, it doesn't mean that it blindly caches all the responses that come in. You know how caching works: it's a key-value pair. For a given key it caches a value, so that next time a request comes in carrying the same key combination, the same response is served back from the cache instead of actually executing the back-end application or API implementation.

Okay, so what does it cache? This API policy caches the full HTTP response, including the HTTP response code and response headers; a whole copy of the HTTP response is cached. The only limitation is that it can cache a maximum of one MB per response; the size cannot exceed one MB, so you have to take care of that, and there is also a certification question that comes around this, so please remember it. So it caches, but the size limit of the cached HTTP responses is currently one MB, and this caching is only performed for HTTP requests for which a configurable expression evaluates to true.

Okay? So that is one more condition, meaning, like I said a few seconds back, it is not that all HTTP responses will be cached; you have to have a key combination. What this API policy provides is that we can configure a DataWeave expression in the parameters of this policy saying: if my request matches this expression and it evaluates to true, only then go ahead and cache the response. And if it does not evaluate to true, then the request behaves like a normal API request.

It hits the API implementation no matter what and gets the response back from the actual implementation only, without getting anything from the cache. So this is how we can configure the key and the expression using DataWeave, and that is how the API policy works. But again, there is something else this policy tests apart from just evaluating the expression we give as the key.

Yes, of course, the expression has to evaluate to true, but there is also an implicit check. The expression we give is the explicit one, saying the request has to satisfy this expression and only when it is true should the response be cached. But even if it evaluates to true, there are some more implicit conditions that Anypoint API Manager evaluates to decide whether it has to cache or not. One such condition is that the HTTP method must be GET or HEAD for the caching to work. Again, there is a certification question with regards to this evaluation, so please keep it in mind.
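As a tiny conceptual sketch (not the actual policy code), the combined check could be pictured like this, assuming the configured key expression has already been evaluated separately:

```python
def should_cache(method: str, expression_result: bool) -> bool:
    """Both conditions must hold before the HTTP Caching policy stores a response:
    1. the configured (DataWeave) key expression evaluates to true, and
    2. the request method is GET or HEAD (retrievals only)."""
    return expression_result and method.upper() in ("GET", "HEAD")

print(should_cache("GET", True))    # True  -> response is cached
print(should_cache("POST", True))   # False -> never cached, even if the expression is true
print(should_cache("GET", False))   # False -> expression did not match, normal pass-through
```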

Okay? So, to recap, for the HTTP caching to actually work, two things should be true. One, the expression we give should evaluate to true. And two, even if it evaluates to true, the HTTP method must be GET or HEAD only. If other types of requests come in, such as POST and so on, it cannot work; it is only for retrievals like GET and HEAD. So please remember that. This is how the policy works, and these are the aspects of this particular API policy you should mostly concentrate on. Now, this HTTP Caching API policy also has some other features beyond just caching. Caching is one part: fine, we have cached the response, and as long as we are serving that cached value, we are happy.

But there might be scenarios where we want to actually remove elements from the cache as well, just like with any other caching solution. So how can we do that? There can be two requirements: either we want to remove a particular response from the cache, or we want to clear the entire cache. The way this works in this policy is that it respects special HTTP request header keywords called invalidate and invalidate-all.

Okay? So if we use these invalidate and invalidate-all headers while making the request, for a particular GET for example, they will be used to clear the cache entries held by the HTTP Caching policy. If you give invalidate, it clears just the one cache entry: if you pass the GET or HEAD request with the same key combination that evaluates to true, but with the extra header called invalidate, then instead of serving it from the cache, the policy sees the invalidate keyword in the header, clears the cache entry for that particular request key combination, goes to the API implementation, and gets a fresh response from it. And if you instead pass invalidate-all, it clears the entire cache, and from there on, subsequent requests will start caching again.
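A hedged Python sketch of such invalidation calls is below. The proxy URL and credentials are placeholders, and the invalidate / invalidate-all header names and values simply follow the lecture's description, so verify the exact header names against the policy documentation for your runtime.

```python
import requests

API_URL = "https://sales-order-proxy.example.com/api/retrieve"  # placeholder proxy URL
headers = {"client-id": "abc123", "client-secret": "s3cr3t"}

# Normal GET: served from the cache if this key combination was cached earlier.
requests.get(API_URL, params={"salesOrderNumber": "ABCD"}, headers=headers)

# Same request with the invalidation header (name as described in the lecture):
# the policy drops the cached entry for this key and fetches a fresh response.
requests.get(API_URL, params={"salesOrderNumber": "ABCD"},
             headers={**headers, "invalidate": "true"})

# Clear every entry held by the policy; caching starts again on later requests.
requests.get(API_URL, headers={**headers, "invalidate-all": "true"})
```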

Okay? So what are all the parameters this HTTP Caching API policy takes? For almost all other policies we have seen, they take some parameter or another, right? The first parameter is a key, which is a DataWeave expression like we discussed before, which has to evaluate to true; but it is not mandatory to give this key. If we do not give the key, then the default used as the key is the request path. So for a GET or HEAD URL it will be the request path; like in the previous demo, our request path was /api/create, so /api/create would be used as the default key if you don't give one, and it will always be treated as evaluating to true if you don't have an expression.

But remember, that default request path key does not consider the query parameters. So if you put something after /api/create (sorry, create is a bad example because it's a POST method, but for the topic's sake let's take a retrieve): say the path is /api/retrieve and you add ?salesOrderNumber=ABCD and so on, the HTTP Caching policy will for sure not consider the query parameters; it considers only the /api/retrieve part as the default key. But if we do give a DataWeave expression, then that will be taken as the key and DataWeave will check whether it evaluates to true or not. That is the first parameter.

The second parameter is the size of the cache, which means the number of entries it has to cache, because the API can be hit by many consumers and there can be any number of hits. So we can limit it, saying the maximum number of entries that can be cached is some number x, so that when those entries are exhausted, the eviction policy kicks in. The third parameter is the time to live, which decides how long a cache entry resides in the cache and when it has to expire. And the next one is whether we want this cache to be persistent. Persistent means that, apart from the time to live and number-of-entries limits, let's say there is a restart happening in the middle, say the application is restarted or something like that.

Then, if you want to persist your cache entries across such restarts, we have the option to make the cache persistent by using the CloudHub Object Store. So this is how the HTTP Caching policy works.
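Purely as a conceptual sketch (not the policy implementation), these parameters could be pictured as a minimal in-memory cache: a default key taken from the request path without query parameters, a time to live, a maximum number of entries, and the roughly one MB response size limit. Persistence via Object Store is not shown.

```python
import time
from urllib.parse import urlsplit

MAX_ENTRIES = 100            # "size of the cache" parameter
TTL_SECONDS = 300            # "time to live" parameter
MAX_BODY_BYTES = 1_000_000   # responses above ~1 MB are not cached

_cache = {}  # key -> (expiry_timestamp, cached_response_bytes)

def default_key(url: str) -> str:
    """Default key: the request path only; query parameters are ignored."""
    return urlsplit(url).path

def get_cached(url: str):
    key = default_key(url)
    entry = _cache.get(key)
    if entry and entry[0] > time.time():
        return entry[1]                   # fresh hit served from the cache
    _cache.pop(key, None)                 # expired or missing entry
    return None

def put_cached(url: str, response_bytes: bytes):
    if len(response_bytes) > MAX_BODY_BYTES or len(_cache) >= MAX_ENTRIES:
        return                            # too large, or eviction would be needed
    _cache[default_key(url)] = (time.time() + TTL_SECONDS, response_bytes)
```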

Like I said, two to three questions may come from this caching topic; I'm not saying you'll get all of them in a single exam, but there are three possible questions that may appear in different combinations. One is that it considers only the GET and HEAD methods for caching. The second is that the maximum it can cache per response is one MB. And the third is that the default key considers only the request path, excluding the query parameters. So I hope you understood this, and I hope it comes in handy in some of your real projects. We are using this in my current project and it helps a lot, and it saves the development time of implementing caching in the implementation layer. Okay, happy learning.
