Google Associate Cloud Engineer – Authentication and Authorization in Google Cloud with Cloud IAM Part 4


10. Step 07 – Exploring Service Account Use Cases

Welcome back. In this step, let's look at a few use cases for service accounts. Let's start with a simple one: we have a VM and we want it to talk to Cloud Storage. We already looked at this use case in the previous step. So how do we do that? We create a service account with the right permissions. Next, we assign the service account to the VM instance when it is created. The great thing about this approach is that it uses Google Cloud-managed keys: key generation and use are handled automatically by IAM when we assign the service account to the instance. Whenever the instance needs to talk to Cloud Storage, a credential is generated and sent along with the request, and Cloud Storage authenticates it and grants the instance access.
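
As a rough sketch of this first use case, here is how you might create a service account and attach it to a new VM from the command line. The service account name, project ID, and zone below are hypothetical placeholders.

    # Create a service account (the name is hypothetical)
    gcloud iam service-accounts create storage-access-sa

    # Grant it read access to Cloud Storage objects in the project
    gcloud projects add-iam-policy-binding my-project-id \
        --member="serviceAccount:storage-access-sa@my-project-id.iam.gserviceaccount.com" \
        --role="roles/storage.objectViewer"

    # Attach the service account to a new VM instance at creation time
    gcloud compute instances create my-vm --zone=us-central1-a \
        --service-account=storage-access-sa@my-project-id.iam.gserviceaccount.com \
        --scopes=cloud-platform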

All of this is handled behind the scenes for us, and the keys are rotated automatically as needed. The big advantage of this approach is that you don't need any passwords: you don't need a password to access Cloud Storage at all, so you don't need to store any credentials in things like configuration files. The recommendation is not to delete service accounts that are used by running instances. If you have an instance with a service account attached to it, do not delete that service account: if you do, the applications running on those instances will lose access to all the resources the service account gave them access to. The next use case we'll talk about is on-premises to Cloud Storage, and what we are talking about here is a long-lived connection.

So how would we connect an on-premises machine to Cloud Storage? What is the difference? Until now we have been talking about resources that are present inside Google Cloud; this server is present outside Google Cloud, on premises. The problem with an on-premises machine is that you cannot assign a service account directly to an on-prem application. So what we do instead is create keys. We create a service account with write permissions, and then we create a user-managed key for it. How do you do that? In the console, go to Service Accounts (I am an admin over here in Service Accounts) and pick one of the service accounts once the list loads. Yep, this is the service account for Compute Engine that we created earlier.

Over here, if you go in, you can see that I can add a key, so let's create a new key. There are two formats in which you can create keys, JSON or P12, and JSON is the recommended format. You create the key and use it to authenticate yourself as this service account to the specific service you want to access. So that's a service account user-managed key. We saw how to create it from the console; you can also do it from the command line using gcloud, with gcloud iam service-accounts keys create. You can download the service account key file; make sure you keep it really, really secure, because it can be used to impersonate the service account.
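
As a minimal sketch, assuming the hypothetical service account from earlier, the gcloud command looks something like this:

    # Create a user-managed JSON key for the service account
    gcloud iam service-accounts keys create ~/key.json \
        --iam-account=storage-access-sa@my-project-id.iam.gserviceaccount.com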

Also remember that you cannot regenerate the same key again: if you lose it, you have to come back here and generate a new one. Once you download the service account key file, you can make it available to your application, the one that needs to access Cloud Storage. One of the easiest ways of making the key file accessible to your application is by setting an environment variable named GOOGLE_APPLICATION_CREDENTIALS: set an environment variable with this name and point it at the path of the key file (export GOOGLE_APPLICATION_CREDENTIALS=<path to key file>). Inside your application, the recommended best practice is to use the Google Cloud Client Libraries, which rely on a mechanism called Application Default Credentials.

Wherever you see Application Default Credentials, or ADC, it refers to the credentials an application uses to authenticate itself to resources in Google Cloud. Whenever you use the Google Cloud Client Libraries, they make use of Application Default Credentials by default. And what do Application Default Credentials do? They use the service account key file that the environment variable points to. So if you have an environment variable named GOOGLE_APPLICATION_CREDENTIALS, ADC will use that key file to authenticate the application to the service. This is the second use case, where we talked about how you can authenticate from an on-premises machine to Cloud Storage with long-lived credentials.
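
Putting it together, a hedged sketch of wiring this up on the on-premises machine (the key file path is a placeholder):

    # Point Application Default Credentials at the downloaded key file
    export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json

    # Any code in this shell session that uses the Google Cloud Client
    # Libraries will now authenticate as the service account via ADC.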

Once you put the key file in place, the application will use it for a long time to connect to the service, Cloud Storage. One of the best practices is to rotate the keys once in a while: every few months, create a new key and switch the application over to it. The last service account use case we will talk about is on-premises to Google Cloud APIs, but this time for short-lived connections. So you want to make calls from outside GCP to Google Cloud APIs with short-lived permissions, permissions that last a few hours or less. Why would you want short-lived permissions? Because they carry less risk than sharing service account keys. If you create and share a service account key, it is long-lived: as long as the key is active, somebody can use it to access the service.

There are a few credential types that are recommended here: OAuth 2.0 access tokens, OpenID Connect ID tokens, and self-signed JWTs. Let's consider a few examples. When a member needs elevated permissions, they can assume a service account role. Let's say I'm a member with a certain set of permissions, but I want to perform a privileged operation. In that kind of situation, we can create a service account with access to perform the privileged operation, and the member can impersonate that service account. When the service account is impersonated, an OAuth 2.0 access token is created, and the great thing is that it gives us temporary access only.
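
As a hedged sketch of this impersonation flow from the command line (the service account email is a placeholder), a member who has been granted the Service Account Token Creator role on the service account can mint a short-lived access token like this:

    # Mint a short-lived OAuth 2.0 access token for the service account
    gcloud auth print-access-token \
        --impersonate-service-account=privileged-sa@my-project-id.iam.gserviceaccount.com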

The next option is OpenID Connect ID tokens, which are recommended for service-to-service authentication: we want to authenticate one service to another service, but only for a short period of time. In those kinds of situations, OpenID Connect ID tokens are recommended (a command sketch follows at the end of this step). In this step we looked at a few scenarios for using service accounts. We looked at the connection from a virtual machine to Cloud Storage, we talked about on-premises to Cloud Storage for long-lived connections, and we talked about on-premises to Google Cloud APIs for short-lived connections. I'm sure you're having a wonderful time, and I'll see you in the next step.
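
For reference, a similar impersonation flow can mint an OpenID Connect ID token; this is a sketch, and the audience URL and service account email are placeholders:

    # Mint a short-lived OpenID Connect ID token for service-to-service calls
    gcloud auth print-identity-token \
        --impersonate-service-account=privileged-sa@my-project-id.iam.gserviceaccount.com \
        --audiences="https://my-service.example.com"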

11. Step 08 – Scenarios – Service Accounts

Welcome back. In this step, let's talk about a few scenarios related to service accounts. Let's get started. An application on a VM wants to talk to a Cloud Storage bucket: what is the recommended option? Configure the VM to use a service account with the right permissions. An application on a VM wants to put a message on a Pub/Sub topic: again, configure the VM to use a service account with the right permissions. It's the same answer, except that in the first case the permissions are on the Cloud Storage bucket and in the second case they are on the Pub/Sub topic. Is a service account an identity or a resource? Actually, it is both. You can attach roles to a service account, so it's an identity. You can also let other members use a service account by granting them a role on the service account itself, so other members can assume a service account; in that sense, it's a resource (a sketch of such a grant follows below).
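
As a rough sketch of treating the service account as a resource (the user and service account emails are hypothetical), you could grant a member the Service Account User role on it:

    # Allow a user to act as (assume) the service account
    gcloud iam service-accounts add-iam-policy-binding \
        storage-access-sa@my-project-id.iam.gserviceaccount.com \
        --member="user:jane@example.com" \
        --role="roles/iam.serviceAccountUser"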

So a service account is both an identity and a resource. Let's quickly look at the last scenario: a VM instance with the default service account in project A needs to access a Cloud Storage bucket in project B. What you can do is, in project B, add the service account from project A and assign it the Storage Object Viewer role on the bucket. The idea here is that the resource that needs to be accessed is in project B, so in project B we give the service account from project A access to the bucket present there (a command sketch follows below). In this step we looked at some of the important scenarios with respect to service accounts. I'm sure you're having a wonderful time, and I'll see you in the next step.
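
A hedged sketch of that cross-project grant, assuming hypothetical names for the bucket in project B and the default Compute Engine service account of project A:

    # Grant project A's default service account read access to the bucket in project B
    gsutil iam ch \
        serviceAccount:123456789-compute@developer.gserviceaccount.com:objectViewer \
        gs://bucket-in-project-b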

12. Step 09 – Exploring Cloud Storage – ACLs (Access Control Lists)

Welcome back. In this step, we'll talk about something called Access Control Lists. Why do we need access control lists? An Access Control List (ACL) defines who has access to your buckets and objects, as well as what level of access they have. Now you might be wondering: how is this different from IAM? Whenever you use IAM, IAM permissions apply uniformly to all objects within a bucket. If I say somebody has read access on the bucket, they have read access on all objects inside the bucket. However, what if you want to give customized access to different objects in the same bucket? In those kinds of situations, you can go for ACLs. An important thing to remember is how IAM and ACLs interact: if a user has access either through IAM or through an ACL, they get access. So it's either/or.

The recommendation is to use IAM for permissions that are common to all objects in a bucket, and use ACLs if you need to customize access to individual objects. Now that we understand what ACLs are and how they differ from IAM, let's see how to use ACLs in Cloud Storage. How do you control access to objects in a Cloud Storage bucket? There are two types of access control. One is uniform bucket-level access, which is implemented using IAM. However, you can also go fine-grained, using both IAM and ACLs to control access; in that scenario, you can have both bucket-level and individual object-level permissions. As we discussed earlier, uniform (IAM-only) access can be used when all users have the same level of access across all objects in a bucket.

You'd go for fine-grained access with ACLs when you need to customize access at the object level. Let's see these in action. Let's go to Cloud Storage, wait for it to load, and then pick the bucket we created earlier, my-first-bucket-in-28-minutes. Now, if you go to Permissions, you can see the access control setting. The access control can be either fine-grained, where object-level ACLs are enabled, or uniform. Let's select Uniform and say Save. With uniform access, no object-level ACLs are enabled: access is uniform across the bucket through IAM, and you cannot go to an individual object and control its ACLs.

However, if you switch to fine-grained, you'll be able to customize object-level ACLs. Now I'm on fine-grained, which means object-level ACLs are enabled. Let's go back to Objects: I can now configure permissions at the object level. Let's say I want to pick index.html; at the index.html level, I can say Edit Permissions and configure who can access this specific object. So you can configure individual object-level permissions. In this step, we looked at the two types of access control in Cloud Storage: uniform, which uses IAM, and fine-grained, which uses a combination of IAM and ACLs (a command-line sketch follows below). We'll look at this a little bit more in the subsequent steps. I'll see you in the next step.
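
For reference, a hedged sketch of the same operations with gsutil; the bucket and user names are placeholders:

    # Switch the bucket to fine-grained access (uniform bucket-level access off)
    gsutil uniformbucketlevelaccess set off gs://my-first-bucket-in-28-minutes

    # Grant a single user read access to one object via an ACL
    gsutil acl ch -u jane@example.com:R \
        gs://my-first-bucket-in-28-minutes/index.html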

13. Step 10 – Exploring Cloud Storage – Signed URLs

Welcome back. In this step, let's look at an important concept in Cloud Storage called signed URLs. Suppose you want to allow a user access to your objects for a limited time, and you don't want the user to need a Google account or anything like that. In those kinds of situations, you can go for the signed URL functionality. A signed URL is one that gives the holder permission, for a limited time, to perform specific actions. You can easily create a signed URL using a key for your service account: create a service account with the desired permissions, create a key for the service account, and then use the key to create a signed URL. The command for this is gsutil signurl.

How much time do you want to allow access for? The duration here is ten minutes, you pass in your key, and you specify which object in the bucket you want to allow access to (a command sketch follows below). In this quick step, we looked at why we need a signed URL, to provide a user with limited-time access to your objects, and we also saw how to create one. All you need to do is execute a simple command, gsutil signurl: you specify the duration, and you need a key for a service account with the right access to the objects. I'll see you in the next step.
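
As a minimal sketch, assuming the key file and bucket names from earlier (both placeholders):

    # Create a signed URL for index.html that is valid for 10 minutes
    gsutil signurl -d 10m ~/key.json \
        gs://my-first-bucket-in-28-minutes/index.html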

14. Step 11 – Exposing a Public Website using Cloud Storage

Welcome back. In this step, let's look at how you can expose a static website using Cloud Storage. Earlier we uploaded a number of objects to our bucket, and together these objects form a simple website. What we want to do is expose it as a public website. The first step is to create a bucket with the same name as the website name. Over here, my bucket name starts with my-first-bucket-in-28-minutes, so that's fine for this demo. However, if you are exposing the site on a specific custom domain, say in28minutes.com or google.com, then your bucket name should match the website name: the name of the bucket should match the DNS name of the website.

The next thing that happens, if you are using a custom domain, is that a verification is done to ensure that the domain is owned by you. We don't really need to worry about that right now. Fair enough. The second step is to copy the files to the bucket. We already did that earlier: when we created the bucket, we dragged and dropped all the files in here. We went to the downloads provided along with the course, and in the downloads is the cloud storage folder; we dragged all those files and dropped them in here, so we have all the files ready. You can also add index and error HTML files for a better user experience; the index page is the first page that will be loaded (a sketch of configuring these follows below).
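
As a hedged sketch, you can register these special pages on the bucket with gsutil; the file names below follow the usual convention but are assumptions:

    # Configure the bucket's main (index) page and error page
    gsutil web set -m index.html -e 404.html gs://my-first-bucket-in-28-minutes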

We already have an index.html in here; we don't have an error HTML file, but we do have an index.html. Now, the next important thing we want to do is give the public, everybody, permission to access this bucket, and for that we grant the Storage Object Viewer role on the bucket to all users. All users includes both authenticated and unauthenticated users, so everybody can access your Cloud Storage bucket. How can we do that? By going to Permissions. Let's not worry about fine-grained permissions: I'll switch to Uniform, say Save, and then add a member.

If you type in "all", you'll see allUsers and allAuthenticatedUsers; I want to grant access to allUsers. For allUsers, we need to choose a role. What is the role we want to grant? Storage Object Viewer. So I am granting the Storage Object Viewer role to allUsers on this specific resource; what we are creating here is a simple binding. I'll say Save. "Are you sure you want to make this resource public?" On this page, we need to select Allow Public Access, so let's do that. The policy is then updated, and over here you can see "Public to internet": it says one or more permissions grant access to anyone on the internet.
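
The same binding can be created from the command line; here is a minimal sketch with our bucket name:

    # Grant read access on the bucket's objects to everyone on the internet
    gsutil iam ch allUsers:objectViewer gs://my-first-bucket-in-28-minutes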

Now, what I'll do is go to index.html, which is present in here, and get its URL. The public URL is shown right there: you can see a storage.googleapis.com public URL, and anyone with this link can access the object on the public internet. Let's copy the link and open it in a new window. You can see the page: the HTML page, along with all the images present in it, loads properly. The same thing works even if I open an unauthenticated window: if I open a window where I'm not signed in and paste the URL, even there the entire page comes up properly.

In this step, we looked at how you can expose a static website from content present in Cloud Storage, and we saw that it was really, really easy. The important things to remember are these: the bucket should be created with the same name as the website name (if you are using a custom domain); after that, you copy the files to the bucket and assign allUsers the viewer role (Storage Object Viewer) on the bucket. I'm sure you're having an interesting time, and I'll see you in the next step.
