Amazon AWS Certified SysOps Administrator Associate – Databases for SysOps Part 7


13. [SAA/DVA] Aurora Hands On

So let’s create an Aurora database. We are in the new interface; there was an old interface and we could switch back to it, but I’ll keep the new one so that the video stays compatible with what you see. To create an Aurora database we can do a standard create, where we configure everything ourselves, or an easy create, and obviously we want to configure everything, so we’ll go with a standard create. I’ll choose Aurora, and then you have to choose whether you want Aurora with MySQL compatibility or with PostgreSQL compatibility; these are the only two modes Aurora offers. We’ll choose MySQL because it has more options, but whichever one you choose, there is a version drop-down where you can pick the version you want.

Now, for this hands-on I’m going to use MySQL because it exposes the most Aurora features to demonstrate, and for the version I’m going to use 5.6.10a. The reason is that with this version we get the database location choice, regional or global; if I select a newer version, that option disappears and I can’t demonstrate as much. If you want to follow along with me, remember that this is not a free hands-on: Aurora is not part of the free tier, so it will cost you a little money. But you can also just follow along to see the options.

So choose Aurora MySQL 5.6.10a just to have the database location option available. Regarding the database location, you can either have a regional Aurora database within a single region, or a global Aurora database spanning multiple AWS regions, in which case writes are replicated to the other regions typically within one second. And in case of a regional outage, there is a way to fail over to another region by detaching that secondary region so it becomes its own standalone cluster. We’ll keep it as regional for now, because that also shows us a lot of the cool features we can get out of Aurora.
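As a side note, the global option can also be scripted. Here is a minimal boto3 sketch, not an exact reproduction of what the console does: the identifiers, account ID, and regions are all hypothetical, and it assumes a regional Aurora cluster already exists to act as the primary.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Promote an existing regional cluster to be the primary of a global database.
# The ARN below is a made-up example.
rds.create_global_cluster(
    GlobalClusterIdentifier="my-global-aurora",
    SourceDBClusterIdentifier="arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-db",
)

# In a disaster-recovery scenario, a secondary region's cluster can be
# detached so it becomes its own standalone cluster that accepts writes.
rds.remove_from_global_cluster(
    GlobalClusterIdentifier="my-global-aurora",
    DbClusterIdentifier="arn:aws:rds:eu-west-1:123456789012:cluster:my-aurora-db-secondary",
)
```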

Next, we have to choose the database features, and there are four different modes. Either we have one writer and multiple readers, which is the mode I explained to you and the most appropriate for general-purpose workloads. We can also have one writer and multiple readers with parallel query, to improve the performance of analytics queries. Then there is multiple writers, where several writers are active at the same time in Aurora, for when you have a lot of writes happening continuously. And finally there is serverless, for when you don’t know how much Aurora capacity you will need: you have an unpredictable workload, maybe a little in the morning and a lot at night, so you need it to scale on its own, and serverless is a great option for that. From an exam perspective, the ones you should definitely know are the general one and the serverless one. We’ll go with the general one, because there is more to configure than with serverless; the sketch just below shows roughly what serverless looks like when scripted.
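For reference, here is a hedged boto3 sketch of the serverless option (Aurora Serverless v1, which uses the "serverless" engine mode). The identifiers, credentials, and capacity values are made up for illustration.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Hypothetical serverless cluster: capacity floats between the min and max
# ACUs, and the cluster can pause entirely after 5 minutes of inactivity.
rds.create_db_cluster(
    DBClusterIdentifier="my-aurora-serverless",
    Engine="aurora-mysql",
    EngineMode="serverless",
    MasterUsername="admin",
    MasterUserPassword="change-me-please",
    ScalingConfiguration={
        "MinCapacity": 1,
        "MaxCapacity": 8,
        "AutoPause": True,
        "SecondsUntilAutoPause": 300,
    },
)
```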

So, back to the general option. We can choose either the Production or the Dev/Test template; these templates simply pre-fill the settings below. I’ll choose Production and we’ll go through them one by one. For the DB cluster identifier you can use whatever you want; I’ll call it my-aurora-db. Scrolling down, for the master username I’m going to use something I know, for example stefan, and for the password I’ll use password just like before, entered once and then confirmed. Okay, great. Next, scrolling down, we have the DB instance size, which is where you choose the performance of your database. You can pick either memory-optimized classes, the R and X classes you can see in this drop-down, or burstable classes, the T classes, which are cheaper.

A t2.small is going to be my cheapest option for this demo, so that’s what I’ll choose. But based on your workload, if it’s a production workload, memory-optimized is definitely going to be better; if you’re doing dev and test, a db.t2.small is probably the better option with the most cost savings, though even then this tutorial is not free. Now let’s talk about availability and durability. We can create an Aurora read replica, a reader node, in a different AZ, which is great: if one AZ goes down, we can fail over to the reader in another AZ, and that gives us high availability. This is why the console calls it a Multi-AZ deployment. Whether or not we create that reader, the storage itself is already replicated across multiple AZs (that’s a built-in feature of Aurora); this setting is about spreading your Aurora instances across AZs. If you want a Multi-AZ deployment, enable this option. I will keep it enabled because it’s a good option, though obviously a more expensive one.
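If you were scripting this instead of clicking through the console, a reader in another AZ is just one more DB instance attached to the same cluster. A minimal boto3 sketch, assuming a cluster named my-aurora-db already exists and that the AZ name matches your region:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Add a reader node in a different AZ. Aurora treats any additional instance
# in the cluster as a read replica and can promote it if the writer's AZ fails.
rds.create_db_instance(
    DBInstanceIdentifier="my-aurora-db-reader",
    DBClusterIdentifier="my-aurora-db",
    Engine="aurora-mysql",
    DBInstanceClass="db.t3.small",     # burstable class, cheap enough for a demo
    AvailabilityZone="us-east-1b",     # a different AZ than the writer
)
```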

Then, for connectivity: in which VPC do you want to deploy your Aurora cluster, and which subnet group do you want to use? Do you want it to be publicly accessible, yes or no? I’ll leave it as no, since we won’t connect to it.

Then you can use the default security group or create a new one; it’s up to you. This is not too important here because we won’t connect to this database anyway, I just want to show you the options. Finally, there is a lot of additional configuration. For the DB instance identifier you could, for example, keep the suggested value. The initial database name could be, say, aurora, and then you can specify parameter groups, which are not in scope for the exam. You can define a failover priority, but we won’t do that. Backups are really useful if you want snapshots of your database that you can restore from.

They’re great for disaster recovery, and you can set the retention period you want for your backups, between 1 and 35 days. Then encryption: do you want your data to be encrypted at rest with KMS? This is a great option if you want to make sure your data is not readable by anyone who doesn’t have access to the KMS key, so encryption is probably something you want to enable.

Then there is Backtrack, which allows you to go back in time with your database: if you ran some bad commits or bad transactions, you can rewind. It’s a nice feature, but we won’t enable it right now. There is monitoring, to watch the database with enhanced monitoring at a higher granularity, plus the log exports and so on. As you can see, there are a lot of different options. Then comes maintenance, for the maintenance window and version upgrades, which work very much like in standard RDS. And the last setting is deletion protection, to make sure we don’t delete this database by mistake by just right-clicking and choosing delete; with it enabled, there is an extra confirmation step before a delete goes through.
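Before we hit Create, here is a hedged boto3 sketch that ties the walkthrough together: a cluster with the credentials, backups, KMS encryption, backtrack window, log exports, and deletion protection we just discussed, plus a writer instance. The identifiers and values are illustrative, not exactly what the console would generate, and the backtrack parameter is only accepted on Aurora MySQL versions that support it.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# 1) The cluster holds the shared storage and the cluster-level settings.
rds.create_db_cluster(
    DBClusterIdentifier="my-aurora-db",
    Engine="aurora-mysql",
    MasterUsername="stefan",
    MasterUserPassword="change-me-please",
    BackupRetentionPeriod=7,                     # automated backups: 1-35 days
    StorageEncrypted=True,                       # encrypt at rest with KMS
    BacktrackWindow=72 * 3600,                   # backtrack window, up to 72 hours
    EnableCloudwatchLogsExports=["error", "slowquery"],
    DeletionProtection=True,
)

# 2) Each writer/reader is a DB instance attached to that cluster.
rds.create_db_instance(
    DBInstanceIdentifier="my-aurora-db-writer",
    DBClusterIdentifier="my-aurora-db",
    Engine="aurora-mysql",
    DBInstanceClass="db.t3.small",
)
```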

So, now that we’ve seen all the options, from an exam perspective the very important ones are Multi-AZ and the fact that you can have one writer and multiple readers, or go serverless. Those are the important points about Aurora. When we’re ready, we just create the database, and here we go. Okay, it took a bit of time, but my Aurora cluster has now been created. As you can see, we have a regional Aurora cluster with a writer instance and a reader instance. So remember, the writers and the readers are separate.

I’m going to click on this Aurora database to get a bit more detail. As we can see, there are two endpoints: a writer endpoint and a reader endpoint, and we can tell which is the reader because it contains “-ro”, meaning read-only. The recommendation is to use the writer endpoint to write to Aurora and the reader endpoint to read from Aurora, regardless of how many instances you have. You could click on an individual instance and use its own endpoint to connect, but that is not recommended. The recommended way, and what the exam will test you on, is to connect to the writer endpoint for writes and the reader endpoint for reads.
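To make the two endpoints concrete, here is a small boto3 sketch that looks them up for this demo’s cluster; how you then connect (MySQL client, application driver, and so on) is up to you.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

cluster = rds.describe_db_clusters(
    DBClusterIdentifier="my-aurora-db"
)["DBClusters"][0]

# Point writes at the cluster (writer) endpoint and reads at the reader
# endpoint (the one containing "-ro"); Aurora spreads reads across however
# many reader instances currently exist.
print("write to:", cluster["Endpoint"])
print("read from:", cluster["ReaderEndpoint"])
```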

There are lots of other options in here, but we won’t go over them; we have seen the main ones. Lastly, we can have a bit of fun: on the top right we can add a reader, add a cross-region read replica, create a clone, or add replica auto scaling to give us some elasticity. So I’ll name the policy my-scaling-aurora, and then you can select, for example, a target CPU utilization of 60% for your scaling, which looks a lot like what we had for Auto Scaling Groups. We can also specify additional configuration, such as the cooldown periods for scaling in and out, and finally the minimum and maximum capacity. We’ll leave it as is and add the policy, and just like that we have added auto scaling to our Aurora database. That was really simple, and now we have a fully functional Aurora database.
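Behind the scenes, Aurora replica auto scaling is driven by Application Auto Scaling, so the same policy can be sketched with boto3 like this. The policy name, resource ID, and capacity limits are either the demo’s values or made up.

```python
import boto3

autoscaling = boto3.client("application-autoscaling", region_name="us-east-1")

# Register the cluster's replica count as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="rds",
    ResourceId="cluster:my-aurora-db",
    ScalableDimension="rds:cluster:ReadReplicaCount",
    MinCapacity=1,
    MaxCapacity=5,
)

# Target-tracking policy: add or remove readers to keep the average reader
# CPU utilization around 60%.
autoscaling.put_scaling_policy(
    PolicyName="my-scaling-aurora",
    ServiceNamespace="rds",
    ResourceId="cluster:my-aurora-db",
    ScalableDimension="rds:cluster:ReadReplicaCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
        },
        "ScaleInCooldown": 300,
        "ScaleOutCooldown": 300,
    },
)
```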

Before finishing this hands-on, if you did create a database with me, please make sure to delete it so you don’t spend money. To do so, select the writer instance and delete it by typing “delete me”, then do the same with the reader instance: Actions, Delete, and type “delete me” again. This can take a bit of time. Now if I refresh, my cluster shows zero instances, but to delete the cluster itself I can’t do it right away because deletion protection is on. So I click Modify, and at the very bottom of the page I disable deletion protection, click Continue, and apply the change immediately to make sure deletion protection really is disabled. Now if I select my database and go to Actions, I’m able to delete it. Do I want to take one final snapshot? No, I’m fine, and I acknowledge that I won’t be able to recover my data. I delete the database cluster and I’m done. So that’s it for Aurora, I hope you liked it, and I will see you in the next lecture.
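One last note before moving on: the same cleanup can be scripted. A hedged boto3 sketch using this demo’s identifiers; deletion protection has to come off the cluster before the final delete will succeed.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Delete the reader and writer instances first.
for instance_id in ["my-aurora-db-reader", "my-aurora-db-writer"]:
    rds.delete_db_instance(DBInstanceIdentifier=instance_id)

# Wait until the instances are gone before touching the cluster itself.
waiter = rds.get_waiter("db_instance_deleted")
for instance_id in ["my-aurora-db-reader", "my-aurora-db-writer"]:
    waiter.wait(DBInstanceIdentifier=instance_id)

# Deletion protection must be disabled before the cluster can be removed.
rds.modify_db_cluster(
    DBClusterIdentifier="my-aurora-db",
    DeletionProtection=False,
    ApplyImmediately=True,
)

# Finally delete the (now empty) cluster, skipping the final snapshot.
rds.delete_db_cluster(
    DBClusterIdentifier="my-aurora-db",
    SkipFinalSnapshot=True,
)
```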

14. Aurora Backups

Now let’s talk about backups, backtracking, and restores for Aurora. There are automated backups, with a retention period between 1 and 35 days, and they cannot be disabled. They allow you to do point-in-time recovery (PITR), restoring your database to within five minutes of the current time. If you do a restore, it restores to a new database cluster; so automated backups work just like in RDS, a restore always creates a new database. Then you have Aurora Backtracking, which is used to rewind the database back and forth in time, up to 72 hours. Backtracking is different from backups because it does not create a new cluster: it is an in-place restore, and currently it supports Aurora MySQL only.
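The difference is easiest to see side by side in code. A hedged boto3 sketch; the identifiers and timestamps are made up, and backtracking only works if the cluster was created with a backtrack window on a supported Aurora MySQL version.

```python
from datetime import datetime, timezone

import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Point-in-time recovery: always restores into a NEW cluster.
rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="my-aurora-db-restored",          # the new cluster
    SourceDBClusterIdentifier="my-aurora-db",
    RestoreToTime=datetime(2023, 5, 27, 12, 0, tzinfo=timezone.utc),
)
# The restored cluster has no instances yet; create_db_instance must still be
# called against "my-aurora-db-restored" before anything can connect to it.

# Backtracking: rewinds the SAME cluster in place, no new cluster is created.
rds.backtrack_db_cluster(
    DBClusterIdentifier="my-aurora-db",
    BacktrackTo=datetime(2023, 5, 27, 11, 30, tzinfo=timezone.utc),
)
```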

And then finally you have Aurora database cloning, where you create a new database cluster that initially uses the same DB cluster volume as the original cluster and then relies on a copy-on-write protocol: as data is written, it is copied to a new volume. That makes Aurora database cloning super quick and super easy.
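Cloning is exposed through the same restore API, just with a copy-on-write restore type. A minimal sketch with hypothetical identifiers:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Clone the production cluster: the clone initially shares the source volume
# and only copies pages as either side writes to them (copy-on-write).
rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="my-aurora-db-test-clone",
    SourceDBClusterIdentifier="my-aurora-db",
    RestoreType="copy-on-write",
    UseLatestRestorableTime=True,
)
```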

So when do you use database cloning? You use it when you want to create, for example, a test environment from your production data: it’s essentially one click, and you get access to a completely new environment, which is quite handy. Okay, so that’s it for this theory lecture. Just remember that backups restore to a new database cluster, backtracking does in-place restores, and database cloning is a very easy way to stand up a second database cluster from the same originating data. That’s it for this lecture, I hope you liked it, and I will see you in the next lecture.
