DP-300 Microsoft Azure Database – Implement compliance controls for sensitive data
1. 34. apply a data classification strategy
Now, some of your data may be sensitive, but people might still need to be able to see it. Sensitive data includes data covered by privacy, regulatory and national requirements. We've talked extensively about GDPR and its California equivalent. And there may be security scenarios, including controlling access. So what you can do is mark a particular column as sensitive. To do that, we go into the database this time; if it's to do with data, we go to the database.
Under Security we've got Data Discovery and Classification, and I'm going to go into the Classification tab. Here we can see that nothing has been classified yet; however, there are 25 columns with classification recommendations. For instance, maybe the employee first name is Confidential under GDPR. Whatever you choose, just check the columns, accept the selected recommendations and then save. But maybe I want to do a more manual one. That's fine: you just go to Add classification.
You select the schema, the table name and the column name, and then you select the information type. There's N/A and Other; there's Networking; there's personal data such as Contact Info, Name, National ID, Social Security Number, Health and Date Of Birth; there's Credentials; and there are financial types such as Credit Card, Banking and Financial. I'm going to put this one in as Contact Info. Then we have a sensitivity label. N/A means not sensitive at all, data that's out of scope; it's completely fine if that is viewed by anybody.
Public is freely available business data or information that has been released to the public. General is business data not meant for the public, such as emails, documents and files which do not contain confidential information. Then there are Confidential and Highly Confidential, each with a GDPR variant: data that would cause harm, or extensive harm, to your company if overshared. Now, just to point out, you can't select N/A for both the information type and the sensitivity label.
At least one of them is required. So I'm going to say that this is Contact Info with a General sensitivity label. Now, the following roles can modify and read a database's data classification: Owner, Contributor and SQL Security Manager. Additionally, the following can read a database's data classification but not modify it: Reader and User Access Administrator. So I'll add this and save. Once columns have been classified, you can use auditing to drill into security insights, access to sensitive data and so on.
You will also see it in what's called Intelligent Insights. Now, let's manage the classifications elsewhere; let's have a look at T-SQL. We have sys.sensitivity_classifications, which shows you your classifications, so we've got these information types along with details about what they apply to. This may need a bit of explaining, because the view stores IDs rather than names: if I go into sys.columns and filter where the object ID is equal to this long number, for instance, then you'll see that the first classified column is the employee first name and the last one is the employee last name. So you can match them up where the object ID and the column ID line up. Suppose I wanted to add a sensitivity classification.
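Putting that lookup together, a query along these lines resolves the IDs into readable names. This is a minimal sketch; major_id and minor_id in the classification view correspond to object_id and column_id in sys.columns:

-- List each classified column by name rather than by ID.
SELECT
    SCHEMA_NAME(o.schema_id) AS schema_name,
    o.name                   AS table_name,
    c.name                   AS column_name,
    sc.information_type,
    sc.label
FROM sys.sensitivity_classifications AS sc
JOIN sys.objects AS o ON o.object_id = sc.major_id
JOIN sys.columns AS c ON c.object_id = sc.major_id
                     AND c.column_id = sc.minor_id;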
Well, I can do that. The statement is ADD SENSITIVITY CLASSIFICATION TO, followed by the schema name, the table name and the column name separated by dots, and then WITH and, in brackets, the label, in this case Highly Confidential, plus the information type: Networking, Contact Info, Credentials, Credit Card, Banking, Other, Name, National ID, SSN (that's Social Security Number), Health or Date Of Birth. We've seen those already. And then the rank: Low, Medium, High, Critical or None.
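As a concrete example, classifying the postal code column used in the demo might look like this; SalesLT.Address and PostalCode are from the sample database, so adjust for your own schema:

-- Classify a single column with a label, an information type and a rank.
ADD SENSITIVITY CLASSIFICATION TO
    SalesLT.Address.PostalCode
WITH (
    LABEL = 'Highly Confidential',
    INFORMATION_TYPE = 'Contact Info',
    RANK = MEDIUM
);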
So if I put this on the postal code, as I've already put one on the city, and run it, you can see it runs successfully. If I go back into my portal, instead of three classified columns we have four. And now you can see something in the Overview: the label distribution in terms of Confidential, General and Highly Confidential, and the information type distribution. Then, if you no longer want a classification, you can drop it. So I will drop it again for the postal code, and you will see we go from four back down to three. So this is the data classification strategy. You can do this in SSMS, and you can also do this in the portal. There are various suggestions, and you can always add a new classification with an information type and a sensitivity label.
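For completeness, the corresponding drop statement for that same illustrative column:

-- Remove the classification added above.
DROP SENSITIVITY CLASSIFICATION FROM
    SalesLT.Address.PostalCode;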
2. 35. configure server and database audits
In this video, we're going to configure server and database audits. Why would you want to do that? Well, you can use auditing to retain a trail of selected database actions, report on database activity using preconfigured reports and a dashboard, and analyze reports for suspicious events, unusual activity and trends. Now please note, auditing to premium storage, or to storage with hierarchical namespace enabled, is not supported. We won't encounter hierarchical namespace in this particular course; it's used by things like Data Lake Storage Gen2. Also, if the Azure SQL database is under really high activity, Azure prioritizes other actions, so it might not record some audited events. Audit records are written to append blobs in blob storage. So let's talk about server audit policies. A server policy applies to all existing and newly created databases. Server audit policies always apply to the database regardless of any database-level auditing policies you may have. They can sit side by side, so you can have server audit policies as well as specific database-level auditing policies.
Now, Microsoft recommends using only server-level auditing unless you want to audit different event types or different event categories for a specific database. The default auditing policy includes BATCH_COMPLETED_GROUP, which covers all queries and stored procedures, plus SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP and FAILED_DATABASE_AUTHENTICATION_GROUP, so that's successful and failed logins, and it stores around 4,000 characters of data for character fields in an audit record. So let's add auditing to our SQL database. I'm going to go into the server to start with. Down the left-hand side we have Auditing under the Security heading, and I'm going to enable Azure SQL auditing. As you can see, it will audit all the queries and stored procedures executed against the database, as well as successful and failed logins. Now, where are you going to store all of your auditing? We could use an existing or new storage account, an existing Monitor Log Analytics workspace, and/or an existing event hub.
So you have a choice of options; you might want to try one or two of them and see which works for you. Now, in the server area, you can also enable auditing of Microsoft support operations. If we're in a database instead, so I go to a specific database and go down to Auditing on the left-hand side, then you don't have that option, but you can view the server settings if you wish, which gets you something similar to what we've got on the server. So we have the choice. Let's start, for instance, with the Monitor Log Analytics workspace. Unfortunately, there isn't a plus to create a new one from here, so we've got to create one separately. I'm going to go into a separate tab and type in Log Analytics, and there we have workspaces. So I'll create a Log Analytics workspace.
First, a resource group. You can see at the top it says a workspace is the basic management unit of logs. Let's go down and give it a name, so this is our Log Analytics workspace, and a region; again, I'm going to stick to near where I am, UK South. Click on the pricing tier: we have a pay-as-you-go pricing tier, but that is actually quite a good one, as you won't incur any charges until you've got a sufficient amount of data. So let's review and create this. After quite a while, it validates.
So I'll click on Create and it starts creating. While it's creating, let's go back and have a look at the event hub option. An event hub is a similar sort of thing, but you would need to set up a stream to consume the events and write them to a target. Alternatively, you might just want to have it in a storage account. So I go back to the auditing page, and here we do have a link to create a new one, so I'm going to create a storage account here. This is "DP 300 storage account", but you can see the name can only contain lowercase letters and numbers, so let's try that again. I'm going to use general-purpose storage, standard performance and locally redundant storage, and all the rest of it is fine. So I'm creating that at the same time.
So we've got all of these things happening, and that deployment is now complete. I'm going to go back to my DP 300 server, go back into Auditing, enable Azure SQL auditing and tick Log Analytics. There is the Log Analytics workspace that I previously set up. I'm going to click on Save, and you can see it's now saving the auditing settings.
We've also now got the storage account set up, so once the save operation has finished, I will add that storage as well and save again. What I'm going to do now is pause this video, record quite a number more videos and then come back here, so that we've actually got events to audit and can see what I did. Well, it's a few days later, and I've got server-level auditing enabled but database auditing not enabled. So I'm here in my SQL database and I'm going to click on View audit logs.
There's nothing in the database audit because I don't have one, but in the server audit here you can see things like BATCH COMPLETED and RPC COMPLETED (that's a remote procedure call), and you can look through all of those. What you can also do is go to SSMS: I can go to File, Open, Merge Audit Files. I'll click on Add, and I will add from Azure Blob storage, for which I'll need to connect. So I sign in to Azure storage and select my storage account and my blob container. Notice I've got vulnerability assessment there; I enable that a few videos on from here.
You can use exactly the same route to merge vulnerability assessment files. I select my server, select my database and select the from and to dates. So I'm going to have multiple files downloading, and then it merges them all together so I can view them in SSMS as well. With my audit files, once they are created, I can just go to the audit and view the audit records. I could also go into Log Analytics, if I had it enabled, and see the results from there. So you can enable SQL auditing at the server level and also at the database level, and these audits apply at the same time, so you can have database audit policies alongside server audit policies. But Microsoft recommends using only server-level auditing unless you want to audit different event types for a specific database. You get BATCH_COMPLETED_GROUP, which is all queries and stored procedures, and the successful and failed database authentication groups, which are logins, and you can save the audit to an existing or new storage account, an existing Monitor Log Analytics workspace, and/or an existing event hub.
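If you prefer T-SQL to merging files in SSMS, the sys.fn_get_audit_file function can read the .xel audit files straight from blob storage. The URL below is a placeholder, not the account from the demo; substitute your own storage account, server and database path:

-- Query audit records directly from the .xel files in blob storage.
SELECT event_time,
       action_id,
       succeeded,
       database_principal_name,
       statement
FROM sys.fn_get_audit_file(
    'https://<storage-account>.blob.core.windows.net/sqldbauditlogs/<server>/<database>/',
    DEFAULT,
    DEFAULT);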
3. 36. implement data change tracking – Change Tracking
In this video we're going to have a look at data change tracking. Now, change tracking is separate from change data capture, which we'll have a look at in a few videos' time; CT is different from CDC. Additionally, while both change tracking and change data capture can be used in an Azure SQL managed instance, the only one of the two that has been available in Azure SQL Database is change tracking. Now, what does change tracking do? Well, let's suppose that you have a table like SalesLT.Address, and suppose a row gets changed, or a particular column gets changed. That's what change tracking tracks. So, for instance, it could tell you that this row, this column, has changed. However, it doesn't track how many times something was changed, and neither does it track historical data.
So I couldn't go back and say what it was changed from. This makes it more lightweight, requiring less storage than a feature that would do all of that, namely CDC, change data capture. So what is change tracking used for? Well, it enables applications to determine which rows have changed and then request only the changed rows. That can save a lot of time: when you open up an app, it doesn't have to reload the entire database, just the things which have changed. The change data is stored in an in-memory rowstore and flushed on every checkpoint to internal on-disk tables; in other words, it's kept in memory and then saved every so often. Now, you might want to consider using snapshot isolation with this, so that any changes which are being made while you're retrieving the data are not visible within your transaction, giving you a consistent set of data rather than a changing one, as sketched below.
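A minimal sketch of that snapshot isolation setup, assuming a database called DP300 as in the course:

-- Allow snapshot isolation at the database level first; only then can a
-- session set its transaction isolation level to SNAPSHOT.
ALTER DATABASE DP300 SET ALLOW_SNAPSHOT_ISOLATION ON;

SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
BEGIN TRANSACTION;
    -- Read the current version and the changed rows here, against a
    -- consistent snapshot of the data.
COMMIT TRANSACTION;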
So you alter the database to set ALLOW_SNAPSHOT_ISOLATION ON, and only when that is on can you set the transaction isolation level to SNAPSHOT and run your transactions. Now, change tracking may sound like a good idea, so how do you actually implement it? Well, if we go to the database in SSMS, right-click and go to Properties, there is a Change Tracking section on the left-hand side. At the moment it is False, which indicates whether change tracking is enabled for the database, so I'm going to change that to True. You can then set the retention period; two days is the default, so you don't have to keep an endless list. Change information will be retained for at least this time period: the minimum is 1 minute, and there is no maximum.
You can also select whether data is automatically cleaned up after that retention period. If it's True, then change tracking data will be removed periodically, so if an app has not retrieved the updated information in time, all of its data would need to be refreshed; that's what the retention period is for. If False, however, change tracking information will not be removed and will continue to grow. So I'm going to set that to True. Now, there is an alternative in T-SQL: ALTER DATABASE, the name of the database, SET CHANGE_TRACKING = ON, and then, in brackets, CHANGE_RETENTION equals a time period, two days here, comma, AUTO_CLEANUP = ON.
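Written out, that statement looks like this, again assuming the DP300 database name:

-- Enable change tracking for the database, retaining change information
-- for at least two days and cleaning it up automatically.
ALTER DATABASE DP300
SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);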
So that enables change tracking for a particular database. But I've not enabled it on any tables yet, so we will need to do that as well. If I go into the Address table, right-click on it and go to Properties again, on the left-hand side we've got a Change Tracking section. Here I can set change tracking to True for this particular table; we needed it on for the database first. We can also set Track Columns Updated to True. This is useful if you want to know which columns an UPDATE statement changed; otherwise it won't record which particular columns were updated. So you can say on or off, true or false. Again, there is T-SQL for this: ALTER TABLE, name of the table, ENABLE CHANGE_TRACKING WITH TRACK_COLUMNS_UPDATED equals on or off. When you've had enough of CT, that is, change tracking, you can disable it. You need to disable it on all of the tables before you can disable it on the database.
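For the Address table from the demo, the table-level statement would be along these lines:

-- Enable change tracking on one table, also recording which columns
-- each UPDATE touched.
ALTER TABLE SalesLT.Address
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);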
It's very easy to do in SSMS: again, you just right-click on the table and then the database, go to Change Tracking, and change True to False. In T-SQL, you can see the code on screen: DISABLE CHANGE_TRACKING on the table, as opposed to enabling it, and then SET CHANGE_TRACKING = OFF for the database. Right, so now that's done, what can you get out of it? Well, first of all, you can see whether your databases are enabled for change tracking. Here we can see database five, and we can always find out what database five is from sys.databases: it's the DP 300 database. So that is on, with a two-day retention, and auto cleanup is on.
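A sketch of the disable statements and the catalog views being described here, using the same illustrative names:

-- Disable change tracking: every tracked table first, then the database.
ALTER TABLE SalesLT.Address DISABLE CHANGE_TRACKING;
ALTER DATABASE DP300 SET CHANGE_TRACKING = OFF;

-- While it is enabled, these views show what is being tracked.
SELECT database_id, retention_period, retention_period_units_desc, is_auto_cleanup_on
FROM sys.change_tracking_databases;

SELECT OBJECT_NAME(object_id) AS table_name, is_track_columns_updated_on
FROM sys.change_tracking_tables;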
We can also see what tables are being tracked, and again we might need help from sys.objects or sys.tables to resolve the names: we can see that object ID 1778, et cetera, is on, so let's find 1778. There it is, and we can see that it is the Address table. Is Track Columns Updated on? Yes, that's absolutely fine. Now, how would we use it in real life? You probably won't need this for the exam, but first of all you'll get the initial sync version, so you know where you've synchronized up to. In this case we can view it using CHANGE_TRACKING_CURRENT_VERSION. Right, let's make some changes. I'm just going to edit the top 200 rows, and I'm going to change the very top row from 8713 Yosemite to 8712. So the last sync was version zero.
Now, if we have a look at what the current version is, it's version one, so we can find out what's happened since version zero. There we are: we can see that there is an update operation, and this SYS_CHANGE_COLUMNS value will tell you which particular columns have been updated. However, Microsoft says you shouldn't use it directly; instead it recommends another function called CHANGE_TRACKING_IS_COLUMN_IN_MASK, and this is where it gets a bit complicated, so I'm not going to go much deeper into that, but hopefully you can see the principle.
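A minimal sketch of those queries against the sample Address table; the AddressLine1 check uses the recommended function rather than reading SYS_CHANGE_COLUMNS directly:

-- What version are we on now?
SELECT CHANGE_TRACKING_CURRENT_VERSION() AS current_version;

-- What has changed since the last sync (version 0 here)?
SELECT ct.AddressID,
       ct.SYS_CHANGE_OPERATION,
       ct.SYS_CHANGE_VERSION,
       CHANGE_TRACKING_IS_COLUMN_IN_MASK(
           COLUMNPROPERTY(OBJECT_ID('SalesLT.Address'), 'AddressLine1', 'ColumnId'),
           ct.SYS_CHANGE_COLUMNS) AS address_line1_changed
FROM CHANGETABLE(CHANGES SalesLT.Address, 0) AS ct;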
So it is telling us that one particular column has changed, and when we drill down we will see that it is AddressLine1. Now, this last bit: I've said that change tracking data may flush to disk, which is perfectly okay, but in addition it can auto clean up, which means that after the specified retention period it will disappear. So what CHANGE_TRACKING_MIN_VALID_VERSION is saying is: we are currently on a particular version number, and this is the earliest version for which we still have retained information, that is, information that hasn't been auto cleaned up. If the last sync is later than or equal to this, then it's fine; we don't have to refresh the entire table.
However, if the last time we synchronized was version 80 and the earliest we can get back to is version 90, then we'll have to refresh the entire table, because we don't know what happened in versions 81, 82, 83, 84 and so on. So this is change tracking, or data change tracking: you enable it for a database, then you enable it for as many tables as you want, and look at the information you can get out. You can get the primary key, but you can't get other columns.
So if I try selecting CT.AddressLine1, it's not going to work. You can get what sort of operation it was, in this case an update, and with a bit more work you can see which particular columns were changed, but you can't see what the values were changed from and to. If that's sufficient, it is fairly lightweight, certainly compared to change data capture, and it requires less storage. So this is how you can do change tracking: just right-click on a database or a table and go to Change Tracking on the left-hand side.
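To round the section off, here is a minimal sketch of that refresh-or-sync decision, assuming an application that stored version 80 after its last sync:

-- Decide whether an app can sync incrementally or must reload the table.
DECLARE @last_sync_version bigint = 80;  -- illustrative stored value

IF @last_sync_version >= CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID('SalesLT.Address'))
BEGIN
    -- Retention still covers our last sync: fetch only the changes.
    SELECT ct.AddressID, ct.SYS_CHANGE_OPERATION
    FROM CHANGETABLE(CHANGES SalesLT.Address, @last_sync_version) AS ct;
END
ELSE
BEGIN
    -- Changes since our last sync have been cleaned up: reload everything.
    SELECT AddressID, AddressLine1, City, PostalCode FROM SalesLT.Address;
END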