CompTIA Linux+ XK0-005 – Unit 08 – System Maintenance Part 1


1. System Maintenance

All right. So in this unit we’re going to talk about system maintenance. Our goal here is to make sure that we are prepared for the worst things to happen, and yes, that means we’re going to start off with kind of a negative tone. We’re going to say, look, we need to back up our data, know how we’re going to restore it, have a plan, and be ready should there ever be a disaster or anything that causes us to have to bring back our server information. We’re also going to look at ways to get rid of the mundane maintenance work, the daily tasks, the weekly tasks, whatever they are.

We’re going to talk about how you can schedule tasks so that things run by themselves in the background, and we’ll look at a couple of ways of doing that. Another part of maintenance means being a little proactive, that is, monitoring the system performance and, I hope, creating a baseline so you know how the system behaves in a good environment. That way, when people complain that it’s bad, you can see the differences. Another proactive part is logging.

Now, this is the accounting part of AAA; in another unit I talked about authentication, authorization, and accounting. So now we’re going to talk about logging, and it’s more than just user activity. We’re going to be logging a lot of information, again, hopefully to be proactive in maintaining our systems before they crash, rather than using this information in what we call a postmortem analysis. So that’s our goal in system maintenance: to cover all of those areas and show you what utilities and facilities you have available to you in Linux.

2. Topic A: Backup and Restore

All right, so in our first section here, we’re going to talk about backup and restore. Now, I want you to think about what we’re going to do with backup and restore beyond just, oh, I’m backing up my data so I can restore it later. We actually have to make some plans so that our backup and our restore meet our business needs, and I hope to discuss that even more as we go through. So what I’m saying right now will make sense in just a few more moments.

But the goal of the plan is to try to minimize downtime. Remember that backing up systems can cause downtime, and of course, how long it takes to restore a system can also affect how long systems are down. So keep that in mind as we talk about the plans, because you’re going to have to come up with a backup and restore plan that meets the business needs as best you can, keeping things online while doing the backup and recovering quickly from a disaster or a failure with your restore.

3. Compression, Part 1

Now, one of the first things that we look at when it comes to manipulating files and backup is this idea of compression. Now, compression used to be more important than it is today for storing files on hard drives. The goal of compression was to save space. I mean, that’s the whole purpose: you make the file smaller on the hard drive. We said it was more important in the old days because back then the amount of storage we had on hard drives was not as plentiful as it is today. We were happy and excited with what we had. I remember having an 800 megabyte hard drive and thinking that was just going through the roof. It was exciting having the first gigabyte drive. Ten gigs, 100 gigs. And today, I think everybody’s spoiled.

You go out and buy 1.5 terabytes, two terabytes, and you do so for just a couple hundred dollars or less. So compression is not as important when it comes to storing files on hard drives; we have a lot of space. We do see compression, though, as a great means of managing bandwidth utilization when transferring files over those traditionally slow wide area networks. But today, when I see reports about OC-768 connections from Chicago over to New York City, then I’m saying, again, well, that kind of compression is probably something of the past. Now we really talk about compression in phone traffic: Voice over IP compressing the RTP traffic on the WAN, again, just to reduce the total latency. The terms and uses of compression are changing.

4. Compression, Part 2

Now that I’ve told you all of that, we’ll talk about what compression is. It is what they call a mathematical transformation of data. Now, that just simply says that I’m going to take a piece of data, however much it is, and find a way mathematically to reduce its size. The most common example I can give you is that a compression algorithm looks for patterns of data inside of your bulk amount of information, comes up with a smaller representation of that data, and then does a substitution. In other words, it makes a little chart, a little legend, that says if I put a one in there, that really stands for this big chunk of data.

So anytime you decompress and you see a one, you use this as its replacement; it’s a kind of substitution. Now, that is an extremely simplistic representation of compression. It’s really much more complex than that, but it gets the point across that we are replacing frequently seen chunks of data with smaller representations so that we can actually shrink the file. All right, now, there are other types of compression out there. In fact, there are many different kinds, and they have different methods for how they go about doing that compression. When we talk about compression algorithms, we have terms that we use: a symmetric algorithm or an asymmetric one.

Now, a symmetric algorithm just simply says that the amount of work or time it takes to compress a file and decompress the file is roughly the same. So if I just use fake numbers here as an example, if it takes 5 seconds to compress the file and roughly 5 seconds to decompress it, then that would be a symmetric algorithm. An asymmetric algorithm is one where the time to compress is very different from the time to decompress. So maybe it takes 10 seconds to compress a file, but only 1 second to decompress, or vice versa. That would be an asymmetric algorithm.
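If you want to see the symmetric-versus-asymmetric idea for yourself, a minimal sketch is to time both directions with the shell’s time keyword. The file name here is just a placeholder, and xz (a good illustration of asymmetry, if it happens to be installed) isn’t one of the utilities this unit covers:

```bash
# Time both directions on a sample file (bigfile is a placeholder name).
time gzip -9 bigfile        # compress at the highest level; makes bigfile.gz
time gunzip bigfile.gz      # decompress; gzip is roughly symmetric

# xz, if installed, tends to be asymmetric: compressing is usually much
# slower than decompressing.
time xz -9 bigfile          # makes bigfile.xz (slow)
time unxz bigfile.xz        # restores bigfile (fast)
```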

Another part of the description of these algorithms is whether they’re lossless or lossy. Lossless means that when I compress the file and then go through the decompression, the resulting decompressed file is exactly the same as the file that I started with. A lossy algorithm, which usually gives you much better compression, means the file that comes out of decompression isn’t necessarily in the exact form it started in. Now, I’m not saying that it went through a Star Trek transporter beam and got mutilated, but it’s not the same file. So that might drive a decision about the types of compression you want to use.
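One way to convince yourself an algorithm is lossless is to checksum the file before and after a round trip. A quick sketch, assuming a file named data.txt exists:

```bash
sha256sum data.txt          # note the hash of the original
gzip data.txt               # replaces data.txt with data.txt.gz
gunzip data.txt.gz          # restores data.txt
sha256sum data.txt          # identical hash means a lossless round trip
```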

5. Algorithms, Part 1

The other thing we look at with a compression algorithm is what we call the compression ratio. In other words, how much do we get as a benefit of storage savings by using a particular algorithm? So if I had an algorithm that did a five to one ratio of compression, that would mean that a ten meg file after compression is only two megs in size. That would actually be a good savings, and if I did that to a number of different files, it adds up. Now, it’s also important to know that your compression ratio is somewhat tied to the algorithm, but it’s also tied to the type of file that you’re compressing. Some files just don’t compress very well.
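gzip will actually report the ratio for you, so you don’t have to do the division by hand. A quick sketch, with report.log as a stand-in file name:

```bash
gzip -v report.log          # -v prints the compression ratio as it works
gzip -l report.log.gz       # -l lists compressed size, original size, ratio
```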

There could be a number of different graphics or videos or data types or other types of files that just don’t have patterns that can be found for the compression process. Obviously, you’re probably not going to expect a lot if you try to compress an already compressed file even further. So the ratio is going to be different depending on the type of file and the type of algorithm. But that’s one of the ways that we judge how good the compression is: how much did it save us?
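You can demonstrate this dependence on the data itself with two streams: one full of repeated patterns and one with none. This sketch uses /dev/zero and /dev/urandom, which exist on any Linux system:

```bash
# A megabyte of zeros is all pattern, so it shrinks to almost nothing.
head -c 1000000 /dev/zero    | gzip | wc -c   # a few kilobytes
# A megabyte of random bytes has no patterns to find, so it barely shrinks.
head -c 1000000 /dev/urandom | gzip | wc -c   # close to 1,000,000
# An already-compressed file looks like random data, hence the poor ratio.
```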

6. Algorithms, Part 2

Now, there are a number of different algorithms, and I’m going to give you the names of a few of them, just as an example of what we mean by lots of different ways of doing compression. I’m not going to tell you how each algorithm works; we have other things to talk about. Originally there was one called Huffman coding. It’s considered a lossless type of compression, but it’s rarely used on its own today. Another one that you might have heard about is RLE, or run-length encoding. Again, a lossless type of algorithm; it’s very old, and we don’t use it so much anymore. There was Lempel-Ziv and its variations, another lossless algorithm that used small chunks of data in the way it went through the compression. Then there’s Burrows-Wheeler.

It was another lossless algorithm, and it sorted blocks of data. The discrete cosine transform, the DCT, was an example of a very lossy method of doing compression. Wavelet compression is another lossy method. And having said all that, you’re no better off knowing which one you want to use, or which algorithm the program you buy uses, because it’s just difficult to go through and say here’s how each and every one of them actually does the math. And there are many more algorithms besides the ones I’ve just told you about. So doing a little bit of research into the choices you use, or just sticking with the traditional algorithms that are already part of the Linux operating system, might be your best bet.

7. Compressed File Types

Now, there are many different types of compressed files that you’ll see, and they have different extensions that we’re used to seeing in a Windows environment. Remember that in the world of Linux, we don’t need extensions. We often put extensions, or what appear to be extensions, at the end of files so that we know by looking at the name that it’s a compressed file. So some of the things that you might see: .zip, which to us means it’s been zipped up; GNU zip, or gzip, which you sometimes see as an extension of just .gz; there’s bzip; you might even just have a .Z. Your pictures, your GIF files (or “jiff,” I’ve heard it pronounced either way), are a compressed type of picture.

PNG, JPEG, MPEG, and MP3s are all compressed file types that you’re going to be encountering. They all use different algorithms, algorithms that are probably designed to work best with the type of data being represented. For instance, JPEG works well with graphical image information, all of those little RGB pixels it’s storing, whereas zip or GNU zip (gzip) might be great for data files. Again, different compression algorithms; we might see them as different file types or extensions if we were running Windows, but in Linux, like I said, we often put these in as extensions or as part of the file name just so we can tell what it is by looking at it.
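Because Linux doesn’t rely on extensions, the file command is the honest way to check what a file really is: it reads the contents, not the name. The names below are just examples:

```bash
file archive.gz     # reports something like: gzip compressed data ...
file archive.bz2    # reports: bzip2 compressed data ...
file photo.jpg      # reports: JPEG image data ...
```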

8. Compression Utilities

So in the world of Linux, there are some commands, some programs, that you can use to compress your files. The first one has the obvious name compress. It’s an older Unix compression utility, and when it was done, it would create a .Z file. Again, that was just the way we represented the algorithm. Gzip, or GNU zip, is generally the replacement that we now use over compress. It creates a file that you often see ending in .gz, and to unzip it you would use what I call the “gun-zip”: it’s gunzip, the GNU unzip program. Another common program is called bzip, which creates what we see as a .bz2 file.

That’s because the program was bzip2. These are included with Linux, and there is nothing wrong with those algorithms, but most likely you’re going to be doing everything with GNU zip. That seems to be universally the common utility used in anything Linux. What you’ll see is that when you download files, almost every file probably ends in .gz, indicating that you just downloaded a compressed file. The goal, again, is making the transfer faster; then you can uncompress it and work with it on your system.
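Here are the three utilities side by side. Each one replaces the original file with a compressed copy, so the sketch below round-trips each in turn; notes.txt is a placeholder, and compress may need to be installed separately on modern distributions:

```bash
compress notes.txt && uncompress notes.txt.Z    # old Unix tool, .Z file
gzip notes.txt     && gunzip notes.txt.gz       # the usual choice, .gz file
bzip2 notes.txt    && bunzip2 notes.txt.bz2     # better ratio, .bz2 file
```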

9. GNU zip = gzip

All right, so GNU zip, or gzip, as I said, is your replacement for compress, and like many other utilities, it has a number of different options. So when you type in the command gzip, you put in the name of the file and then choose the options that make sense for you. Obviously one of the big ones: if you download a file that ends in that .gz and you’re saying, oh, it’s zipped, I want to decompress it, the option would be -d for decompress. You can also put in a dash followed by a number to pick the compression level you want, knowing that the higher the compression level you ask for, the longer it may take to actually compress.

Like so many other utilities, it has a -f to force something to happen, ignoring any error conditions. You can ask it to test a file with a -t. A -v, verbose, means that while you’re doing the job, keep me up to date, show me all that information. And then you have the GNU unzip, or gunzip, which I like to say as “gun-zip” because that’s what it looks like; it unzips those files as well. So anyway, you’ve got these options that work with gzip, the GNU zip, and gunzip, and the more you know about the options, the more choices you’re going to have in how to compress or decompress your files.
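Putting those options together, a quick sketch (access.log is just an example name):

```bash
gzip -9 -v access.log       # -9: best compression, -v: show the ratio
gzip -t access.log.gz       # -t: test integrity; silence means it's fine
gzip -d access.log.gz       # -d: decompress, same as gunzip access.log.gz
gzip -f access.log          # -f: force, overwriting any existing .gz
```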

10. bzip2

Now, the bzip2 algorithm, or utility, gives you a higher compression ratio than GNU zip, but it’s not as popular. It also has options: there’s bzip2 and its companion bunzip2. Again, you simply supply the options, include the name of the file, and let it go do its job. Now, the reason it’s not as popular may just be that it hasn’t caught on. Not every version of Linux may have bzip2, which means that if I transfer a file to you that’s been compressed with this algorithm, you might not be able to decompress it. So it’s not helpful, and it comes back to availability. But I think at some point we might see this becoming more and more popular. Right now, GNU zip is still our biggest choice, but if you really are looking for better compression ratios, take a look at bzip2.
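To compare the two ratios on your own data, one sketch is below. The -k (keep the original) flag is long-standing in bzip2 and available in newer versions of gzip; big.log is a placeholder name:

```bash
gzip  -k -9 big.log         # produces big.log.gz, keeps big.log
bzip2 -k -9 big.log         # produces big.log.bz2, keeps big.log
ls -l big.log*              # compare all three sizes side by side
```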

11. Compression via the GUI

Now, maybe you want to avoid the command line. I’ve been making fun of this throughout the entire course because I am one of those old dinosaurs: the command line was all I had. I remember only having monochromatic screens with green print on a black background; some of you were lucky enough to have ones with white print. Anyway, if you wanted to, you can right-click the file in the GUI and simply say that you want to zip it. You can even choose the type of algorithm that you want to use to zip up, or compress, that file, and be done. It is that easy. You don’t have to go through all of the work on the command line. But again, the command line is kind of cool, as I said before, when we get into scripting or other things we want to do that make life a little more automated for us.

12. Demo – Compressing and Extracting Data

All right, we are going to do a little bit of work here with compressing and extracting our data. So what I’m going to do is verify I’m in my home directory. Now I see, look at that, I’m not. So first of all, let’s cd back to my home directory, and now a pwd. There we go. Now I’m back where I need to be. I’m going to clear this all off and we’re going to start over. So here we go. We’re going to make a directory and we’re going to call it backup_test. Hopefully that reminded you of all those commands; it was a little flashback. I’d like to say I did that on purpose to make you remember those commands. Now we’re going to cd into this new backup_test directory that we just made, and ls, and there’s nothing there.

Okay, there shouldn’t be, because we didn’t put anything there; we just made a new directory. Now we’re going to execute a command that needs us to be root, so we’re going to use sudo. We’re going to copy everything in /var/log that is named *.log into this current directory, ~/backup_test. Now, the reason I need to do this as root is that the permissions on those files require it; you can’t read them as a regular user. So I have to have the right permissions to do that copying. And then we’re going to change the permissions on everything in there to 777. Again, sudo, because I don’t have permission to do that otherwise.

But it remembered who I was this time when I put in the password, and remember, I earlier changed the sudoers file, so as Trainer I can execute these. Now let’s do an ls -lF, and I have some files in there, so that’s good. I’ve got some log files, I’ve got some stuff in that backup_test directory. And the whole reason we did that was so that we could start going through and zipping up some files. In fact, we’re going to zip up the auth.log file. Now, what I want you to see is that it’s 26,985 bytes right now. So let’s go in there and run gzip auth.log and hit Enter. This is that same listing command, and now look, it’s got a new extension, .gz. That means it’s been zipped. It was 26,985.

Now its size is 2,848. So that is a huge, almost ten to one compression that we got off of that one file. All right. Whatever you zip, you ought to be able to unzip. So I like the way you would pronounce this one, the “gun-zip.” It’s actually the GNU unzip, but it’s more fun to say gun-zip. All right, auth.log.gz is the name of the new file. Bam, hit Enter, go back to the listing, and it’s back to 26,985. So it certainly was a lossless type of option that we have. Okay, so now that we’ve done that, let’s be more creative. Let’s bzip2 everything that we have. bzip2 with an asterisk means I want all of them compressed. Back to the ls -lF, and I’ve now got all of those things compressed.

I mean, if you look at the sizes of these files compared to the sizes they were before, I’ve got them all zipped. And look at that, they’ve got that .bz2 extension. So, again, I’ve done all of the compressions, and now we can unzip. bunzip2 everything. Boom. And we’re back to normal with the ls -lF. Anyway, lots of options when it comes to the different types of algorithms for doing those compressions. And hopefully that seemed pretty straightforward. You saw the bzip2 and the gzip, and they both gave you quite a good amount of compression, and they both appeared to be lossless in that they brought the files back to their original sizes.
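For reference, here is the whole demo as a runnable sequence. Exact log file names vary by distribution, and backup_test is the directory name used above:

```bash
mkdir ~/backup_test && cd ~/backup_test
sudo cp /var/log/*.log .    # root is needed to read some of the logs
sudo chmod 777 *            # open permissions up for the exercise
ls -lF                      # note the size of auth.log
gzip auth.log               # creates auth.log.gz, much smaller
gunzip auth.log.gz          # back to the original size: lossless
bzip2 *                     # compress everything to .bz2
bunzip2 *.bz2               # and bring it all back
```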
