
Pass Your Splunk SPLK-2002 Exam Easily!

100% Real Splunk SPLK-2002 Exam Questions & Answers, Accurate & Verified By IT Experts

Instant Download, Free Fast Updates, 99.6% Pass Rate

SPLK-2002 Premium Bundle

$74.99

Splunk SPLK-2002 Premium Bundle

SPLK-2002 Premium File: 90 Questions & Answers

Last Update: Nov 12, 2024

SPLK-2002 Training Course: 80 Video Lectures

The SPLK-2002 Bundle gives you unlimited access to "SPLK-2002" files. However, this does not replace the need for a .vce exam simulator. To download the VCE exam simulator, click here.


Splunk SPLK-2002 Practice Test Questions in VCE Format

File                                                          | Votes | Size     | Date
Splunk.testking.SPLK-2002.v2024-10-15.by.teddy.50q.vce        | 1     | 69.8 KB  | Oct 16, 2024
Splunk.selftestengine.SPLK-2002.v2020-07-24.by.liuwei.49q.vce | 2     | 68.47 KB | Jul 24, 2020
Splunk.Test-king.SPLK-2002.v2019-10-22.by.Bat.43q.vce         | 4     | 61.09 KB | Oct 27, 2019

Splunk SPLK-2002 Practice Test Questions, Exam Dumps

Splunk SPLK-2002 Splunk Enterprise Certified Architect exam dumps, practice test questions, study guide and video training course to help you study and pass quickly and easily. You need the Avanset VCE exam simulator in order to study the Splunk SPLK-2002 certification exam dumps and practice test questions in VCE format.

Getting started with Splunk

1. Importing Data to Splunk

Hey everyone and welcome back. Now that we have Splunk up and running, we can go ahead with the hands-on and look into some of the amazing features that Splunk has to offer. Splunk really has a lot of options, and throughout the video series we will be exploring them. To begin with, though, we'll add some sample data so that we can start searching and understand various search-specific aspects. In order to do that, you go to Settings and click on Add Data. You can skip the introduction, and you need to click on Upload. Now, before you click on Upload, you need to have some sample data. I have two files over here; this is the sample data that we will be exploring right now, and I'll be uploading both of these files into Splunk. So let me click on Upload, click on Select File, and choose the first data set. I select the access file, click Open, and go to Next. You can now give it access_combined as the source type: if you type access_combined, you will see an option appear; select it and go to Next. Leave everything at the defaults and click on Submit. Similarly, we'll add one more data set: I click on Upload, go to Select File, this time select the file called secure, go to Next, and submit it as well. So now we have both files uploaded. If I click on the Splunk Enterprise logo, it takes me back to the default page, and from there you can click on Search & Reporting. Search & Reporting is where you can search all of your data and do all kinds of fancy stuff. Now, if you look, there are a total of 130 events, and if you click on Data Summary, you will see that there are two sources: one is the access log, and the second is the secure log. Within the source types, you will likewise see two: one is access_combined, and the other is linux_secure. We will be discussing source types in an upcoming video. Before we go further, I would quickly like to show you what exactly these files contain. Let me open the access file with my text editor; this is an Apache (or, equally, NGINX) web server access log, which contains various details related to the requests being made by clients. Let me do one thing: I'll copy a sample request, open up a new text document, and paste the sample request there. Whatever log file you put into a log monitoring system, the reason you use such a system is to get statistical information: whether the web server is working, whether any security breaches are happening, how many failed authentication attempts there are, and so on. So if I want meaningful information from this log file, the first thing I have to do is parse it. Since we are taking just a single line as an example, let's parse it manually. I'll divide the fields onto multiple lines, and once we remove the two dashes, there are six major fields. The first one is the client IP address. The second is the timestamp. The third is the HTTP method; there can be various methods, such as GET, POST, and a few others. The fourth, the 200 that you see over here, is the HTTP response code: depending on whether the request was successful, failed, or was denied access, the response code changes. The fifth field is essentially the number of bytes transferred, then comes the referrer, and the last field is the user agent.
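To make the field breakdown concrete, here is a hypothetical access_combined line (an illustrative example modelled on this kind of data, not copied from the sample file) split into the six fields just described:

    91.205.189.15 - - [15/Oct/2024:18:20:56] "GET /cart.do?action=view HTTP/1.1" 200 2252 "http://www.example.com/" "Mozilla/5.0"

    1. 91.205.189.15              client IP address
    2. [15/Oct/2024:18:20:56]     timestamp
    3. "GET /cart.do ... "        HTTP method and requested URI
    4. 200                        HTTP response code
    5. 2252                       bytes transferred
    6. "http://..." "Mozilla/5.0" referrer and user agent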
So now this one line that we have over here, after parsing, becomes much easier to understand. I can see that this specific request came from a specific IP address. That IP address may belong to Russia; it may belong to the US. So, for example, if I want to see how many requests to my website are coming from Russia, then the log monitoring system should be able to parse the log file in such a way that each field is meaningful. This is very, very important. All right, so now let's go back to Splunk. I'll go to the Data Summary, go to Sources, and click on the access log. This is the access log that we were discussing. Now, if you look over here, Splunk has extracted a lot of information from the log file. Let me select an event and expand it, and you will see that Splunk actually extracted a huge amount of information. One of the fields it has extracted is the client IP; if we compare with our manual parsing, it matches. The next thing it has extracted is the method, so this is a GET request. It has also extracted the request timestamp, the status, the URI, and the user agent. So Splunk has done all of this for us, and since Splunk has extracted these fields, searching and reporting become extremely simple. Let me quickly give you one example. Let's say the business comes to you with a requirement for a list of the IP addresses that are visiting your website. All I have to do is a stats count by the field that I want to look at, and the field here is clientip. If I quickly do a stats count by clientip, you will see that Splunk extracts exactly that field and counts by it (a sketch of this search appears below). This is what parsing gives you: any log that you upload to Splunk, if you want meaningful information from it, always needs to get parsed. If it does not get parsed, it will not really have a proper meaning. So this is very important.
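As a rough sketch, the search just described could look like this, assuming the uploaded file is searchable as source="access.log" (your source name may differ):

    source="access.log" | stats count by clientip

This returns one row per client IP together with how many requests it made, which is exactly the list of visiting IP addresses the business asked for.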
The next thing I quickly want to show you before we conclude this video is the second log file that we uploaded. If I go to the Data Summary, go to Sources, and open the secure log, I'll select All Time as the time range, and you'll see there are a total of around 9,800 events. If I open one, you will see that the log line contains something like: a failed password for an invalid user from a specific IP address on port 3551 over SSH2. Let me do one thing: I'll copy a line, put it into my notepad, save it as secure, and let's try to manually parse this file as well. As we already know, in order to parse it, we have to derive meaningful information from it. The first field is the timestamp, the second is the hostname, the third is the service name (the sshd process), and the fourth is the message; this message contains the username for which the attempt was made and the IP address from which the failed authentication attempt came. One example: in one of the organisations I worked with, I had recently joined and saw that one of the servers had more than 10,000 failed attempts from an IP address that belonged to Russia. So many good memories. We went ahead and investigated, and it turned out that the firewall was open to everyone. We'll discuss those interesting aspects later. So only if these fields are parsed will you have meaningful information. Now, if I look at what fields Splunk is currently parsing here, it is parsing the PID and the process name, which is sshd, but it is not really parsing things like the IP address from which the failed authentication attempt was made. So for one use case, let's assume I want to see from which IP addresses the failed authentication attempts were made. How will I find that? I'll be able to find that out only after these logs are parsed; currently they are not, and thus a lot of information is out of reach. This is why one of the most important aspects of a log monitoring system is that the logs get parsed. With this, we'll conclude the video, and in the next one we'll look into how exactly we can make sure that these specific logs are parsed.
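For reference, a typical sshd failure line of the kind parsed in this video looks roughly like this (a made-up example; the hostname, PID, username, and IP are placeholders):

    Mar  1 04:17:32 host-01 sshd[5276]: Failed password for invalid user appserver from 203.0.113.15 port 3551 ssh2

    timestamp -> Mar  1 04:17:32
    hostname  -> host-01
    process   -> sshd (PID 5276)
    message   -> username (appserver), source IP (203.0.113.15), source port (3551)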

2. Security Use-Case - Finding Attack Vectors

Hey everyone and welcome back. In today's video, we will look into how we can parse this var/log/secure file, which contains the authentication attempts. Now, for this, I have three use cases that we intend to complete by the end of the video. First, I want to find the total number of failed login attempts that were made against the server. The second is to find the list of IP addresses from which the failed login attempts were made, and the third is to find the list of countries from which the failed login attempts were made. These are three questions that a typical security engineer or a typical DevOps guy might really ask, and this is the use case we will be completing by the end of the video. So with this, let's start. We know that this var/log/secure file is not parsed; a lot of fields are missing. You do have the option to write a regular expression: if you click on Extract New Fields, you can select a sample event here, and if I click Next, it asks whether we want to extract based on regular expressions or delimiters. If I select regular expression, then I'll have to write a regular expression for parsing each of these fields within my log source. However, this is something that I do not want to do right away; we'll be looking into it, but not now. One of the easiest ways to take care of this is to download a Splunk add-on that contains these regular expressions and can automatically parse the data. This is very important to remember. To do that, I go to the Splunk Enterprise home page and click Find More Apps. Within this, I have to search for the app; since this is a Linux file, I'll search for Linux, and there is a Splunk Add-on for Unix and Linux. This add-on contains various regular expressions that can automatically parse the Linux-specific log files. So I'll click on Install here. You will have to give the Splunk username and password that you created, so I'll quickly give mine, accept the terms and conditions, and go ahead and install the Splunk Add-on for Unix and Linux. In an upcoming video we will discuss in more depth what add-ons and apps are, but for now I just want a straightforward demo. Once the add-on installation is completed, you will have to restart your Splunk, so let's quickly restart it. Perfect, the restart has been successful. Let me quickly log in again, and once logged in, I'll go back to my home page, and you will see that the new add-on got installed.
We will not touch it right now; we'll go to our Search & Reporting app and select the secure log source that we had selected earlier. I'll select All Time, and we have around 9,000 events. Now, as you can see on the left, there are a lot of fields that were automatically extracted. If I open up a sample line, you see so many fields got extracted: earlier only the PID and the process got extracted, but now you also have src, and src contains the source IP. There is also a field called action, and action indicates whether the login attempt that was made was successful or whether it was a failed login attempt. So this Splunk add-on that we downloaded already had the regexes, and those regexes automatically parse the data. Going back to our use cases, the first thing we want to find is the number of failed login attempts. This is pretty simple, because if you open up an event, you will see that the action field contains "failure"; that means the specific log line is a failed authentication attempt. On the left-hand side, under the interesting fields, action has two values: one is failure, and the second is success. So you see I have roughly a 90% failure rate and a 9% success rate; most of the attempts made against the server were failures. This is how you complete the first use case. The second use case is finding the list of IP addresses from which the login attempts were made. If you quickly open up an event again, you will see that the src field contains the IP address the attempt was made from. If I click on src on the left, you will see there are more than 100 IP addresses, but the field summary only shows the top values, not the whole list. If you want the whole list, you can make use of the stats command; stats is essentially statistics. You do a stats count, and you have to give the field name, which here is src, so I'll say stats count by src. Press Enter, and it gives you the list of IP addresses from which attempts were made (a sketch of this search appears below). However, the use case says to find the list of IP addresses from which failed login attempts were made; this is important. The IP addresses you see here cover both successful as well as failed login attempts, so we need to refine the search by adding the action field.
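A minimal sketch of this first stats search, assuming the secure file is searchable as source="secure.log" (adjust to your actual source name):

    source="secure.log" | stats count by src

Each row is one source IP with the number of login attempts seen from it, successful and failed combined, which is why the search still needs refining.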
Action has two values: failure and success. So I'll say action equals failure, and if I press Enter, it gives me a list of all the logs with failed actions. If you filter on success instead, all the failed attempts are gone, and only successful attempts are present. So I'll go back to action equals failure; you can either write it in the search bar or click on failure under the action field, and it will automatically be added to the query bar. Now we add the stats count by src again, and the IP addresses we see now are the ones from which the failed attempts were actually made. This fulfils the second use case. The third use case is finding a list of countries from which failed login attempts were made. All right? For that you need more information. IP addresses by themselves sometimes do not make much sense; if you know the countries, you get a better picture. For example, if your websites are hosted in India and your entire staff is in the India region, and you see a lot of successful attempts from someone in Russia, then there is something wrong, right? That is something that you need to be alerted to. So let's look into how we can do that. Let me go back and remove the stats command. Now, to find the country associated with a specific field: one thing I know is that the IP address is in a field called src, and what I want is to map that IP address to a specific country, or even to a specific city. To do that in Splunk, you have a command called iplocation, and iplocation allows you to do exactly what we were discussing. The way it works is you write iplocation followed by the name of the field where the IP address is, which is src. So I'll say "iplocation src", adding a pipe before it so that the results of the search are piped into this new command. Now, if I press Enter, on the left-hand side you will see two interesting fields that popped up: one is Country, and the second is City. I can see that there are lots of attempts made from various countries like Korea, Russia, Finland, Mexico, and Brazil, and it can even give you the list of city names from which the attempts were made. Now, to make things more interesting, since this use case asks for the list of countries from which failed login attempts were made, let's make the query better.
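Sketches of the refined searches, under the same source assumption: the first completes use case two, and the second adds the iplocation enrichment for use case three:

    source="secure.log" action=failure | stats count by src
    source="secure.log" action=failure | iplocation src

The first line counts failed attempts per source IP; the second adds Country and City fields to each failed-login event, derived from the src field.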
The next command that we'll make use of is the table command, because we need the information in tabular format. So, within the table, what do we need? The first thing is the source, so I'll say src. The next thing is the city, so I'll say City, and the next thing you would typically need is the country, so I'll include Country as well. All right, so let's format this properly, and if I press Enter now, you see much better information: you have the source, you have the country, and you also have the city-specific information here. Now, the question that really comes up is: where does Splunk get this information? Splunk does not build this list by itself. A very important part to remember is that it comes from an external file which Splunk installs by default within its folder. I'll quickly show you what it looks like. I connect to the container with a docker exec into bash, and after I press Enter, I'm inside the container. If you cd into the Splunk directory, there is a directory called share, and if I list it, you'll see a GeoLite2-City MMDB file. This is the file from which a lot of the country and city information comes. Now, if I just remove the City column and press Enter, I have very clean information on the source as well as the country. Generally, for a city you might not get very precise information, but for a country you really do get good data. One last thing I wanted to show you before we conclude this video is the geostats command. The commands that we are discussing here are all available in the documentation for you to refer to; if you look up the iplocation command, you also have samples, like "iplocation clientip", various kinds of samples that you can try out in your own test environment. So let me show you something interesting, because this is something I show to business people. For example, a CTO comes along, and a raw list of IP addresses means nothing to him; you have to show him a nice little graph. A businessperson will not understand what IP addresses are, but a nice little graph is something he will understand. So let's look at how we can create a graph. You now know that the iplocation command creates a field called Country, so what I'll do is use a command called geostats: I'll say geostats count by Country (both final queries are sketched below), and I'll press Enter.
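Here are the two final queries from this part as sketches, with the same source assumption as before:

    source="secure.log" action=failure | iplocation src | table src, City, Country
    source="secure.log" action=failure | iplocation src | geostats count by Country

The table variant produces the tabular country report; the geostats variant aggregates the counts geographically so the map visualization can plot them.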
And now you see that within the Visualization tab, I actually get a much better picture. I can see that someone from China is attempting a brute force; there is someone from Russia, someone from the United States, and someone from Brazil. If you present it in this way, it becomes much simpler both for the business and for the security monitoring to happen. Now, within the visualisation page, there are a lot of Splunk visualizations. The one we are using currently is the map. However, there are other visualisation types available that you can use; the easiest one, definitely if you are going with iplocation-style statistics, is the cluster map. So it becomes much easier. With this, we'll conclude the video. I hope you have begun to understand the power of Splunk and how easy it is to do things in it, and I'm also sure the use cases are something you found useful as part of your initial learning. In the upcoming videos, we will discuss various other aspects of Splunk in more detail.

3. Search Processing Language (SPL)

Hey everyone and welcome back. In today's video, we will be discussing the Search Processing Language, which is also referred to as SPL. SPL is at the heart of how well your Splunk rules are written and how quickly you can get an alert on a specific condition. SPL stands for Search Processing Language, and it is a language with great capabilities that allows you to ask questions of the data that is indexed within your Splunk. A sample of the questions you might ask of the data: find the list of IP addresses and country locations of failed SSH authentication attempts. So, if you have the var/log/secure file that contains the failed authentication attempts, you can find the list of IP addresses and use the iplocation command to map those IP addresses to country and, in some cases, city-specific locations. That is one example. A second example is to compare that list of IPs with the IPs in the access log and see what activities they performed; this is something like a correlation search. In the first example, you find the list of IP addresses that are failing to log into your SSH server, or, I would say, the attack vectors. Once you have those IP addresses, you compare them against the access log file to see what activities they are performing on your website, such as browsing, running SQL injection, and so on. Now, Splunk SPL provides around 140 commands, which allow us to search, correlate, analyze, and even visualise data. So let's start from scratch, so that we are on the same page. Generally, when you log into a Linux box and do an ls in an empty home directory, you do not find any data, because no files are present. However, if I go to the /etc directory and do an ls, I can see that there are a lot of files present. If I want a more granular view, I can do an ls -l, which gives me the list of files along with their permissions, timestamps, and even sizes. And if I want to find a specific file, I can make use of the grep command and say "host"; Linux will then only show me the files which have this specific keyword present. Now, if you want to run similar commands and ask questions about the data that is indexed, you need to run SPL commands; plain Linux commands would definitely not work in Splunk. Splunk has its own set of commands, referred to as SPL commands, which allow you to query the data indexed within your Splunk instance. If you go to the Splunk documentation, or simply type "Splunk search reference" into Google, you will be presented with this documentation. Within it, there is a tab called Search Commands, and under it you will see that there are a lot of commands. Each of these commands serves a specific use case, depending on what exactly you intend to ask of the indexed data: whether you want to search it, build visualisation dashboards, or various other things. These are what are referred to as the SPL commands, and each command has its own reference page, very similar to Linux.
Every Linux command has a specific use case, and for that use case you use a specific command. The same is true for Splunk, and your ability to understand these commands and use them effectively is critical, because when you start ingesting a large amount of security logs into Splunk, how well you create dashboards and reports that can surface anomalies in the data matters a lot. So understanding the search commands is important, but do know that memorising each and every search command is not required; it really depends on the use case, just like in Linux. Nobody knows every Linux command: whenever there is a use case, you look into the documentation and run the appropriate command. Similarly, you do not need to know every SPL command, but you do need to know the ones you will use regularly, and, depending on the use case, you can pick up the others; since the documentation is good, that is quite easy to do. Now that we understand the basics of what Splunk SPL is, let's look at the basic structure of SPL that you would generally see. First you have the search filter, then a transformation step (our eval), then the report step (our stats), and then the cleanup (our rename); a skeleton of this is sketched just below. The separator between the stages is called a pipe, and if you recall our Linux commands, you saw that we used a pipe there as well. Whenever I run the ls command and pipe it to grep, whatever results come out of ls are visible to grep, which can then filter them. The same goes for the basic structure of SPL: whatever comes out of your search filter is piped to the second command, the output of the second command is piped to the third, and so on to the fourth. We'll be walking through this basic structure in today's video. The first thing we are doing is a search with the source equal to the access log. If you look at our Splunk, there are two files that we have imported: one is linux_secure, and one is access_combined, and within the sources there are likewise two: a secure log and an access log. So I click on the access log here and set the time range to All Time, and you see there are around 130 events. This is the search filter. Now, you can also add other conditions here; for example, you can say clientip equals 182.236.164.11, and if I press Enter, it will only show the events associated with that specific IP address. This is referred to as the search filter, and it does not necessarily need to be a single condition; it can be anything that accurately points to the right set of data. Currently, since I added the client IP, the total number of events has gone down to 129. If this is the event set you are looking for, then this is the right search filter: any search filter you write needs to point only to the events that are relevant for your investigation. All right? Now, the next thing that we are doing is the eval.
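As a quick recap before walking through each stage, the four-part structure might look like this as a skeleton (the command arguments are placeholders, filled in over the rest of this video; source name assumed as before):

    source="access.log" | eval ... | stats ... | rename ...

Each pipe hands the previous stage's results to the next, exactly like the ls and grep pipeline on the Linux side.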
So, the eval command is something that you will see regularly. What happens here is that if I expand this event, you will see that there is a field called bytes; in this event, bytes contains 2252. For ease of reference, you might want to see the data in kilobytes or megabytes, or, if the data set is huge, even in gigabytes, right? It becomes much simpler. Say you have a movie on your computer: if Windows showed you its size in bytes, it would be really difficult to read, but when Windows shows the size in megabytes or gigabytes, it is much easier to understand. The same goes for Splunk: if you do not want to see things in bytes, you can convert them into an easier-to-understand form. Now, when I say convert, it does not mean converting the value of bytes stored in the log file; the value associated with the raw log does not change. That's a very important part to remember. Let me quickly show you. What we have here is an eval creating a field named kb; the name is something you can change according to your own naming convention. So: eval kb = bytes / 1024. Kilobytes is simply bytes divided by 1024; that is the basic logic. If I run the command, you will see that the bytes field is still present; nothing is modified, only an additional field is added. And remember, this is not added inside the log file; the raw log is untouched, and the field exists only as part of your search results. So there is a new field kb with a value of about 2.19, and I can easily tell that roughly 2 KB of data was transferred in this specific GET request. That is what eval does. You can change the field name kb, but the logic, bytes divided by 1024, is the part to remember. Once you have done that, the third command, the one we use for reporting, is stats. Stats is basically statistics, and what we are doing is a sum of kb as well as a distinct count of client IP addresses. Let's assume I want to see how much data was transferred across the sessions of this suspicious client IP address. I know the first event has about 2 KB of data, the second event has 0.8 KB, and there are 129 events in total. What I want is to sum across all those events and get the total amount of data transferred, and for that you use the stats command. So if I add a stats sum(kb) here and press Enter, it quickly shows that a total of about 252 KB of data was transferred for this IP address, based on the logs that we have. Stats is a pretty interesting command that you can use for various purposes.
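A sketch of the search as built so far (source name assumed as before; the client IP is the example one used above):

    source="access.log" clientip=182.236.164.11 | eval kb=bytes/1024 | stats sum(kb)

eval adds a kb field to every matching event at search time, without modifying the raw log, and stats sum(kb) then collapses the 129 events into a single total.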
The last command is the rename command. If you look over here, the column header is sum(kb), which is not a very readable form; you want the field name to be much easier to understand. So what you do is rename sum(kb) to Total KB. If I add this one more command and press Enter, you see that it now makes much more sense, right? You have 252 under Total KB. It becomes much easier for a person who is looking at the report, because this can be sent as a report over email in PDF format, and anyone who reads it should easily understand what this specific field value is about. Now, one last thing I will quickly do, so that we stick to the reference structure in our slides, is remove this client IP filter and add one more aggregation: dc, the distinct count of clientip. Let me quickly remove the filter for the time being. You would see that there is a field called clientip, and what we want is the total amount of data transferred in kilobytes together with the number of distinct IP addresses involved in the transactions. So, the first thing we have is the search filter for the log file. The second thing is the eval, where we compute a kb field as bytes divided by 1024. The third part is the statistics, where we do a sum of kb and a distinct count of clientip. If I run this, you see that there are a total of 182 distinct IP addresses, and the total sum of kb is 27,799; this is the total amount of data, in KB, that was transferred. The last thing, which we have already seen, is renaming these fields: if I add that, you have Total KB, the total amount in kilobytes, and Unique Customers, the 182 unique IP addresses within the specific time period. This is something you can save as a report, and every 24 hours you can send it to management: today we had this many unique visitors, and this much data was transferred. It can be scheduled at a 24-hour interval or whatever you really need. One last thing I would like to show before we conclude this video: all the commands we are typing here can easily be found in the search reference manual. If you would like to see the stats command, let me open its page as a sample. It has a good amount of description, and one more good thing is that Splunk gives basic examples; for instance, there is one computing an average of kbps, which is the answer if you want to see average bandwidth. We used sum, but you can also use avg and various others. If you scroll up a little, you'll notice the list of functions, such as avg and count; these are all functions that you can use with the stats command. So pretty lengthy documentation exists, and you can use it to understand a specific SPL command in detail. We will be discussing various other SPL commands in the upcoming videos, but I just wanted to point out that the documentation is pretty good, and based on it you can also try things out yourself.
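Putting the whole video together, here is a sketch of the final report query (display names as discussed; the new names are quoted because they contain spaces):

    source="access.log"
        | eval kb = bytes/1024
        | stats sum(kb) dc(clientip)
        | rename sum(kb) AS "Total KB", dc(clientip) AS "Unique Customers"

This follows the search filter, transform, report, cleanup structure from the start of the video.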

4. Splunk Search Assistant

Hey everyone and welcome back. In today's video, we will be discussing Splunk Search Assistant. Splunk Search Assistant is a pretty useful feature when you are writing a query in the SPL language within the Splunk query editor. Generally, whenever you begin typing certain characters in the Splunk search bar, you might have noticed that Splunk gives you matching terms as well as matching searches. This autofill-like feature is provided by Splunk Search Assistant. You may have noticed a similar feature in Google: when you type something, Google displays a list of suggestions, which is also a kind of autofill. Now, Splunk Search Assistant has three modes in which it operates: one is full, another is compact, and the third is none. By default, Splunk Search Assistant runs in compact mode, but this is something you can change. Let me quickly show you. Within the search bar, if I type stats count, it gives me a lot of results, and if I remove that and type "table", it automatically gives me various results as well. All of these results come from two sections: one is suggestions, and the second is history. So if I type table, you see that it shows me my command history; these are the SPL queries that I used in the earlier videos. If you use Splunk heavily, there will be a lot of command history to find here. This is the compact mode. If you want to change it, you can go to your user account and open Preferences, and within the SPL Editor settings you will see the three search assistant modes. By default, it is compact; I'll select full mode and click on Apply. Once I click on Apply, the way the search assistant works is a little different; it is much more intuitive and much easier to read. Let's try it out once again. You see that something has changed: the assistant panel has become much bigger. Now if I type stats, it shows me the command history, and it also shows me stats-related examples on the right side. If you do a stats count by and click on More here, it shows you the documentation syntax: you can easily see the syntax, and it also shows various examples, so it becomes much easier to remember. If you do not recall the exact SPL syntax, you can make use of full mode and read the basic documentation directly here, instead of going through the pages on Splunk.com, while you write your queries. Now, it really depends on you whether you want to use compact mode or full mode; if you are a pro, you can stay on compact, or even switch the assistant off with the none mode. I generally prefer full mode, but this is a personal preference. One last thing that is important to remember: if you set full mode for your user account, it will not affect others. There may be ten users using the same Splunk, and any changes you make to your search assistant preferences will have no effect on them; everyone can have their own individual Splunk Search Assistant mode.

5. Splunk Reports

Hey everyone and welcome back. In today's video, we will be discussing Splunk reports. Now, the best way to understand concepts is through a sample use case, so let's look at one. You have a security manager, and the use case is that the security manager needs to be informed, via email, of the list of IP addresses and country locations from which failed SSH attempts are being made, over a period of every 24 hours. So what is the use case? A security manager within your organisation needs, at a daily interval, a list of IP addresses and their associated countries from which failed SSH authentication attempts are made. This is something you will see in a lot of organizations. So let's look at how you can achieve it. Let me go to the secure log; I'll select All Time. Since we only care about failed attempts, the first part of the search is action equals failure. Since the country details are needed, I'll add iplocation src, which gives me the country-specific details, and the last thing I need is a table of the source IP address and the country. So these are all the results that are present. One important part to remember is that these results can be exported from Splunk. If you notice, there is an Export option, and I can export in various formats like CSV, XML, and JSON. Let's select CSV, and the file name I'll give is "failed SSH attempts". For the number of results, I'll just leave it blank and go ahead and export, and you will see that it got exported. If I open the CSV file, there are the two fields we asked for: one is src, and the second is Country. This is exactly what I wanted. Along with that, one thing you might notice is that a lot of IP addresses are repeated; basically, you are seeing a brute-force pattern. If you do not want the same IP address repeated 100 or 1,000 times, you can deduplicate the results (for example, with the dedup command) so that only a unique set of IP addresses is present; the full query is sketched below.
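Pulling the pieces together, the search saved as the report might look like this (a sketch: the source name is assumed, and the dedup step stands in for the "unique" filtering described above):

    source="secure.log" action=failure
        | iplocation src
        | dedup src
        | table src, Country

dedup src keeps only the first event per source IP, so a brute-forcing address appears once instead of hundreds of times.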
Now "Schedule report" is a checkbox that I'll select. Now you can run it every hour, every day, every week, every month, or you can even have your own custom chron schedule, so I'll run it every day at 12:00 a.m. In the everyday Now the time range you can select the time rangenot required since you are running it at 12:00 A.m.. Now, the priority is important, so if you have a resource limitation, you can have this report as the highest priority if it's important for you to be running. Now, the trigger actions are the post actions that would happen after the report is created, so whether you want this report details to be sent in email, or some type of script should run, or there's an external web hookwhich you can integrate with software like Slack or others, we can select on "send email" and y Now the subject is Planck Report, and the report name and include you can attach a CSV, a PDF, or even a link to the report and link to results, and you can go ahead and click on Save, and this is why you see this report is currently scheduled and you don't see anything yet because it has yet to run. Now, in case you do not know where to find your reports, you can go to the main Splunk home page. Click on "Search and Reporting." Go to Reports, and here you should see your report, which is titled "SSH brute force attempts," and within it, if you just expand it, you would see what the action is that is scheduled and at what time the report is supposed to run.

Go to the testing centre with ease of mind when you use Splunk SPLK-2002 VCE exam dumps, practice test questions and answers. Splunk SPLK-2002 Splunk Enterprise Certified Architect certification practice test questions and answers, study guide, exam dumps and video training course in VCE format help you study with ease. Prepare with confidence using Splunk SPLK-2002 exam dumps and practice test questions and answers in VCE format from ExamCollection.



Comments
* The most recent comments are at the top
  • pedro_0017
  • India
  • May 12, 2020

i’ve passed my exam finally though i didn’t have enough time to study for it. examcollection website was my savior…it provided me with helpful practice questions and answers for SPLK-2002 exam. the actual exam was just a replica of what i came across during my training. good work guys!!! keep on!!!

  • monica102
  • Israel
  • May 10, 2020

i always thought that the exam questions would be difficult and tricky((. but after getting the SPLK-2002 exam dump for my revision, i was the happiest person ever after the results were out. i never imagined this file could make passing the test this easy peasy for me. thank you guys!!

  • vivian
  • United States
  • May 08, 2020

Splunk SPLK-2002 practice questions and answers helped me to achieve my aspirations of passing the exam and become Splunk Enterprise Certified Architect. yeah!! any candidate waiting for the test can make use of them and excel like me

  • austin_CJ
  • Myanmar
  • May 04, 2020

@betty2000, yup, braindump for SPLK-2002 exam is actual and updated. all the questions comprised in it are within the exam context. for sure, you’ll organize the learned information and learn something new from it. good luck!

  • yusuf
  • Sri Lanka
  • May 03, 2020

practice test for SPLK-2002 exam in this site is sooo helpful. it tests a knowledge you get during your training. before i downloaded it, i already trained myself on all the topics in the real exam. i answered all the questions it contains and managed to score 91% in my exam!!!

  • betty2000
  • India
  • May 02, 2020

hey guys, is the SPLK-2002 braindump valid? i’d like to get more knowledge on top of what i’ve learned from the official Splunk Enterprise Certified Architect training course…can someone help??? thx in advance

  • vincent
  • Spain
  • Apr 28, 2020

examcollection is undoubtedly the number one destination to find useful prep materials for IT exams! Their dump for SPLK-2002 exam really helped in my revision. It enlightened me on the exam concepts and familiarized me with the questions I expected. So I recommend this invaluable training material to everyone.

  • saif Al-Shoker
  • Germany
  • Oct 02, 2019

i would like to get the exam questions

  • sreekar
  • United States
  • Sep 16, 2019

looking for more splk architect practice exams.



Purchase Individually

SPLK-2002 Premium File: 90 Questions & Answers, $69.99 (regular price $76.99)

SPLK-2002 Training Video Course: 80 Video Lectures, $24.99 (regular price $27.49)
