Which of the following statements applies to indexer discovery?
Answer : D
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/DistSearch/Connectclustersearchheadstosearchpeers
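For context, indexer discovery is enabled on the cluster master in server.conf and consumed by forwarders in outputs.conf. A minimal sketch, assuming a cluster master reachable at cm.example.com; hostnames, group names, and keys are placeholders:

  # server.conf on the cluster master - enables indexer discovery for forwarders
  [indexer_discovery]
  pass4SymmKey = <discovery_key>

  # outputs.conf on each forwarder - asks the cluster master for the current peer list
  [indexer_discovery:cluster1]
  pass4SymmKey = <discovery_key>
  master_uri = https://cm.example.com:8089

  [tcpout:cluster1_group]
  indexerDiscovery = cluster1

  [tcpout]
  defaultGroup = cluster1_group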
The data in Splunk is now subject to auditing and compliance controls. A customer would like to ensure that at least one year of logs is retained for both Windows and Firewall events. What data retention controls must be configured?
Answer : A
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/Indexer/Setaretirementandarchivingpolicy
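Retention is controlled per index in indexes.conf, chiefly through frozenTimePeriodInSecs. A minimal sketch, assuming the Windows and firewall data land in indexes named wineventlog and firewall (the index names and use of default paths are assumptions):

  # indexes.conf - one year is 31,536,000 seconds; buckets older than this are frozen
  # (deleted unless an archive destination is configured). Size limits such as
  # maxTotalDataSizeMB can still freeze data earlier, so they must be sized accordingly.
  [wineventlog]
  homePath   = $SPLUNK_DB/wineventlog/db
  coldPath   = $SPLUNK_DB/wineventlog/colddb
  thawedPath = $SPLUNK_DB/wineventlog/thaweddb
  frozenTimePeriodInSecs = 31536000

  [firewall]
  homePath   = $SPLUNK_DB/firewall/db
  coldPath   = $SPLUNK_DB/firewall/colddb
  thawedPath = $SPLUNK_DB/firewall/thaweddb
  frozenTimePeriodInSecs = 31536000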
What happens when an index cluster peer freezes a bucket?
Answer : C
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/Indexer/Bucketsandclusters
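Broadly, when a peer freezes a copy of a bucket the cluster stops fix-up activity for that bucket and the remaining peers eventually freeze their own copies; frozen data is deleted unless an archive destination is set. A minimal sketch, with an assumed archive path:

  # indexes.conf - optional archiving of frozen buckets; each peer archives only its own copies
  [network]
  coldToFrozenDir = /archive/splunk/network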
A customer has the following Splunk instances within their environment: An indexer cluster consisting of a cluster master/master node and five clustered indexers, two search heads (no search head clustering), a deployment server, and a license master. The deployment server and license master are running on their own single-purpose instances. The customer would like to start using the Monitoring Console (MC) to monitor the whole environment.
On the MC instance, which instances will need to be configured as distributed search peers by specifying them via the UI using the settings menu?
Answer : C
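Search peers added through the MC's Settings > Distributed search UI are recorded in distsearch.conf on the MC instance (the UI also handles the trust setup); clustered indexers are normally picked up by configuring the MC as a search head on the indexer cluster rather than by adding each peer by hand. A sketch of the resulting configuration with placeholder hostnames; the exact set of peers is what the question is testing:

  # distsearch.conf on the MC instance
  [distributedSearch]
  servers = https://sh1.example.com:8089,https://sh2.example.com:8089,https://ds1.example.com:8089,https://lm1.example.com:8089,https://cm1.example.com:8089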
What does Splunk do when it indexes events?
Answer : B
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/Indexer/Howindexingworks#:~:text=Splunk%20Enterprise%20can%20index%20any,events%20indexes%20and%20metrics%20indexes
What is the default push mode for a search head cluster deployer app configuration bundle?
Answer : B
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/DistSearch/PropagateSHCconfigurationchanges#:~:text=The%20deployer%20push%20mode%20determines,default%20push%20mode%20is%20merge_to_default%20
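The push mode is set per app in app.conf within the app staged on the deployer. A minimal sketch; the app location and the explicit value are only illustrative, and when nothing is set the deployer falls back to its default mode:

  # app.conf in $SPLUNK_HOME/etc/shcluster/apps/<app>/local on the deployer
  [shclustering]
  # valid values include full, merge_to_default, default_only, local_only
  deployer_push_mode = merge_to_default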
In which of the following scenarios is a subsearch the most appropriate?
Answer : A
A customer has implemented their own Role Based Access Control (RBAC) model to attempt to give the Security team different data access than the Operations team by creating two new Splunk roles, security and operations. In the srchIndexesAllowed setting of authorize.conf, they specified the network index under the security role and the operations index under the operations role. The new roles are set up to inherit the default user role.
If a new user is created and assigned to the operations role only, which indexes will the user have access to search?
Answer : A
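The scenario above maps to authorize.conf stanzas like the sketch below; because allowed indexes are cumulative across inherited roles, whatever the default user role already permits is added to what the operations role grants:

  # authorize.conf - role names and index assignments as described in the question
  [role_security]
  importRoles = user
  srchIndexesAllowed = network

  [role_operations]
  importRoles = user
  srchIndexesAllowed = operations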
A customer would like Splunk to delete files after they've been ingested. The Universal Forwarder has read/write access to the directory structure. Which input type would be most appropriate to use in order to ensure files are ingested and then deleted?
Answer : B
Reference:
https://community.splunk.com/t5/Getting-Data-In/Is-it-possible-to-have-a-Splunk-universal-forwarder-read-a/td-p/172752
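The behaviour described (ingest once, then delete the source file) is what a batch input with a sinkhole move policy does in inputs.conf on the forwarder. A minimal sketch with a placeholder path and metadata:

  # inputs.conf on the Universal Forwarder - files under the path are indexed and then removed
  [batch:///opt/exports]
  move_policy = sinkhole
  index = operations
  sourcetype = csv_export
  disabled = 0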
In which directory should base config app(s) be placed to initialize an indexer?
Answer : B
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/Indexer/Manageappdeployment
As a best practice, which of the following should be used to ingest data on clustered indexers?
Answer : B
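On clustered indexers, inputs are normally defined once in a base app on the cluster master and pushed to every peer through the configuration bundle, rather than edited locally on each peer. A minimal sketch, assuming the conventional splunktcp port 9997:

  # etc/master-apps/<base_app>/local/inputs.conf on the cluster master
  [splunktcp://9997]
  disabled = 0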
When adding a new search head to a search head cluster (SHC), which of the following scenarios occurs?
Answer : C
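For reference, a new member is normally brought in with 'splunk init shcluster-config' and 'splunk add shcluster-member', which populate server.conf along these lines; hostnames, ports, label, and key below are placeholders:

  # server.conf on the new search head cluster member
  [shclustering]
  disabled = 0
  mgmt_uri = https://sh3.example.com:8089
  pass4SymmKey = <shc_key>
  shcluster_label = shc1
  conf_deploy_fetch_url = https://deployer.example.com:8089

  [replication_port://9200]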
A customer wants to migrate from using Splunk local accounts to use Active Directory with LDAP for their Splunk user accounts instead. Which configuration files must be modified to connect to an Active Directory LDAP provider?
Answer : C
Reference:
https://docs.splunk.com/Documentation/Splunk/8.1.0/Security/ConfigureLDAPwithconfigurationfiles
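Connecting to Active Directory over LDAP is driven by authentication.conf (an LDAP strategy plus a role-mapping stanza), with authorize.conf defining any custom roles being mapped to. A minimal sketch; every hostname, DN, and group name below is a placeholder:

  # authentication.conf
  [authentication]
  authType = LDAP
  authSettings = corp_ad

  [corp_ad]
  host = dc01.example.com
  port = 389
  bindDN = CN=svc_splunk,OU=Service Accounts,DC=example,DC=com
  bindDNpassword = <password>
  userBaseDN = OU=Users,DC=example,DC=com
  userNameAttribute = sAMAccountName
  realNameAttribute = displayName
  groupBaseDN = OU=Groups,DC=example,DC=com
  groupNameAttribute = cn
  groupMemberAttribute = member

  # map AD groups to Splunk roles
  [roleMap_corp_ad]
  admin = Splunk_Admins
  user = Splunk_Users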
A customer has a number of inefficient regex replacement transforms being applied. When under heavy load, the indexers are struggling to maintain the expected indexing rate. In a worst-case scenario, which queue(s) would be expected to fill up?
Answer : B
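For reference, the kind of transform being described is an index-time regex replacement, which runs in the typing pipeline on the indexers, so an expensive pattern there slows the pipeline and backs up the queues feeding it. A sketch of such a transform; the sourcetype, class name, and pattern are assumptions:

  # props.conf
  [fw:traffic]
  TRANSFORMS-mask = mask_account_numbers

  # transforms.conf - rewrites _raw at index time
  [mask_account_numbers]
  REGEX = ^(.*)\b\d{12,16}\b(.*)$
  FORMAT = $1************$2
  DEST_KEY = _raw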
A new single-site three-indexer cluster is being stood up with replication_factor:2, search_factor:2. At which step would the Indexer Cluster be classed as 'Indexing Ready' and be able to ingest new data?
Step 1: Install and configure Cluster Master (CM)/Master Node with base clustering stanza settings, restarting CM.
Step 2: Configure a base app in etc/master-apps on the CM to enable a splunktcp input on port 9997 and deploy index creation configurations.
Step 3: Install and configure Indexer 1 so that once restarted, it contacts the CM, downloads the latest config bundle.
Step 4: Indexer 1 restarts and has successfully joined the cluster.
Step 5: Install and configure Indexer 2 so that once restarted, it contacts the CM, downloads the latest config bundle.
Step 6: Indexer 2 restarts and has successfully joined the cluster.
Step 7: Install and configure Indexer 3 so that once restarted, it contacts the CM, downloads the latest config bundle.
Step 8: Indexer 3 restarts and has successfully joined the cluster.
Answer : A
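The steps above boil down to a small amount of configuration; broadly, a cluster can begin ingesting new data once the number of peers that have joined meets the replication factor. A minimal sketch of the files involved, with placeholder hostnames and keys (8.1-era setting names):

  # Step 1 - server.conf on the Cluster Master
  [clustering]
  mode = master
  replication_factor = 2
  search_factor = 2
  pass4SymmKey = <cluster_key>
  cluster_label = idxc1

  # Step 2 - etc/master-apps/<base_app>/local/inputs.conf on the CM
  [splunktcp://9997]
  disabled = 0

  # Steps 3, 5, 7 - server.conf on each indexer
  [clustering]
  mode = slave
  master_uri = https://cm.example.com:8089
  pass4SymmKey = <cluster_key>

  [replication_port://9100]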