Hello Team
Could anyone please help me with the metrics alert event name count functionality? I have a use case that requires calculating the average number of alerts received per host from Defender. Can this metric be used for that purpose? Also, does it include Defender alerts by default? I only need Defender alerts.
I need to translate this Splunk query to YARA. It is designed to monitor and analyze security alerts generated by Microsoft Defender Advanced Threat Protection (ATP) over a 7-day period. The goal is to identify hosts that generate an unusually high number of alerts compared to their average hourly alert count.
Thanks in advance
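For reference, the per-host average described in the use case boils down to a simple count-and-divide. A minimal Python sketch with invented sample data (not the actual metric or its API) would look like:

```python
from collections import Counter

# Hypothetical sample of Defender alert events: (hostname, alert title)
alerts = [
    ("host-a", "Suspicious PowerShell"),
    ("host-a", "Suspicious PowerShell"),
    ("host-b", "Malware detected"),
    ("host-a", "Malware detected"),
]

# Count alerts per host, then average across all hosts seen
per_host = Counter(host for host, _ in alerts)
avg_per_host = sum(per_host.values()) / len(per_host)
print(avg_per_host)  # 4 alerts across 2 hosts -> 2.0
```

Whether the built-in metric computes exactly this, and whether it is scoped to Defender alerts only, is the open question above.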
Can you please share the splunk query.
Please find this snip
Need it in text, please
index=cloud_test sourcetype="security-alerts-Microsoft Defender ATP" data.severity=* data.title!="Automated investigation started manually" data.title!="*EnterpriseBlock*" data.title!="Open Wi-Fi connection" ```return the average number of alerts per hour by signature name over a 7 day sample period```
| bin span=1h _time
| rename data.hostStates{}.fqdn as hostname
| rename data.severity as severity
| rename data.title as title
| stats count as total_count by _time, hostname, title, data.sourceMaterials{}, severity
| mvcombine data.sourceMaterials{}
| eventstats avg(total_count) as average by title ```returns average hourly event count of each title```
| where _time > relative_time(now(), "-1h@h") ```comment out to test over larger time spans; -1h@h gives a 1-hour alert window```
| where total_count > average * 3 ```trigger if host exceeds triple the average hourly alert count - set to *1 to test```
| eval org_index="cloud_test"
| eval org_sourcetype="security-alerts-Microsoft Defender ATP"
| eval vendorProduct="Microsoft Defender for Endpoint"
```SOC Required Fields```
| eval search__name="High Volume Defender Signature (Any Severity)"
| eval search_summary=title." - signature spiked 3 times the average number of alerts over 7 days. Suggested Steps: Determine if these alerts are part of a related broader incident. Look for similar IOCs/behaviors across these alerts that could indicate a large attack is underway. Please see the Defender SOP for additional details on investigating MDE alerts."
| eval alertCategory="Endpoint"
| eval customer="Cloud"
| eval environment="Cloud"
| eval alertType="Splunk"
| eval priority=2
```CEE Added Fields```
| eval alertName="High Volume Defender Signature (Any Severity)"
| eval attacktags="attack.impact:ta0040"
| eval falsepositives="unk"
```Output```
| fillnull value=NULL
| sort - _time
| table total_count, title, data.sourceMaterials{}, severity, org_index, org_sourcetype, vendorProduct, search__name, search_summary, alertCategory, customer, environment, alertType, priority, alertName, attacktags, falsepositives
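Translating the core of that pipeline out of SPL for clarity: the `stats`/`eventstats`/`where` stages bucket counts per hour, average them per signature title, and flag any host-hour that exceeds triple the signature's average. A rough Python sketch with invented numbers (not a substitute for the query) would be:

```python
from collections import defaultdict

# Hypothetical hourly alert counts keyed by (hour, hostname, title),
# mirroring the output of the stats stage in the query above
counts = {(h, "host-a", "Suspicious PowerShell"): 2 for h in range(5)}
counts[(5, "host-a", "Suspicious PowerShell")] = 30  # spike hour
counts[(0, "host-b", "Malware detected")] = 3

# eventstats avg(total_count) by title: average hourly count per signature
totals = defaultdict(list)
for (hour, host, title), c in counts.items():
    totals[title].append(c)
averages = {t: sum(v) / len(v) for t, v in totals.items()}

# where total_count > average * 3: flag spiking host/hour combinations
flagged = [
    (hour, host, title, c)
    for (hour, host, title), c in counts.items()
    if c > averages[title] * 3
]
print(flagged)  # only the (hour 5, host-a) spike exceeds 3x its average
```

Note that, as in the query, the average is taken per title across all hosts and hours, so a sustained spike also inflates its own baseline; the 7-day sample window in Splunk dampens that effect.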
Hello @dnehoda, could you please have a look at the above Splunk query?