Making effective use of security alerts is one of the most pressing challenges in cybersecurity, both in networking and in computing more broadly. Although monitoring techniques for detecting attacks and securing systems have proliferated, some alerts always go unanalyzed and are written off as an acceptable error rate. This paper studies exactly that problem: why are the alerts from certain sensors essentially false signals, and how can we reduce them?
The paper presents a tool for cybersecurity operations center (CSOC) managers that examines alert logs and compares them against per-sensor risk. The tool assesses sensor health and recorded alerts using an optimization model based on goal programming. The model pursues three goals: (1) maximize the number of significant alerts identified by the system, (2) balance risk against sensor alerts across batches of collected data, and (3) minimize the disruption caused by changes made in response to sensor alerts.
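To make the goal-programming structure concrete, the following is a minimal sketch of a weighted goal program for allocating analyst inspection effort across sensors. It is not the paper's formulation: the sensor counts, yields, targets, weights, and capacity below are all hypothetical, chosen only to illustrate how the three competing goals can be encoded as deviation variables in a linear program.

```python
# Hypothetical weighted goal-programming sketch (not the paper's model).
# Decision variables x_i: alerts inspected at sensor i per batch.
# Goal 1: reach a target number of significant alerts (penalize shortfall).
# Goal 2: hit a target aggregate risk coverage (penalize both deviations).
# Goal 3: stay close to the current allocation (penalize disruption).
import numpy as np
from scipy.optimize import linprog

n = 3                               # hypothetical number of sensors
sig = np.array([0.9, 0.5, 0.2])     # significant-alert yield per inspected alert
risk = np.array([0.7, 0.4, 0.6])    # risk score per inspected alert
cur = np.array([40.0, 30.0, 30.0])  # current allocation (alerts inspected)
cap = 100.0                         # analyst capacity (alerts per batch)
T1, T2 = 60.0, 55.0                 # targets for goals 1 and 2
w = (5.0, 2.0, 1.0)                 # priority weights for the three goals

# Variable layout: [x (n), d1-, d1+, d2-, d2+, d3- (n), d3+ (n)]
nv = 3 * n + 4
c = np.zeros(nv)
c[n] = w[0]               # goal 1: penalize shortfall only
c[n + 2:n + 4] = w[1]     # goal 2: penalize deviation in either direction
c[n + 4:] = w[2]          # goal 3: penalize any change from current allocation

A_eq = np.zeros((2 + n, nv))
b_eq = np.zeros(2 + n)

# Goal 1: sig @ x + d1- - d1+ = T1
A_eq[0, :n] = sig
A_eq[0, n], A_eq[0, n + 1] = 1.0, -1.0
b_eq[0] = T1

# Goal 2: risk @ x + d2- - d2+ = T2
A_eq[1, :n] = risk
A_eq[1, n + 2], A_eq[1, n + 3] = 1.0, -1.0
b_eq[1] = T2

# Goal 3: x_i + d3-_i - d3+_i = cur_i (deviations measure disruption)
for i in range(n):
    A_eq[2 + i, i] = 1.0
    A_eq[2 + i, n + 4 + i] = 1.0
    A_eq[2 + i, n + 4 + n + i] = -1.0
    b_eq[2 + i] = cur[i]

# Hard constraint: total inspected alerts within analyst capacity
A_ub = np.zeros((1, nv))
A_ub[0, :n] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=[cap], A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * nv, method="highs")
print("allocation per sensor:", res.x[:n].round(1))
```

The deviation-variable encoding is the standard goal-programming device: each goal becomes a soft equality whose over- and under-achievement are explicit nonnegative variables, and the objective trades them off by weight, which is how a model can simultaneously chase significant alerts, balance risk, and limit disruption.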
The paper is well written and analyzes the optimization tool through a number of case studies. Because its results bear on the reallocation of workloads, the paper is relevant to several areas, including fault tolerance, autonomous computing, and runtime optimization of compute resources. The results are evaluated through simulations, but they could have an impact on real-world use cases as well.