Augmenting the Analyst: Using data science, training, tools, and techniques to enhance performance – Security Boulevard

The increasing demand for cybersecurity analysts is a combination of playing catch-up, keeping up with growing threats and attacker capabilities, and a globally expanding IT footprint. With relief for the growing security skills gap nearly a decade out, we must find ways to support the analysts who are already working to protect us. In this blog, we discuss ways to augment their efforts and maximize their time by overcoming some of the key challenges they face.

Why do we need to augment our analysts?
The global cybersecurity landscape is in crisis due to the lack of available skilled talent. A recent U.S. survey by Emsi Burning Glass (now Lightcast) showed that one million cybersecurity professionals are working in the industry, yet there are more than 700,000 open roles to be filled. The situation is similarly critical throughout Europe according to LinkedIn data, which indicates a 22% increase in demand for talent last year alone with no sign of slowing down.

Educational institutions, government efforts, and private training programs are creating new candidates as quickly as possible, but it takes five to ten years to create an experienced L3 security operations center (SOC) analyst. That’s clearly a solution for the future. So, what do we do in the meantime?

What about artificial intelligence, machine learning, and data science?
Many people believe that machine learning (ML) and artificial intelligence (AI) are going to replace SOC analysts. But that’s not going to happen, at least not in the next couple of decades.

Yes, we have self-driving cars, and yes, a self-driving car that drives on the road without crashing is impressive. But they are as much enabled by advances in computer vision as they are by AI/ML. Using the same tools to decide whether a 10,000-endpoint company network is secure is like keeping 10,000 cars on the road simultaneously when you’re not 100 percent sure where you’re going or what the road looks like.

AI/ML techniques aren’t magic bullets that solve the whole problem. They are a collection of solutions to very specific parts of the problem, such as inferring facts about security data that would be difficult or impossible for a human to determine. For example, AI/ML can detect a predictable pattern in user logon failures, flagging it as automated activity that is using low and slow timing to try to evade detection. Or it can identify anomalous user behavior and connect it to other anomalous system activity – such as when an admin suddenly logs onto the system at 3:00am from a new location.
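To make the "low and slow" example concrete, here is a minimal sketch of one way such a signal could be derived: scripted logon attempts tend to arrive at suspiciously regular intervals, while human retries are bursty and irregular. The `looks_automated` helper, its coefficient-of-variation threshold, and the sample timestamps below are all illustrative assumptions, not how any particular product implements this.

```python
from statistics import mean, stdev

def looks_automated(timestamps, cv_threshold=0.1):
    """Flag a sequence of logon-failure timestamps (in seconds) as likely
    automated when the gaps between attempts are suspiciously regular.

    A scripted "low and slow" attack often fires at near-constant
    intervals, so the coefficient of variation (stdev / mean) of the
    inter-attempt gaps is low; human retries vary far more.
    The 0.1 threshold is an illustrative assumption, not a standard.
    """
    if len(timestamps) < 3:
        return False  # too few events to judge regularity
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg = mean(gaps)
    if avg <= 0:
        return False
    return stdev(gaps) / avg < cv_threshold

# Scripted attempts roughly every 15 minutes, with tiny jitter:
scripted = [0, 901, 1799, 2702, 3600, 4498]
# Human retries: a quick burst, then a long pause, then another try:
human = [0, 4, 9, 300, 4000, 4010]
```

A real detection pipeline would of course consider far more features (source IPs, targeted accounts, time of day), but the core idea is the same: statistical regularity that a human eyeballing raw logs would likely miss.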

Does the use of AI/ML need any extra training?
Data science is a vocation in which most security analysts are not skilled or experienced. AI/ML systems have started to help stem the tide of alerts, but that can become problematic if analysts are not able to understand what these tools are doing.

Early AI/ML tools, for instance, were famous for presenting a result such as “anomalous behavior detected” with no context for the analyst to determine why the behavior was anomalous. That lack of insight risked leaving analysts in a state of environment blindness, allowing critical threats to go unnoticed.

Training provides benefits because security operations center (SOC) analysts want to improve the way they work. It’s baked into every modern SOC as the core principle of continual improvement. If we give analysts additional ways …….

Source: https://securityboulevard.com/2022/08/augmenting-the-analyst-using-data-science-training-tools-and-techniques-to-enhance-performance/
