For decades we’ve been hearing how automation is revolutionizing manufacturing and industry, with new technologies and systems generating major productivity and quality gains. We have seen how effective it can be to automate workflows and controls to drive efficiencies and produce results, fast. Yet, despite all the benefits of automating high-frequency, repetitive tasks, we still haven’t seen much of that transformation in high-tech markets, particularly in IT’s cybersecurity functions.

Attackers are using automation. Why aren’t you?

Change, however, is coming. It has to, if organizations are going to have any chance of defending their networks and assets against cyberattacks. We know attackers are not waiting. They are increasingly taking advantage of cheap compute and on-demand, automated attack mechanisms to launch successful exploits against their targets. We’ve seen botnets and automated attack tools, such as DeathbyCAPTCHA and Sentry MBA, wreak havoc on networks, steal valuable data, and generate millions of dollars in damages.

To keep up with the frenetic pace of cyber threats, organizations are going to need to rely more heavily on automation. First and foremost, this means taking a more proactive approach to cybersecurity and throwing out the “if it ain’t broke, don’t fix it” mentality. Organizations need to start adopting technologies that automate the detection, investigation, and response to threats, so incidents can be handled in real time. It should no longer be acceptable to measure incident handling in days, or even months.
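
To make that concrete, here is a minimal, purely illustrative sketch of a detect–investigate–respond loop. The alert fields, the tiny threat-intelligence set, and the isolation step are hypothetical stand-ins for whatever SIEM, intel feed, and EDR or NAC APIs an organization actually runs; the point is simply that triage and response can be expressed as code that executes in seconds rather than a ticket that waits for days.

```python
"""Illustrative detect -> investigate -> respond loop (hypothetical sketch)."""
from dataclasses import dataclass


@dataclass
class Alert:
    host: str
    indicator: str
    severity: int  # 1 (low) .. 10 (critical)


# Stand-in threat intelligence: indicators we treat as confirmed malicious.
KNOWN_BAD = {"198.51.100.23", "evil-domain.example"}


def investigate(alert: Alert) -> bool:
    """Cheap automated triage: confirm the indicator against intel or severity."""
    return alert.indicator in KNOWN_BAD or alert.severity >= 8


def respond(alert: Alert) -> None:
    """Placeholder response; in practice this would call an EDR or NAC API."""
    print(f"[RESPONSE] Isolating {alert.host} (indicator: {alert.indicator})")


def handle(alerts):
    for alert in alerts:
        if investigate(alert):
            respond(alert)
        else:
            print(f"[INFO] Deprioritized alert on {alert.host}")


if __name__ == "__main__":
    handle([
        Alert("web-01", "198.51.100.23", 6),
        Alert("laptop-42", "benign-site.example", 3),
        Alert("db-02", "unknown.example", 9),
    ])
```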

To accomplish this, organizations are going to need to start automatically patching vulnerabilities and updating outdated systems. Research shows that 90 percent of organizations face attacks on application vulnerabilities that are at least three years old. Even worse, 60 percent of those attacks target vulnerabilities that are 10 years old.
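
As a rough illustration of what automated patch triage can look like, the sketch below compares a hypothetical software inventory against a hypothetical advisory feed and flags anything running below the first fixed version. Real tooling would pull from a CMDB and an advisory source such as the NVD, and the actual upgrade command is left commented out because it is platform-specific.

```python
"""Illustrative automated patch triage (hypothetical inventory and advisories)."""
# from subprocess import run  # e.g. run(["apt-get", "install", "-y", pkg]) on Debian/Ubuntu

# Hypothetical inventory: package -> installed version
INVENTORY = {"openssl": "1.0.1f", "nginx": "1.24.0"}

# Hypothetical advisories: package -> first fixed version
ADVISORIES = {"openssl": "1.0.1g"}


def version_tuple(v: str):
    """Crude version comparison; real tooling should use packaging.version."""
    return tuple(v.replace("-", ".").split("."))


def outdated(inventory, advisories):
    """Yield packages whose installed version is below the first fixed version."""
    for pkg, installed in inventory.items():
        fixed = advisories.get(pkg)
        if fixed and version_tuple(installed) < version_tuple(fixed):
            yield pkg, installed, fixed


if __name__ == "__main__":
    for pkg, installed, fixed in outdated(INVENTORY, ADVISORIES):
        print(f"{pkg} {installed} is vulnerable; schedule upgrade to {fixed}")
        # run(["apt-get", "install", "-y", f"{pkg}={fixed}"])  # platform-specific
```

Run on a schedule (nightly, for example), a check like this keeps three-year-old vulnerabilities from ever having the chance to accumulate.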

Automation reduces alert fatigue.

In addition, they are going to need to use automation to supplement, or even completely supplant, the majority of today’s manual IT functions. Daily, manual tasks eat up most of a security analyst’s or IT administrator’s time. On average, Forrester found, it takes 35 minutes to diagnose a single trouble ticket or issue (such as a network connection or application outage), and that doesn’t include the time it takes to actually fix the issue.

This also doesn’t include all the other alerts an analyst may have in their queue to investigate. Ponemon found that organizations receive an average of 17,000 malware alerts weekly, most of which turn out not to be actual threats. But analysts don’t know which are real and which are false alarms until they look into each one. That is clearly not possible to do manually, which is why only 4% of valid threats end up getting investigated.
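
Automation is well suited to exactly this problem: deduplicating and ranking the flood so that the short list an analyst actually sees is the part most likely to matter. The following sketch is illustrative only; the alert fields, severity scores, and benign-signature allow list are made-up examples, not figures from the Ponemon study.

```python
"""Illustrative alert triage: deduplicate and rank a flood of alerts."""
from collections import Counter

# Hypothetical raw alert stream as it might arrive from detection tools.
RAW_ALERTS = [
    {"sig": "EICAR test file", "host": "build-07", "severity": 2},
    {"sig": "Mimikatz detected", "host": "dc-01", "severity": 9},
    {"sig": "Mimikatz detected", "host": "dc-01", "severity": 9},  # duplicate
    {"sig": "Port scan", "host": "web-03", "severity": 4},
]

# Signatures previously confirmed as benign in this environment.
KNOWN_BENIGN = {"EICAR test file"}


def triage(alerts):
    # Collapse duplicates: the same signature on the same host counts once.
    counts = Counter((a["sig"], a["host"]) for a in alerts)
    unique = {(a["sig"], a["host"]): a for a in alerts}
    queue = [
        {**alert, "occurrences": counts[key]}
        for key, alert in unique.items()
        if alert["sig"] not in KNOWN_BENIGN
    ]
    # Highest severity first, so analysts start with the most likely real threats.
    return sorted(queue, key=lambda a: a["severity"], reverse=True)


if __name__ == "__main__":
    for item in triage(RAW_ALERTS):
        print(item)
```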

Even if an organization could staff up its network or security operations center (NOC/SOC) to handle the ever-increasing number of alerts and issues it faces, doing so would be cost prohibitive and generally untenable. Thanks to the well-documented shortage of cybersecurity analysts, organizations can’t even try to throw staff and money at the problem, and if they did, they would still come up short.

Reduce the risk of human error and costly mistakes.

Relying on humans to do mundane, repetitive tasks every day inevitably leads to mistakes. In the world of cybersecurity, mistakes are costly. Research from Gartner suggests that, through 2020, 99% of firewall breaches will be caused by simple firewall misconfigurations, not flaws in the firewalls themselves. What’s more, IBM Security Services’ 2014 Cyber Security Intelligence Index reported that misconfigurations are most commonly due to human error. Adopting automation is one of the best ways organizations can get a handle on their daily activities and reduce both the noise and the potential for error that is so dangerous to their ongoing operations.
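
One simple, concrete form this can take is a pre-deployment lint of firewall rule changes that catches the classic “allow any/any” slip before it reaches production. The rule format below is a deliberate simplification for illustration, not any vendor’s syntax.

```python
"""Illustrative firewall-rule lint: flag obvious misconfigurations before deployment."""

# Hypothetical candidate rule set awaiting deployment.
RULES = [
    {"action": "allow", "src": "10.0.0.0/8", "dst": "10.1.2.3", "port": 443},
    {"action": "allow", "src": "any", "dst": "any", "port": "any"},  # accidental any/any
    {"action": "deny", "src": "any", "dst": "any", "port": "any"},   # default deny is fine
]


def lint(rules):
    """Return human-readable findings for overly permissive allow rules."""
    findings = []
    for i, rule in enumerate(rules):
        if rule["action"] != "allow":
            continue
        if rule["src"] == "any" and rule["dst"] == "any":
            findings.append(f"rule {i}: allow any->any is almost certainly a misconfiguration")
        elif rule["src"] == "any" and rule["port"] == "any":
            findings.append(f"rule {i}: allow from any source on any port is overly permissive")
    return findings


if __name__ == "__main__":
    for finding in lint(RULES):
        print(finding)
    # In an automated change pipeline, a non-empty findings list would block the change
    # and force a human review before the misconfiguration ever ships.
```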

Gartner expects that only 30% of enterprise network changes will be made using the command-line interface (CLI) by the end of 2020. That’s a significant shift from 2016, when 82% of enterprise network changes were made manually. The expected change isn’t so radical once the business pressures and realities we’ve been discussing are considered; even so, it’s fast given the hesitancy organizations have shown in adopting these newer, automated technologies.

A Ponemon study found that “Extensive use of cyber analytics and User Behavior Analytics (UBA)” and “Automation, orchestration and machine learning” were the lowest-ranked technologies for enterprise-wide deployment (32 percent and 28 percent respectively), even though they deliver the third- and fourth-highest cost savings among security technologies. We need to flip this statistic on its head if we are to make the strides needed in cybersecurity. Luckily, it’s never been easier.

Automation is the key to a steady-state security stance.

The adoption of software-defined networking and cloud computing models, as well as more coordinated DevOps capabilities, is creating a more agile IT environment, one that can scale up and down and adjust to changing business needs. That agility makes it easier to adopt automation technologies that orchestrate activities between different products and teams to achieve better security outcomes.

Of course, there is no silver bullet for cybersecurity, but with automation there is a way to improve the performance and effectiveness of the overall security infrastructure and create a steady-state security stance. Automation lets analysts and IT administrators eliminate repetitive, manual activities from their daily tasks, so they can focus on the things that really help their organization improve its overall risk management, prevent threats, and keep its resources and operations intact.