My Network Has High Cholesterol: The Danger of Low and Slow DDoS Attacks

This month we share a blog article from our technology partner, Radware. Low and slow DDoS attack traffic is increasingly straining business resources, consuming bandwidth and, in turn, causing a poor customer experience and financial loss. In most instances the targeted organization absorbs such traffic without even noticing.

You know how, when you get to a certain age, feeling ‘good’ just isn’t enough? Well, it might be for your everyday life – obviously you don’t need to extract the most out of your brain and muscles for the day-to-day to-dos – but there is no guarantee that nothing else is negatively impacting your performance, working in the shadows, going unnoticed.

A network needs regular health checks to ensure it is working at its full potential.

In this information age, businesses rely heavily on the secure and reliable transfer of data, and that data needs to be clear for whoever receives it, whether man or machine. Whilst most of us who generally feel ‘good’ don’t worry much about health checks – certainly not on a daily basis – so long as we can go about our daily lives, IT teams invest most, if not all, of their time making sure the information system performs at its best. Why? Because time is money.

If you have been keeping up to date with the industry over the past year, you may have started to think IoT botnets are taking over and bringing enterprise networks to their knees. The reality, however, is different. Despite the record-breaking volumes we saw in 2016, non-volumetric DDoS is still prevalent, and the technique is still proving very efficient at exhausting network and server resources. Moreover, a non-volumetric attack can evade detection mechanisms and consume bandwidth and resources without the target knowing – affecting service-level quality.

56% of Internet traffic is bots [1]

Many bots are actually designed to make the Internet run more smoothly

Thankfully, not all bots are bad; some are useful and well-regulated – for example, search engine crawlers, automated trading and instant media updates. Unfortunately, though, a significant amount of Internet traffic is generated by bad bots, from spammers and click fraudsters to vulnerability scanners and malware spreaders.

Our Cloud-Based WAF uses advanced bot identification technology to differentiate legitimate web application users – human beings and search engines alike – from malicious traffic.
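Radware does not publish the internals of its bot identification engine, but one widely known building block for telling real search-engine crawlers apart from impostors is reverse-then-forward DNS verification: Googlebot’s addresses reverse-resolve into googlebot.com or google.com, and a forward lookup of that hostname must map back to the same IP. A minimal sketch of the idea (the function name and pluggable resolvers are hypothetical, for illustration only):

```python
import socket

def is_verified_crawler(ip, allowed_suffixes=(".googlebot.com", ".google.com"),
                        reverse_lookup=None, forward_lookup=None):
    """Hypothetical sketch: verify a claimed search-engine crawler.

    1. Reverse-resolve the IP to a hostname.
    2. Check the hostname belongs to the crawler's published domains.
    3. Forward-resolve that hostname and confirm it maps back to the IP.
    The resolver callables are injectable so the logic can be tested
    without live DNS.
    """
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or (lambda host: socket.gethostbyname(host))
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False  # no reverse record at all
    if not host.endswith(allowed_suffixes):
        return False  # hostname is outside the crawler's domains
    try:
        return forward_lookup(host) == ip  # spoof check: round trip must match
    except OSError:
        return False
```

A spoofer can put anything in its User-Agent header, but it cannot make Google’s DNS zone resolve its hostname back to the spoofer’s IP – which is why the round trip, not the User-Agent, is checked.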

Attack Size: Does It Matter?

In 2016, fewer than 10% of server attacks qualified as ‘extra-large’ (10Gbps or higher). Seven in ten server attacks were below 100Mbps, of which 50% were 10Mbps or less. Despite the increasing risk from IoT botnet attacks, attacks ranging from 10Gbps to 50Gbps decreased from 8% in 2015 to 3% in 2016. [2] But why?

DDoS attackers are becoming more sophisticated and more familiar with the security solutions in play. They are aware that the protective measures implemented are often unable to distinguish legitimate user traffic from bad traffic, so they opt for low and slow attacks or, alternatively, short bursts.

What are the three biggest cyber-attacks you have suffered by bandwidth?
Source: Radware Global ERT Report 2016-2017

Three in five respondents report a cyber-attack of 10 million packets-per-second (PPS) or less, and about one-fifth indicated they suffered an attack between 10 million PPS and 100 million PPS. The number of attacks that were 100 million PPS or less increased from 76% in 2015 to 82% in 2016. Those of 10 million PPS or less increased from 50% in 2015 to 63% in 2016. [2]

Prevalence of low and slow attacks: What are the biggest cyber-attacks you have suffered by bandwidth?
Source: Radware Global ERT Report 2016-2017

Businesses Lose

Under these conditions, neither the network nor the business performs at its full potential. The network may run, just as someone may ‘feel good’ – but feeling good can be deceiving. Whilst everything may seem to be functioning, the organization’s resources are still being consumed, whether through share-of-pipe serving dirty traffic or man-hours spent on log analysis. In today’s world, a user expects a rapid response from any app or web page, and if it hasn’t loaded in around 3-5 seconds they will often try elsewhere. This is why it is paramount to stay on top of the traffic flowing in and out of your business.

The Importance of DDoS Monitoring

To maintain the trusted reputation that organizations work so hard to build, a business should be aware of the dangers out there and be able to assess the impact they could have on their operation.

Low and slow attacks don’t always reach the thresholds of most rate-limiting DDoS protections and therefore often go undetected. Traffic purification can only be done by a DDoS solution that leverages a behavioural traffic analysis algorithm – one that learns the baselines and patterns of legitimate requests in peacetime, then maintains the peace by cleaning unwanted requests as they come in.

Getting the network health back into shape will immediately translate into a better performance and that ‘good’ old feeling.

At activereach we offer a vendor-neutral approach to DDoS mitigation, as well as our own DDoS testing platform, so you don’t have to wait to be attacked to find out whether your mitigation works.

This article was first published by Ben Zilberman on the Radware blog on June 22, 2017.

[1] Shaw, Jason, and Kiril Tsemekhman. “Combating Fraud in Online Advertising.” Integral Ad Science, Sep. 2014.

[2] Radware Global Application & Network Security Report 2016-17.