DDoS and web application attacks keep escalating

Akamai Technologies released its Second Quarter, 2016 State of the Internet / Security Report, which highlights the cloud security landscape, specifically trends in DDoS and web application attacks, as well as malicious traffic from bots.


During May 2016, the number of attacks spiked, fueled by campaigns targeting the gaming industry.

“While attack sizes are decreasing, we continue to see an uptick in the number of attacks as launch tools grow increasingly pervasive and easy to use and monetize,” said Martin McKeay, Editor-in-Chief, State of the Internet / Security Report.

DDoS Attacks

  • Total DDoS attacks increased 129 percent in Q2 2016 from Q2 2015. During the second quarter, Akamai mitigated a total of 4,919 DDoS attacks.
  • Akamai observed its largest DDoS attack to date, at 363 Gbps on June 20 against a European media customer. At the same time, the median attack size fell by 36% to 3.85 Gbps (both figures are unpacked in the arithmetic sketch after this list).
  • Twelve attacks observed during Q2 exceeded 100 Gbps, two of which reached 300 Gbps; both targeted the media and entertainment industry.
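The report's percentages also imply the underlying baselines. A minimal arithmetic sketch of what they work out to (assuming the 36% median decline is measured against the year-ago quarter, which the report leaves implicit):

```python
# Back out the baselines implied by the report's percentages.
q2_2016_attacks = 4919        # DDoS attacks Akamai mitigated in Q2 2016
yoy_increase = 1.29           # 129% year-over-year increase

# Implied Q2 2015 count: 4,919 / (1 + 1.29) ~= 2,148 attacks
q2_2015_attacks = q2_2016_attacks / (1 + yoy_increase)

median_2016_gbps = 3.85       # median attack size in Q2 2016
median_decline = 0.36         # reported 36% drop

# Implied earlier median: 3.85 / (1 - 0.36) ~= 6.02 Gbps
earlier_median_gbps = median_2016_gbps / (1 - median_decline)

print(f"Implied Q2 2015 attack count: {q2_2015_attacks:,.0f}")
print(f"Implied earlier median size: {earlier_median_gbps:.2f} Gbps")
```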

“Arbor Networks’ recent 1H 2016 DDoS report aligns with much of what Akamai’s State of the Internet report found. The size, scale and frequency of DDoS attacks continue to grow at an alarming rate. Through the first half of 2016, Arbor’s Active Threat Level Analysis System (ATLAS) recorded a peak attack size of 579 Gbps, a 30% increase in attack volume, and 274 attacks over 100 Gbps, compared to 223 in all of 2015,” Gary Sockrider, Principal Security Analyst at Arbor Networks, told Help Net Security.

“The complexity of DDoS attacks has also been on the rise. As reported by security researcher Bruce Schneier, it appears that nation-states have been probing critical pieces of internet infrastructure with extremely complex DDoS attacks. This rise in complexity, along with the continued increase in the size, scale and frequency of attacks, reinforces the need for enterprises to deploy multi-layer DDoS defenses that can detect and mitigate both volumetric and stealthy application-layer attacks,” Sockrider added.
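As a rough illustration of the application-layer half of such a multi-layer defense, here is a minimal per-client token-bucket throttle of the kind commonly placed in front of an origin. It is a generic sketch, not Akamai's or Arbor's implementation, and the rates shown are arbitrary:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-client token bucket: a common application-layer throttle
    used as one layer in a multi-layer DDoS defense."""

    def __init__(self, rate_per_sec=10, burst=20):
        self.rate = rate_per_sec   # sustained requests/second allowed
        self.burst = burst         # short bursts tolerated
        self.tokens = defaultdict(lambda: burst)
        self.last_seen = defaultdict(time.monotonic)

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last_seen[client_ip]
        self.last_seen[client_ip] = now
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens[client_ip] = min(
            self.burst, self.tokens[client_ip] + elapsed * self.rate
        )
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True
        return False  # over budget: drop or challenge the request

limiter = TokenBucket(rate_per_sec=10, burst=20)
if not limiter.allow("203.0.113.7"):
    pass  # e.g. return HTTP 429 or serve a challenge page
```

A volumetric defense layer would sit upstream of anything like this, scrubbing flood traffic before it ever reaches the application.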


The two most popular attack vectors, SQLi and LFI, accounted for nearly 90% of observed web attacks.

Web Application Attacks

  • Q2 2016 showed a 14 percent increase in total web application attacks from Q1 2016.
  • Brazil, the top country of origin for all web application attacks, experienced a 197 percent increase in attacks sourced from the region.
  • The United States, ranked second among countries for total web application attacks, saw a 13 percent decrease in attacks compared to Q1 2016.
  • SQL Injection (44 percent) and Local File Inclusion (45 percent) were the two most common attack vectors in Q2.

“Web applications remain the most vulnerable entry point for any company and organization, so hackers will continue exploiting them to get in. I am a little bit surprised by the high number of LFI attacks, as they are not as popular as XSS, for example. That can probably be explained by malicious bots and crawlers that are targeted at exploiting them rather than the less dangerous XSS attacks,” said Ilia Kolochenko, CEO of High-Tech Bridge.
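For readers less familiar with the two vectors: SQLi smuggles query syntax in through user input, while LFI tricks a file-serving endpoint into reading paths outside its intended directory. A minimal sketch of the standard defenses against each, with illustrative names and paths not drawn from the report:

```python
import os
import sqlite3

# --- SQL injection: keep user input out of the query string. ---
def find_user(conn: sqlite3.Connection, username: str):
    # The ? placeholder binds the input as data, never as SQL syntax.
    return conn.execute(
        "SELECT id, username FROM users WHERE username = ?", (username,)
    ).fetchone()

# --- Local file inclusion: resolve and confine requested paths. ---
WEB_ROOT = os.path.realpath("/var/www/static")  # illustrative document root

def safe_read(requested: str) -> bytes:
    # realpath() collapses ../ sequences and symlinks; then verify the
    # result is still inside the permitted root before opening it.
    full = os.path.realpath(os.path.join(WEB_ROOT, requested))
    if not full.startswith(WEB_ROOT + os.sep):
        raise PermissionError("requested path escapes the web root")
    with open(full, "rb") as f:
        return f.read()
```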

Bot Traffic Analysis

  • During one 24-hour period in Q2, bots accounted for 43 percent of all web traffic across the Akamai Intelligent Platform.
  • Detected automation tools and scraping campaigns represented 63 percent of all bot traffic, a 10 percent increase from Q1 2016. These bots scrape specific websites or industry segments without identifying their intentions or origin (a toy classification sketch follows this list).
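The declared/undeclared distinction can be made concrete with a toy User-Agent check. This is a deliberately naive sketch with hypothetical token lists; production bot management also weighs behavioral and network signals, since the User-Agent header is trivially spoofed:

```python
# Hypothetical token lists for illustration only.
DECLARED_BOT_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot")
AUTOMATION_TOKENS = ("curl", "wget", "python-requests", "scrapy", "headless")

def classify(user_agent: str) -> str:
    ua = user_agent.lower()
    if any(tok in ua for tok in DECLARED_BOT_TOKENS):
        return "declared bot"    # self-identifying crawlers
    if any(tok in ua for tok in AUTOMATION_TOKENS):
        return "undeclared bot"  # automation that hides its intent
    return "likely human"

print(classify("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # declared bot
print(classify("python-requests/2.31.0"))                   # undeclared bot
```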

“To those of us experienced in combating the bot problem, it’s gratifying to hear confirmation of what we already knew: that nearly half the traffic on the Internet is bots,” said Stephen Singam, Managing Director, Security Research at Distil Networks.

“The bot problem has many facets that worry companies; the primary one is telling which bots are good and which are bad. Everybody welcomes a good bot (like those from popular search engines) on their site because it helps the business, but nobody wants bad bots or advanced persistent bots (APBs) performing nefarious tasks that damage the business. This report states that only 28% are known declared bots, indicating that a large amount of infrastructure resource is wasted on daily traffic from undeclared tools and bad bots. In reality, bots are indiscriminately scraping content and prices, brute forcing account credentials, running vulnerability scanners, and attempting credit card fraud,” said Singam.

“At a high level, this report acknowledges the problem but doesn’t go deeper to understand what those bots are doing on a website. It also raises the question of whether Akamai is charging companies for all this unwanted bot traffic,” he concluded.
