The Defense Advanced Research Projects Agency (DARPA) is looking for new program analysis techniques and tools that will enable analysts to identify vulnerabilities in algorithms implemented in software used by the US government, military, and economic entities. The agency has announced it will accept research proposals on the subject until October 28.
The Space/Time Analysis for Cybersecurity (STAC) program is particularly geared toward discovering vulnerabilities in the space and time resource usage behavior of algorithms, including those stemming from algorithmic complexity and those exploitable in side channel attacks.
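To make the notion of an algorithmic-complexity vulnerability concrete, here is a minimal sketch (not from the announcement; the function names are hypothetical): two functionally identical duplicate checks whose time usage diverges sharply, so that an attacker who controls the input length controls the cost of the first one.

```python
def contains_duplicate_quadratic(items):
    """Naive O(n^2) scan: the nested loops make running time grow
    quadratically with input size, so attacker-supplied large inputs
    can consume crippling amounts of time (a denial-of-service risk)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def contains_duplicate_linear(items):
    """Equivalent O(n) check using a set: same answer, bounded cost."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both functions compute the same result; only a resource-usage analysis, not a functional one, distinguishes the exploitable version from the safe one.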
“As new defensive technologies make old classes of vulnerability difficult to exploit successfully, adversaries move to new classes of vulnerability,” the agency pointed out in the announcement. “Vulnerabilities based on flawed implementations of algorithms have been popular targets for many years. However, once new defensive technologies make vulnerabilities based on flawed implementations less common and more difficult to exploit, adversaries will turn their attention to vulnerabilities inherent in the algorithms themselves.”
The agency is set on finding those vulnerabilities first.
“Developing new program analysis techniques to find algorithmic resource usage vulnerabilities will not be easy,” they noted. “Analyses will likely need to find paths leading from adversary-controlled inputs to looping structures that could potentially consume crippling amounts of space or time, and paths leading from secrets to outputs whose space or time usage could potentially leak information. Analyses will also have to predict bounds on space and time usage precisely enough to distinguish between cases where denial of service or leaks are possible and cases where they are not.”
The agency predicts that a “perfect” automated tool capable of answering non-trivial questions about program behavior will be impossible to develop, but believes that combining semi-automated analyses may prove useful.
“The STAC program seeks advances along two main performance axes: scale and speed,” they explained. “Scale refers to the need for analyses that are capable of considering larger pieces of software, from those that implement network services typically in the range of hundreds of thousands of lines of source code to even larger systems comprising millions or tens of millions of lines of code. Speed refers to the need to increase the rate at which human analysts can analyze software with the help of automated tools, from thousands of lines of code per hour to tens of thousands, hundreds of thousands, or millions of lines of code per hour.”