Machine vs. Machine: Lessons from the First Year of Cyber Grand Challenge
Mike Walker, DARPA
In 2014 DARPA launched the Cyber Grand Challenge: a competition that seeks to create automatic defensive systems capable of reasoning about flaws, formulating patches and deploying them on a network in real time. By acting at machine speed and scale, these technologies may someday overturn today’s attacker-dominated status quo. Just as the first autonomous ground vehicles fielded during DARPA’s 2004 Grand Challenge were not initially ready to take to the highways, the first generation of automated network defense systems will not be able to meaningfully compete against expert analysts or defend production networks. The Cyber Grand Challenge aims to give these groundbreaking prototypes a league of their own, allowing them to compete head-to-head to defend a network of bespoke software.
In this talk, I will describe the experimental setup, measurement, and results of the qualifying year of the DARPA Cyber Grand Challenge. I will cover in depth the DECREE platform, developed specifically for the first automated CTF competition, focusing on the experiments it enables, its relevance to funded research in the computer security domain, and its accessibility to program analysis. On June 3, 2015, a 24-hour qualifying event measured CGC systems in competition and helped select up to seven finalists. I will provide a statistical analysis of the results of the CGC qualifying round, including visualization of automated defensive techniques, their successes and failures, rates of efficacy, and other findings backed by post-event reverse engineering. Finally, I will close the talk with some initial thoughts on experimentation in the domain of adversarial automation.
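To make the kind of task a CGC system faces more concrete, the sketch below is a purely illustrative toy, not part of DECREE or any competitor system: a small service with a stack buffer overflow, together with the bounds check an automated defensive system might synthesize as a patch. It uses ordinary POSIX I/O rather than the actual DECREE system-call interface, and all names (handle_request, NAME_MAX_LEN) are hypothetical.

/* Hypothetical illustration only: a toy vulnerable service and the
 * kind of machine-generated bounds check a patch might insert.
 * Ordinary POSIX I/O is used here, not the DECREE interface. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define NAME_MAX_LEN 32

static void handle_request(const char *input, size_t len)
{
    char name[NAME_MAX_LEN];

    /* Original flaw: an unchecked copy, e.g. memcpy(name, input, len); */

    /* Sketch of an automatically formulated patch: clamp the copy
     * to the destination buffer before writing. */
    if (len > sizeof(name) - 1)
        len = sizeof(name) - 1;
    memcpy(name, input, len);
    name[len] = '\0';

    printf("hello, %s\n", name);
}

int main(void)
{
    char buf[256];
    ssize_t n = read(STDIN_FILENO, buf, sizeof(buf));
    if (n > 0)
        handle_request(buf, (size_t)n);
    return 0;
}

A human analyst would write such a check by inspection; the point of CGC is to have machines find the flaw, formulate the check, and deploy it across a network at machine speed.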
Mike Walker joined DARPA as a program manager in January 2013. His research interests include machine reasoning about software in situ and the automation of application security lifecycles. Prior to joining DARPA, Mr. Walker worked in industry as a security software developer, Red Team analyst, enterprise security architect, and research lab leader. As part of the Computer Sciences Corporation "Strikeforce" Red Team, Mr. Walker helped develop the HEAT Vulnerability Scanner and performed Red Team engagements. Serving as a principal at the Intrepidus Group, Mr. Walker worked on Red Teams that tested America's financial and energy infrastructure for security weaknesses. On the DARPA SAFER Red Team, Mr. Walker discovered flaws in prototype communications technologies. Mr. Walker has served in various roles in numerous applied computer security competitions. He contributed challenges to DEF CON Capture the Flag (CTF) and competed on and helped lead CTF teams at the highest levels of international competition. Mr. Walker was formerly a mentor of the Computer Security Competition Club at Thomas Jefferson High School for Science and Technology (TJHSST).
Open Access Media
USENIX is committed to Open Access to the research presented at our events. Papers and proceedings are freely available to everyone once the event begins. Any video, audio, and/or slides that are posted after the event are also free and open to everyone. Support USENIX and our commitment to Open Access.
@inproceedings{walker_machine_2015,
author = {Mike Walker},
title = {Machine vs. Machine: Lessons from the First Year of Cyber Grand Challenge},
year = {2015},
address = {Washington, D.C.},
publisher = {USENIX Association},
month = aug
}