What We Learned from Event RPE-001: The Chameleon and the Snake

Overview

This RPE specifically targeted malware signature diversity and signature measurement for Microsoft Windows (x86 and amd64) in a simulated operational environment at a realistic pace. We challenged the participants to:

  1. As an 'attacker', create (through integration, enhancement, or from scratch) a single tool for altering the signature of an operational tool for Microsoft Windows without changing its functionality
  2. As a 'defender', create (through integration, enhancement, or from scratch) a single tool for the fully automated classification of an unknown Windows executable as malware or benign, as a variant of a known sample, or as attributable to a known group (based on previously established knowledge); a minimal sketch of this kind of signature measurement appears after this list
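
To make the two challenges concrete: the offense succeeds when the altered executable's signatures no longer match the original, while the defense succeeds when its measurements are robust enough to flag the file as a variant anyway. The short Python sketch below is a minimal illustration only, pairing an exact hash (SHA-256) with the fuzzy-hashing tool ssdeep that the defenders already had in their toolkit; the file names, the python-ssdeep bindings, and the 60-point similarity threshold are assumptions made for illustration, not details from the event.

  # Minimal sketch: exact vs. fuzzy signature measurement.
  # Assumes the python-ssdeep bindings are installed; file names are hypothetical placeholders.
  import hashlib
  import ssdeep

  def sha256_of(path):
      # Exact signature: flips completely if even one byte of the file changes.
      with open(path, "rb") as f:
          return hashlib.sha256(f.read()).hexdigest()

  def looks_like_variant(unknown_path, known_path, threshold=60):
      # Fuzzy signature: ssdeep.compare returns 0-100; a high score suggests
      # the unknown executable is a variant of the known sample.
      score = ssdeep.compare(ssdeep.hash_from_file(unknown_path),
                             ssdeep.hash_from_file(known_path))
      return score, score >= threshold

  if __name__ == "__main__":
      known, unknown = "known_sample.exe", "unknown_sample.exe"  # hypothetical paths
      print("exact hashes match:", sha256_of(known) == sha256_of(unknown))
      score, is_variant = looks_like_variant(unknown, known)
      print(f"ssdeep similarity {score}/100 -> variant of known sample: {is_variant}")

In practice the defenders combined several such measurements (sandbox behavior from Cuckoo, disassembly features via Capstone, and so on) rather than relying on a single fuzzy hash, but the core contest was exactly this: change every measurable signature without changing behavior, or measure similarity in a way that survives those changes.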

This RPE was treated as a force-on-force exercise where each side has a specific goal:

  • DEFENSE – The 'Good Guys' were a team of 3-5 computer network defense analysts, malware analysts, and live-forensics experts who specialized in Microsoft Windows. They were no strangers to programming, with a deep understanding of C#, C/C++, and Python/Ruby, and they had experience with traditional malware analysis tools such as Cuckoo Sandbox, Wireshark, ssdeep, and the Capstone Disassembler.
  • OFFENSE – A fictitious hacker collective of 3-5 individuals who specialized in malware, the Microsoft Portable Executable (PE) format, the Windows API, and malware techniques and communications.

Each side could have selected a leader, but we explicitly left this decision up to the participants. DreamPort personnel provided exercise control (EXCON), operating in a manner that ensured each side got the information and input it needed, and evaluated the outcome without unduly influencing either side. We suggested that this RPE exclude commercial products but welcomed commercial participants, as long as they did not violate company NDAs or acceptable use policies by using trade secrets or patented technologies.

This RPE was entirely unclassified. All tools and techniques used were also unclassified. Engagement details, teaming, equipment, scoring, and success criteria details were provided upon registration for the event.

Observations

  1. Participants were surprised by the level of evaluation they were subjected to. Our plan was to maintain a similar (but not equivalent) level of evaluation for all activities. We wanted to establish a level of excellence while not excluding parties.
  2. There was too much downtime for one side relative to the other in this engagement. This needs to be addressed so that each side (in an offense-vs.-defense scenario) stays active as much as possible.
  3. No participant really understood the purpose of the published malware samples prior to the start of the event. We recognize we could have been more forthcoming about their purpose and intent, but we wanted to maintain neutrality; we felt that telling participants to train models and build data files from these samples would be 'giving away the answer'.
  4. Where possible, future RPEs should include a 'new to the business' bracket (e.g., Team 4). Possibly the most effective way to build the brand is to get people who are interested in the field involved and scored, perhaps with a handicap.
  5. Three days of such long hours was too much; it only worked when people were willing to code while onsite. In the future, we should have a category or rating system for level of effort (LoE) so we can communicate the expected LoE to prospective participants.
  6. A junior team trained on scoring should be present at future events.
  7. A script describing what to expect and when should be provided to participants at future events.

Lessons Learned and Takeaways

  • 100% of participants were either satisfied or very satisfied with the venue.
  • 91% of participants are likely or very likely to participate in future RPEs.
  • 100% of participants reported that they learned something new in this RPE.
  • 37% of participants reported that they invented something new in this RPE.
  • 55% of participants reported that they created a new capability or enhanced functionality during the RPE.
  • 6% of participants reported that they have a tool or capability used during the RPE that they would like to present to USCYBERCOM.
  • 100% of participants reported they had fun throughout the RPE.
  • Participants asked that we provide more detail on the scoring metrics and objective weights, as well as scoreboards to keep track of team progress.

Results

  • Offense Winner: DRAPER
  • Defense Winner: Northrop Grumman XETRON
  • Honorable Mention: CrowdStrike

Congratulations to all of the participants!

Special Thanks To

UMBC Training Centers is a premier provider of professional and technical training for individuals, businesses, non-profit organizations and government agencies. UMBC Training Centers is a part of the University of Maryland, Baltimore County (UMBC) and is organized as a not-for-profit organization owned by UMBC.

UMBC Training Centers programs are non-credit, and its students do not require academic admission to UMBC. Learn more at http://www.umbctraining.com.

Federal Business Council, Inc. (FBC) specializes in producing conferences and trade show events at Federal Government locations throughout the United States. Each month thousands of federal employees attend FBC events to evaluate the latest advances in technology, military hardware, training, and other product areas, as well as update their sources for future requirements.

Founded in 1976, FBC has conducted more than 4,000 on-site expositions and conferences for the Department of Defense, Intelligence Community, and civilian agencies. Over the last 40 years, FBC has become a comprehensive resource for marketing to the Federal Government. To learn more, visit https://www.fbcinc.com/.

Scores & Photos

See updated scores here!

Check out photos from the RPE-001 Social here!

Questions?

For any RPE-001 questions or concerns, please contact us.

Participants

CrowdStrike
CyberPoint International
Draper
Duke University
Eccalon
Northrop Grumman XETRON
Palo Alto Networks
Red Hat