RPE-018: Binary Signature Diversity



Date: June 28-29, 2022 | Location: Virtual or DreamPort Facility in Columbia, MD

And the winners are…

BigBear AI from the blue team.


Heilig Defense from the red team.


Congratulations to both teams on winning this RPE!


Though prior events in 2018 (RPE-001, "The Chameleon and the Snake") and 2019 (RPE-005, "The Chameleons and the Snakes") were successful, binary signature diversity remains a constant challenge. Worldwide, attackers and defenders play a cat-and-mouse game of altering signatures to avoid detection and detecting those altered signatures.

This event focuses on tools and techniques for signature diversity and signature measurement across Microsoft Windows (x86 and amd64), Linux (x86 and amd64), OS X (amd64), and Android (x86 and ARM) platforms. Samples can include both binaries and scripts (e.g., JavaScript). Participants will play either offense or defense.

This RPE directly addresses the following U.S. Cyber Command technical challenge problems:

  • 2020 1.6 CHALLENGE PROBLEM: Polymorphic Malware/Countering Adversarial Signature Diversity
  • 2020 1.10 CHALLENGE PROBLEM: Survivability/Implementing Signature Diversity for Offense

Event Flow: Offense

Once registration closes, all offensive teams will be given a library of malware samples spanning the platforms above. This library will consist mostly of binaries/scripts in the public domain, but some novel "malware" (developed by DreamPort) will be provided in binary and source code form. Expect this library to comprise a couple dozen different malware families in total.

During the event, offensive teams will be challenged to:

  • Create a tool (through integration, enhancement, or from scratch) that, given a sample, can generate modified samples that exhibit on-disk and in-memory signature diversity without modifying functionality.
  • Use this tool to generate two modified samples for each round of the competition. Offensive teams are free to choose from any of the samples provided prior to the event, including the novel "malware" provided in binary and source code form.

Binaries for selected public domain malware, and both source code and binaries for DreamPort-created "malware", are provided to all offense teams to allow teams to demonstrate both binary and source code signature diversity techniques. Both types of techniques are interesting, but teams are not required to demonstrate both.
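To illustrate the lower bound of this challenge: even the simplest on-disk diversity trick, appending random bytes past the end of a binary, changes every cryptographic hash while (for most PE and ELF loaders) leaving execution untouched. The minimal sketch below, with hypothetical file paths, demonstrates that baseline; a competitive tool would need to go much further, since padding alone does nothing about in-memory or fuzzy signatures.

```python
import hashlib
import os
import shutil

def diversify_by_padding(src_path: str, dst_path: str, pad_len: int = 64) -> str:
    """Copy a binary and append random bytes, changing its hash.

    Bytes appended to a PE's overlay (or past the end of an ELF's
    mapped segments) are normally ignored by the loader, so the copy
    still runs -- but this defeats only exact-hash matching, not
    in-memory or fuzzy/structural signatures.
    """
    shutil.copyfile(src_path, dst_path)
    with open(dst_path, "ab") as f:
        f.write(os.urandom(pad_len))       # random tail: new hash every run
    with open(dst_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

Because the tail is random, each run against the same input yields a distinct hash, which is one of the offensive evaluation criteria; the in-memory signature criterion requires techniques well beyond this sketch.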

Event Flow: Defense

Once registration closes, defensive teams will be provided with at least one binary sample of each novel "malware" developed by DreamPort and given to the offense. Defensive teams are NOT provided with the public domain malware samples given to the offense.

During the event, for each of the one-hour rounds of the competition, defensive teams will be provided with a "security alert" containing a random mix of:

  • modified samples generated by offensive teams
  • public domain malware samples (known hash)
  • slightly permuted public domain malware samples (unknown hash)
  • benign binaries (known hash)

Defensive teams are challenged to create (through integration, enhancement, or from scratch) a single tool for the fully automated classification of samples as follows:

  • Identify samples as malicious or benign
  • Attribute each malicious sample as a variant of a known malware sample
  • Attribute each malicious sample to a known malware family
  • Attribute multiple malicious samples to an offensive team (i.e., offensive teams' diversity tools may introduce their own signature, tying together original samples spanning different malware families)

Defensive teams are asked NOT to submit any competition samples to third-party Internet services, e.g., VirusTotal. However, searching the Internet for hash values is completely acceptable. Ideal solutions will be capable of operation on isolated networks, i.e., no dependencies on public cloud services.
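As a toy illustration of fully offline triage in the spirit described above, the sketch below combines an exact known-hash lookup with a crude byte-bigram cosine similarity to flag likely variants of known samples. The sample sets, labels, and similarity threshold are all hypothetical assumptions for this example; real solutions would rely on far richer features (Yara rules, sandbox behavior, machine learning models).

```python
import hashlib
from collections import Counter
from math import sqrt

def byte_bigrams(data: bytes) -> Counter:
    """Frequencies of adjacent byte pairs: a crude content signature."""
    return Counter(zip(data, data[1:]))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bigram frequency vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def triage(sample: bytes, known_hashes: dict, known_profiles: dict,
           threshold: float = 0.9):
    """Return (verdict, label) for one sample, fully offline.

    known_hashes:   sha256 hex digest -> label (hypothetical corpus)
    known_profiles: label -> byte_bigrams() of a reference sample
    threshold:      hypothetical tuning parameter for variant matching
    """
    digest = hashlib.sha256(sample).hexdigest()
    if digest in known_hashes:                     # exact known hash
        return "known", known_hashes[digest]
    sig = byte_bigrams(sample)
    best = max(known_profiles,
               key=lambda k: cosine(sig, known_profiles[k]), default=None)
    if best is not None and cosine(sig, known_profiles[best]) >= threshold:
        return "variant", best                     # likely permuted sample
    return "unknown", None
```

Note that nothing here touches the network, consistent with the requirement that ideal solutions run on isolated networks.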

Remote or In-Person Participation

Teams can participate in person at the DreamPort facility in Columbia or virtually. In-person teams must bring their own equipment; DreamPort will provide Internet access via WiFi. Teams may not have more than five people on site at DreamPort, and on-site participants must be identified in advance. Lunch and light refreshments will be provided for in-person attendees.


Evaluation

Participants will be evaluated according to the type of team they bring (or join): offense or defense. A separate list of scores will be maintained for each side.

Defenders must submit an analysis report for each sample detailing answers to these questions:

  • Is the sample malicious or benign?
  • Is the sample a variant of a known malware sample?
  • Is the sample a variant of a known malware family?
  • Can the sample be attributed to other samples in previous competition rounds?

Offensive teams will be evaluated by:

  • Can the defense attribute your modified sample to the original sample?
  • Can the defense attribute your modified sample to its malware family?
  • Is your modified sample excessively larger than the original sample (defined as more than 2.5x the size of the original sample)?
  • Do you use a known technique for alteration? Did you invent your own?
  • Does your modified sample require elevated privileges to invoke (where the original did not)?
  • Does your modified sample require a separate process/approach to invoke?
  • Does your modified sample run without any additional steps (e.g., unpacking)?
  • Does your modified sample have a different in-memory signature than the original sample?
  • Can the defense correctly identify your sample as malicious?
  • Does your tool produce a unique hash for each run against a given input?
  • Can your modified samples be attributed to the same actor, i.e., does your tool have its own signature?
  • Did your team demonstrate both binary and source code modification techniques?

Machine Learning

As in RPE-005, consideration will be given to any defensive team that can demonstrate, in a one-on-one interview, its use of machine learning for the identification, classification, or attribution of submitted samples. We will not require teams to reveal details of their algorithms or features; a demonstration is sufficient.

Suggested Skills

This RPE requires participants to have at least intermediate-level skills in:

  • Python
  • C/C++
  • Java
  • ELF, PE, DEX/ODEX, and JAR file formats
  • Malware Analysis
    • Automated Analysis Systems (CRITs, Cuckoo)
    • Yara
    • Sandbox Execution
    • Reverse Engineering (IDA Pro, Binary Ninja, Ghidra, Radare)

There are two distinct technology paths we are searching for here. For the offense, we are interested in a single technology that can alter the signature of multiple executable formats (e.g., ELF, PE) without altering functionality. For the defense, we are interested in an extensible, automated solution that can perform classification, at least triage-level analysis, and attribution for suspicious files or artifacts. We are especially interested in defensive solutions that can extend their behavior with minimal design changes (e.g., plugins, scripts).


Schedule

We have chosen the following schedule of events:

  • 2 May 2022: RPE Announced Publicly
  • 17 June 2022: RPE Registration Closes
  • 22 June 2022: RPE Virtual Q&A Session
  • 28–29 June 2022: RPE Main Event


Prizes

DreamPort will award the top offensive team and the top defensive team $12,500 each.