How do I register for the contest?
Complete the registration form here. You will receive an automated reply confirming your registration. Design artifacts will be sent out Friday around 10 AM PDT.
What is the submission deadline?
Results/findings must be submitted by Saturday August 13th at 6PM PDT to be considered as part of the contest. Late submissions may be considered at the discretion of the judges.
How do I submit results?
Go to our submission page for details on how to submit results.
How should results be structured?
We request that all submissions to be judged be in the JSON format described in this FAQ (see Example JSON Format and JSON schema, below). This allows us to partially automate the submission intake process, maximizing the number of submissions we're able to handle.
If you create any supporting files in order to produce your list of findings, we encourage you to submit those as well. Any such supporting files may be considered by the judges when reviewing your submission.
You are welcome to submit non-JSON-formatted results, but they might not be accepted by the judges. We only commit to judging results in the defined format; if we are able, we may also judge submissions in other formats. This will depend greatly on the number of submissions received and the number of threats within those submissions requiring review.
If you submit findings in the requested JSON format but you structured individual findings differently before converting them into our JSON format, please submit your raw format as well. We would love to collect differences between modelers and share our learnings with the greater Threat Modeling community.
What is the JSON format?
See below for an example of a properly formatted submission, and a JSON schema for submissions. Only the Threat Name/Summary and Threat Description fields are required. The schema allows any number of custom fields in each threat.
You can use a site like https://jsonformatter.curiousconcept.com/ to validate your JSON before submitting.
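If you prefer to check well-formedness locally instead of pasting your file into a website, any JSON parser will do. A minimal Python sketch using only the standard library (the filename `submission.json` is an assumption, not a contest requirement):

```python
import json

def is_valid_json(path: str) -> bool:
    """Return True if the file at `path` parses as JSON, False otherwise."""
    try:
        with open(path, encoding="utf-8") as f:
            json.load(f)
        return True
    except (OSError, json.JSONDecodeError):
        # Missing file or malformed JSON both count as invalid here.
        return False

# Example usage: is_valid_json("submission.json")
```

Note this only confirms the file is syntactically valid JSON; it does not check your submission against the contest schema.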
Example JSON Format
What are these CustomFields?
We don't know - you tell us ;) The contest includes two parts - identifying threats and documenting them. CustomFields allow you to document the threat however you want, using whatever terminology you would like to use. If you use CustomFields, we ask that you provide the judges some details to help them interpret your findings as you intend: use Name to define a name for your field, FieldDescription to define what the field is and how you will use it, and Value to document the value you are assigning to that CustomField for the threat.
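Since the full schema is not reproduced in this answer, the top-level key names below (`ThreatName`, `ThreatDescription`) are assumptions for illustration; only the Name/FieldDescription/Value structure of a CustomField comes from the description above, and all values are placeholders rather than suggested content. A minimal Python sketch that emits one finding as JSON:

```python
import json

# Structural sketch of a single finding. The top-level key names are
# illustrative assumptions; the CustomField values are placeholders,
# not suggested answers.
finding = {
    "ThreatName": "<short summary of the threat>",       # required
    "ThreatDescription": "<detailed description>",       # required
    "CustomFields": [                                    # optional, any number
        {
            "Name": "<your field name>",
            "FieldDescription": "<what this field is and how you use it>",
            "Value": "<the value you assign for this threat>",
        }
    ],
}

print(json.dumps(finding, indent=2))
```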
Can you give examples of a CustomField?
We could give you some explicit examples, but then everyone would include them, which would defeat some of the objectives of this event.
How will submissions be judged?
Submissions will be judged on overall quality, using the following criteria:
Things that we really care about (but not limited to):
- Good documentation of threats.
- The total count of plausible threats.
- Results that are more actionable for a development team.
  - To make results more actionable, you may need to be creative and add CustomFields to your threats to include additional data.
- Identification of discrepancies between diagrams and their associated flow descriptions.
Things to avoid:
- Duplicate findings that differ only in location. For example, if you discover a vulnerability that applies to multiple locations in the system, cite all the locations as a single finding; repeating the finding over and over makes the judges' job more difficult.
- Duplicating the same finding multiple times throughout the document, possibly with slightly different words.
- Findings that require a particular tool to read (e.g., a 100,000-line file that can only be opened by a particular application the judges might not have); such findings are unlikely to be evaluated and may invalidate your submission.
The goal of this event is to test your threat modeling skills - for you to identify threats in the provided design and document them using the recommended format. While there are tools that model applications and identify threats automatically, using them is against the spirit of the event. We want to see how you have internalized the design and what threats you are able to identify. How such submissions are handled is at the judges' discretion.
Wow, you are being really vague about judging.
That is by design (pun intended!). Threat modeling enables engineering team members - software engineers, quality assurance, managers, and customer support - to make informed decisions about their system's security and privacy. The better your submission does this, the more likely you are to win.
We want to see different threat modelers’ approaches, what assumptions they make, and how they structure their results for development teams.
How many contestants are allowed to register/submit results?
We are not putting any restrictions on the number of registrations or submissions.
Our ability to process all submissions will depend greatly on the number of results we receive, the number of threats in those results, how close to the contest deadline results are submitted, etc. If we receive a small number of results, this is not likely to be an issue; if we receive a large number, we may need to be more judicious when reviewing valid submissions before results are due to DEF CON Contest & Events (e.g., only reviewing well-documented, easy-to-understand submissions that follow the recommended format).