CMIYC 2013 Post-game 2013-08-08 15:15

This is the first of several posts we'll make post-Crack Me If You Can 2013. Later we'll gather things up and add content to the main 2013 contest site.

In this post I'll talk a little about the structural changes we made in this year's DEFCON contest: what we did that we think worked well, and what worked not so well. We'd love feedback that we can use when planning future contests.

Structure

The most obvious change this year was the creation of Pro and Street divisions. Hash types and ratios, encrypted file types, and plaintext creation techniques were generally the same between them, but the plaintexts themselves were different. This was a way to try to have the hardcore teams pushing each other hard, while keeping things fun for smaller teams and more casual players--maybe people at DEFCON who actually want to see some of DEFCON.

The password hashes were then split into 8 different fictitious "CompanyN" subdirectories. Different hash types were spread asymmetrically across the different CompanyN directories, and there were relationships between the plaintexts within a given CompanyN directory. We'll do another post later that digs into those in more detail. I am curious whether any teams figured out those relationships, or whether you all just grouped all NT hashes together into one pile, all DES into another, etc., and discarded the relationship between the DES and NT hashes for Company1, and so on.
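If you just pooled everything by type, the layout made that easy. As a sketch (the CompanyN layout and the `<name>.<type>.txt` naming here are assumptions for illustration, not the exact release filenames):

```shell
# Build a tiny mock of the assumed layout: CompanyN/<name>.<type>.txt
mkdir -p Company1 Company2
printf 'hashA\n' > Company1/users.nt.txt
printf 'hashB\n' > Company2/users.nt.txt
printf 'hashC\n' > Company1/users.des.txt

# Pool every file of a given hash type into one <type>.txt pile,
# discarding the per-company relationships described above.
for f in Company*/*.*.txt; do
  type=$(basename "$f" | cut -f2 -d.)   # e.g. "nt" or "des"
  cat "$f" >> "$type.txt"
done
```

Doing it that way trades the per-company hints for simpler bookkeeping, which is exactly the trade-off in question.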

The encrypted file challenges had hints about some of the plaintexts for a certain CompanyN. More on those later, too.

Preparation

We were able to do a few things ahead of time that we think helped things go smoothly during the actual event:
  • Pre-registration opened more than a week before the contest start. That let people work out any issues with PGP, dealing with the autoresponder, etc early.
  • Test hashes were published several days before the contest, with a basic automated scoreboard. This also helped teams know what to expect, make sure they were submitting in the correct format, etc, before the pressure was on.
  • Pre-downloads of the hash sets and some of the challenges were available as encrypted archives before the contest start time. This way there was no rush to swamp our bandwidth at the beginning (and I think we did a better job explaining this ahead of time than we did in past years).
All of those helped make this the smoothest CMIYC contest yet.

Issues

We did have some issues at different times, most of which we were able to fix quickly.
  • Scoreboard issues - There were a couple of hash types that were not being counted correctly. The first we found and fixed before anyone noticed, but there was another one (or two?) that we did not catch until a team contacted us and said "Hey, we have cracked N of type XYZ but they aren't appearing," at which point we investigated and fixed it.
  • Submission issues - Despite the early registration, the test crack submissions, and the how-to instructions and examples that we try to improve every year, we still had teams forgetting how to submit properly. Sometimes it was just sloppiness--including hashes, config files, etc. in their submissions, or forgetting to sort -u their lists. Other times it was more involved than that. But at least the instances of this decrease every year.
  • Late downloads - We had wanted everything to be ready and downloadable ahead of time. But some challenge files weren't ready in time, and we had to release them a bit after the contest started (and scrapped some we had wanted to do, in the interest of time).
  • Few Encrypted File Cracks - Almost no encrypted challenge files were cracked. Either we made them too hard, or teams didn't see the value in dedicating resources towards them, or both. Maybe we should have emphasized the hints more. We'll probably publish them pretty soon (sooner than the plaintexts themselves). I'm curious to hear from the contestants on the encrypted files; did you try a lot but not get far, or not bother?
  • Fewer Password Cracks Than Expected - Overall, fewer plaintexts were cracked than we were anticipating. Maybe that means we made them harder than intended. But, that's better than the opposite; if you all cracked everything in the first 12 hours, the rest of the contest would be a little bit boring.
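For the sloppiness cases in particular, a quick cleanup pass before mailing catches most of it. A minimal sketch (the filenames and the hash:plaintext line format here are assumptions; adjust to the actual submission instructions):

```shell
# Mock raw cracker output: a duplicate entry plus a stray non-hash line.
printf 'AAAA1111:password1\nAAAA1111:password1\nsome config junk\n' > raw_cracks.txt

# Keep only lines that look like hash:plaintext, then dedupe with sort -u
# before submitting.
grep -E '^[0-9A-Fa-f$./]{8,}:' raw_cracks.txt | sort -u > submission.txt
wc -l < submission.txt   # sanity-check the count before mailing
```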

Coming Up Next

Over the coming weeks we will be collecting writeups from the teams that competed and updating the contest site with them. We'll also do more blog posts that go into more details about certain aspects of the 2013 competition from our perspective.

1 comment | Posted by Hank at 15:15

grutz wrote at 2013-08-11 18:11:

Thanks for the awesome contest again and for breaking things up into pro/street! In order to maximize points and time I personally grouped all the hashes together, not caring about the companies or their policies:

$ for n in $(find . -type f | cut -f3 -d. | sort -u); do find . -name \*.$n.txt -exec cat {} >> $n.txt \; ; done

This was mostly because I knew I wouldn't be spending that much time on the contest and hoped my wordlists+combos would bring me better luck. When you announced that Challenge9 was worth 250,000 points I jumped on it in a heartbeat. After getting one of the PFX passwords I just didn't have enough time to take it further.

During our tests we mostly see and crack DES, $1$ MD5, and NTLM hash types. The slower ones such as OS X 10.8, SunMD5, Blowfish, etc. are rare, so the contest gives me more experience with these other types in a "real world" scenario, as well as letting me test out and tweak our cracking environment.

Looking forward to next year!


