Leaving the malware world for a moment, the text editor I’m using (gedit, the GNOME text editor) includes gedit.c, a file of 295 LOC, and gedit.c is only one of 128 source files (along with 3 more directories) published in the GNOME Git source code repository for gedit.[] Counting the LOC across all 128 files and 3 directories yields 70,484. The ratio of legitimate application LOC to malware LOC is over 500 to 1. Compared to a fairly straightforward tool like a text editor, an average malware sample seems very efficient!
Mudge’s 125 LOC figure seemed a little low to me, because different definitions of “malware” exist. Many malicious applications exist as “suites,” with many functions and infrastructure elements. To capture this sort of malware, I tallied what you could reasonably consider to be the “source” elements of the Zeus Trojan (.cpp, .obj, .h, etc.) and arrived at 253,774 LOC. Comparing a program like Zeus to one of Mudge’s average samples yields a ratio of over 2,000 to 1.
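If you’d like to reproduce this sort of tally yourself, a minimal sketch follows. It is an illustration only, not the tooling behind the gedit or Zeus figures above; the set of source extensions is an assumption, and a dedicated counter such as cloc will do a more careful job.

# Minimal LOC-counting sketch: an illustration only, not the tooling
# behind the gedit or Zeus figures quoted above. The extension set is
# an assumption; dedicated counters such as cloc are more careful.
import os
import sys

SOURCE_EXTENSIONS = {".c", ".h", ".cpp", ".hpp"}

def count_loc(root):
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in SOURCE_EXTENSIONS:
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="replace") as f:
                    # Count non-blank lines; real counters also strip comments.
                    total += sum(1 for line in f if line.strip())
    return total

if __name__ == "__main__":
    print(count_loc(sys.argv[1] if len(sys.argv) > 1 else "."))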
Mudge then compared malware LOC with counts for security products meant to intercept and defeat malicious software. He cited 10 million as his estimate for the LOC found in modern defensive products. To make the math easier, I imagine there are products with at least 12.5 million LOC, bringing the ratio of defensive LOC to offensive LOC into the 100,000 to 1 range. In other words, for every 1 LOC of offensive firepower, defenders write 100,000 LOC of defensive bastion.
Mudge also compared malware LOC to the operating systems those malware samples are built to subvert. Analysts estimate Windows XP to be built from 45 million LOC, and no one knows how many LOC went into Windows 7. Mudge cited 150 million as a count for modern operating systems, presumably thinking of the latest versions of Windows. Let’s revise that downward to 125 million to simplify the math, and we have a 1 million to 1 ratio for the size of the target operating system to the size of the malicious weapon capable of abusing it.
Let’s stop to summarize the perspective our LOC counting exercise has produced:
120:1. Stuxnet to average malware
500:1. Simple text editor to average malware
2,000:1. Malware suite to average malware
100,000:1. Defensive tool to average malware
1,000,000:1. Target operating system to average malware
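If you want to check the arithmetic, each ratio is simply the quoted size divided by the 125 LOC average. A small illustration follows, using the round figures from above; the Stuxnet size is back-computed from the 120:1 ratio rather than stated directly here.

# Round figures quoted in this foreword; the Stuxnet value is implied
# by the 120:1 ratio rather than stated directly.
AVERAGE_MALWARE_LOC = 125
FIGURES = {
    "Stuxnet": 15_000,
    "Simple text editor (gedit)": 70_484,
    "Malware suite (Zeus)": 253_774,
    "Defensive tool": 12_500_000,
    "Target operating system": 125_000_000,
}

for name, loc in FIGURES.items():
    print(f"{name}: roughly {loc / AVERAGE_MALWARE_LOC:,.0f} to 1")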
From a defender’s point of view, the ratios of defensive tools and target operating systems to average malware samples seem fairly bleak. Even swapping in the malware suite size for the average sample size doesn’t appear to improve the defender’s situation very much! It looks like defenders (and their vendors) expend a lot of effort producing millions of LOC, only to see it brutalized by nifty, nimble intruders sporting far fewer LOC.
What’s a defender to do? The answer is to take a page out of the playbook used by any leader who is outgunned—redefine an “obstacle” as an “opportunity”! Forget about the size of the defensive tools and target operating systems—there’s not a whole lot you can do about them. Rejoice in the fact that malware samples are as small (relatively speaking) as they are.
Imagine trying to understand how a defensive tool works at the source code level, where those 12.5 million LOC are waiting. That’s a daunting task, although some researchers assign themselves such pet projects. For one incredible example, read “Sophail: A Critical Analysis of Sophos Antivirus” by Tavis Ormandy,[] also presented at Black Hat Las Vegas in 2011. This sort of mammoth analysis is the exception and not the rule.
Instead of worrying about millions of LOC (or hundreds or tens of thousands), settle into the area of one thousand LOC or less, the place where a significant portion of the world’s malware can be found. As a defender, your primary goal with respect to malware is to determine what it does, how it manifests in your environment, and what to do about it. Given reasonably sized samples and the right skills, you have a chance to answer these questions and thereby reduce the risk to your enterprise.
If the malware authors are ready to provide the samples, the authors of the book you’re reading are here to provide the skills. Practical Malware Analysis is the sort of book I think every malware analyst should keep handy. If you’re a beginner, you’re going to read the introductory, hands-on material you need to enter the fight. If you’re an intermediate practitioner, it will take you to the next level. If you’re an advanced engineer, you’ll find those extra gems to push you even higher—and you’ll be able to say “read this fine manual” when asked questions by those whom you mentor.
Practical Malware Analysis is really two books in one: first, it’s a text showing readers how to analyze modern malware. You could have bought the book for that reason alone and benefited greatly from its instruction. However, the authors decided to go the extra mile and essentially write a second book. This additional tome could have been called Applied Malware Analysis, and it consists of the exercises, short answers, and detailed investigations presented at the end of each chapter and in the solutions appendix. The authors also wrote all the malware they use for examples, ensuring a rich yet safe environment for learning.
Therefore, rather than despair at the apparent asymmetries facing digital defenders, be glad that the malware in question takes the form it currently does. Armed with books like Practical Malware Analysis, you’ll have the edge you need to better detect and respond to intrusions in your enterprise or that of your clients. The authors are experts in these realms, and you will find advice extracted from the front lines, not theorized in an isolated research lab. Enjoy reading this book and know that every piece of malware you reverse-engineer and scrutinize raises the opponent’s costs by exposing his dark arts to the sunlight of knowledge.
Richard Bejtlich (@taosecurity)
Chief Security Officer, Mandiant and Founder of TaoSecurity
Manassas Park, Virginia
January 2, 2012