“I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.”
(excerpt from “The Road Not Taken” by Robert Frost)

DHS and MITRE had a big announcement yesterday. MITRE has developed a new system for scoring weaknesses in applications, as well as for combining that score with “business value context” to produce a risk estimate. Overall, the work is interesting, though perhaps more from an academic perspective than anything else. What I find interesting is that we’re going back down this road again (“trust” evaluation), which seems like it will inevitably lead to another gameable system.

There’s a common mythos perpetuated by many security vendors (or, at least, by their sales forces) that you can buy a tool, install it, and your problems will be solved. This mythos oftentimes short-circuits the problem-solving process, jumping to “solutions” without doing the earlier steps, such as defining the problem. More often than not we see this sales approach coupled with a heavy dose of FUD, intended to “prove” to a prospective customer that there is a great “risk” (term used incorrectly) that must be mitigated. If you buy their tool, then you’ll be saved! Or not, as the case more likely is…

It’s data breach report day today. Or, so it seems. My brain just ‘sploded on overload from all the fresh tasty stats received. There’s not enough time today to go through everything with a fine-toothed comb. Suffice to say:

- Data breaches are continuing to happen in growing numbers.
- Basic security practices still aren’t happening.
- As painful as it is to admit, it appears that regulations like PCI DSS are having a positive impact.
- Our codebase still leaves much to be desired, though there is reason to be a bit optimistic.

That said, here’s the goods:

- Verizon Business 2011 Data Breach Investigation Report
- Veracode 2011 “State of Software Security” Report
- Ponemon 2011 PCI DSS Compliance Trends Study

Incidentally, if you take[…]

In light of the recent Epsilon data breach, it seems appropriate to chat briefly about the realities of balancing information risk. First and foremost, we need to make sure that we understand this thing called “risk.” In our context, risk is defined as “the probable frequency and probable magnitude of future loss” (based on Jack Jones’ FAIR definition). Put into practical terms, risk is the likelihood that we’ll experience a negative event, combined with how bad that event would be. We then balance that against the cost of defending against various scenarios (i.e., trying to reduce or transfer that risk), with the goal being to optimize cost vs. benefit. Let’s look at a couple of practical examples.
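The cost-vs.-benefit balancing described above can be sketched in a few lines of code. This is a minimal illustration, not part of the original post: all of the dollar figures and frequencies below are hypothetical, invented purely to show the arithmetic of weighing a control's cost against its risk-reduction benefit.

```python
# Minimal sketch: risk as probable frequency x probable magnitude of loss
# (per the FAIR definition quoted above). All numbers are hypothetical.

def annualized_loss_expectancy(frequency_per_year: float, magnitude: float) -> float:
    """Expected loss per year = how often we expect the event x how bad it is."""
    return frequency_per_year * magnitude

# Hypothetical baseline: a breach scenario expected once every two years,
# costing $200,000 per occurrence.
baseline = annualized_loss_expectancy(frequency_per_year=0.5, magnitude=200_000)

# Hypothetical control costing $30,000/year that halves the loss frequency.
control_cost = 30_000
residual = annualized_loss_expectancy(frequency_per_year=0.25, magnitude=200_000)

# The control is worth buying only if the risk reduction exceeds its cost.
benefit = baseline - residual
worth_it = benefit > control_cost

print(f"baseline ALE: ${baseline:,.0f}")   # $100,000
print(f"residual ALE: ${residual:,.0f}")   # $50,000
print(f"benefit ${benefit:,.0f} vs cost ${control_cost:,} -> buy: {worth_it}")
```

The point is not the specific numbers but the shape of the decision: a control that costs more than the loss exposure it removes is a bad trade, no matter how scary the vendor's FUD.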

Unless you’ve been living under a rock for the past week, then you undoubtedly know that Japan was rocked a few days ago by an 8.9 magnitude earthquake (the 3rd largest in the past decade and top 10 overall – also check out the NYT’s before & after shots) and a subsequent tsunami that greatly compounded the ill effects of the disaster. Coming out of that incident, one of the most hyped “news” items has been the aftermath at the Fukushima nuclear power generation facility. It turns out (unsurprisingly) that much of this coverage has been faulty, inappropriately throwing around talk of “meltdowns” when, in fact, things are under control. For a great, detailed description of the entire incident,[…]

News flash: Those so-called “risk” labels/ratings included in pentest and vuln scan reports are NOT actually “risk” representations. I attended the OWASP Summit 2011 a couple of weeks back, and the topic of “risk metrics” and labels came up during one session. As a result, I led a break-out session on what risk really looks like in the macro sense, in accordance with formal methods, and where these various scan/test results really fit in. The session generated great conversation and highlighted for me a need to expand risk analysis training to a broader audience. Below is a picture of the taxonomy of factors that make up a FAIR (Factor Analysis of Information Risk) risk analysis. Putting aside the[…]
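To make the taxonomy discussion concrete, here is a short sketch of how the top-level FAIR factors compose. It covers only the top of the taxonomy (risk as loss event frequency times loss magnitude, with loss event frequency decomposed into threat event frequency times vulnerability); the deeper factors in the full picture are omitted, and the input numbers are hypothetical.

```python
# Sketch of the top of the FAIR factor taxonomy (illustrative only):
#   Risk = Loss Event Frequency (LEF) x Loss Magnitude (LM)
#   LEF  = Threat Event Frequency (TEF) x Vulnerability
# where Vulnerability is the probability that a threat event becomes
# a loss event. A scanner's "High" label speaks, at best, to one of
# these factors -- it is not a risk estimate by itself.

def loss_event_frequency(tef: float, vulnerability: float) -> float:
    """Loss events/year = threat events/year x P(threat event causes loss)."""
    return tef * vulnerability

def risk(lef: float, loss_magnitude: float) -> float:
    """Expected loss/year = loss events/year x loss per event."""
    return lef * loss_magnitude

# Hypothetical inputs: 4 threat events/year, 25% chance each causes a loss,
# $100,000 expected loss per event.
lef = loss_event_frequency(tef=4.0, vulnerability=0.25)
annual_risk = risk(lef, loss_magnitude=100_000)

print(f"LEF: {lef} loss events/year")        # 1.0
print(f"Risk: ${annual_risk:,.0f}/year")     # $100,000
```

Note how a scan finding informs, at most, the vulnerability factor; without frequency and magnitude estimates it cannot be called a risk rating.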