“I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I
I took the one less traveled by,
And that has made all the difference.”
(excerpt from “The Road Not Taken” by Robert Frost)

DHS and MITRE had a big announcement yesterday. MITRE has developed a new system for scoring weaknesses in applications, as well as for combining that score with “business value context” to produce a risk estimate. Overall, the work is interesting, though perhaps more from an academic perspective than anything else. What strikes me is that we’re going back down this road again (“trust” evaluation), which seems destined to lead to yet another gameable system.

(more…)


This entry was posted on Tuesday, June 28th, 2011 at 10:08 am by Ben Tomhave and is filed under news, standards.

 

There’s a common mythos perpetuated by many security vendors (or, at least, by their sales forces) that you can buy a tool, install it, and your problems will be solved. This mythos oftentimes short-circuits the problem-solving process, jumping to “solutions” without doing the earlier steps, such as defining the problem. More often than not, we see this sales approach coupled with a heavy dose of FUD, intended to “prove” to a prospective customer that there is a great “risk” (term used incorrectly) that must be mitigated. If you buy their tool, then you’ll be saved! Or not, as is more likely the case…

(more…)


This entry was posted on Thursday, May 26th, 2011 at 11:03 am by Ben Tomhave and is filed under Technology & Tool Thursday.

 

It’s data breach report day today. Or, so it seems. My brain just ‘sploded on overload from all the fresh tasty stats received. There’s not enough time today to go through everything with a fine-toothed comb. Suffice to say:

  • Data breaches are continuing to happen in growing numbers.
  • Basic security practices still aren’t happening.
  • As painful as it is to admit, it appears that regulations like PCI DSS are having a positive impact.
  • Our codebase still leaves much to be desired, though there is reason to be a bit optimistic.

That said, here’s the goods:

  1. Verizon Business 2011 Data Breach Investigation Report
  2. Veracode 2011 “State of Software Security” Report
  3. Ponemon 2011 PCI DSS Compliance Trends Study

Incidentally, if you take the combined results of these studies, one of the key takeaways ties in very nicely with this quote from the current Cloud Security Alliance (CSA) v2.1 Security Guidance: “A portion of the cost savings obtained by Cloud Computing services must be invested into increased scrutiny of the security capabilities of the provider, application of security controls, and ongoing detailed assessments and audits, to ensure requirements are continuously met.” (h/t Gunnar Peterson)


This entry was posted on Tuesday, April 19th, 2011 at 1:02 pm by Ben Tomhave and is filed under Tutorial Tuesday.

 

In light of the recent Epsilon data breach, it seems appropriate to chat briefly about the realities of balancing information risk. First and foremost, we need to make sure that we understand this thing called “risk.” In our context, risk is defined as “the probable frequency and probable magnitude of future loss” (based on Jack Jones’ FAIR definition). Put into practical terms, risk is the likelihood that we’ll experience a negative event. We then balance that out against the cost of defending against various scenarios (i.e., trying to reduce or transfer that risk), with the goal being to optimize cost vs. benefit. Let’s look at a couple practical examples.
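To make that cost-vs-benefit balance concrete, here is a minimal sketch in Python of the arithmetic implied by the FAIR definition: expected annual loss is probable frequency times probable magnitude, and a control is only worth buying if the loss reduction it delivers exceeds what it costs. The numbers are entirely hypothetical and purely for illustration.

```python
# Minimal sketch of the cost/benefit balance described above.
# All numbers are hypothetical and for illustration only.

def expected_annual_loss(loss_event_frequency: float, loss_magnitude: float) -> float:
    """Probable frequency (events per year) times probable magnitude ($ per event)."""
    return loss_event_frequency * loss_magnitude

def control_is_justified(ale_before: float, ale_after: float, annual_control_cost: float) -> bool:
    """A control optimizes cost vs. benefit only if the risk reduction exceeds its cost."""
    return (ale_before - ale_after) > annual_control_cost

# Hypothetical scenario: a loss event every 4 years averaging $200k per event,
# and a $30k/year control expected to cut the event frequency in half.
before = expected_annual_loss(0.25, 200_000)    # $50,000/year
after = expected_annual_loss(0.125, 200_000)    # $25,000/year
print(control_is_justified(before, after, 30_000))  # False -- the control costs more than it saves
```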

(more…)


This entry was posted on Tuesday, April 5th, 2011 at 4:19 pm by Ben Tomhave and is filed under risk management, Tutorial Tuesday.

 

Unless you’ve been living under a rock for the past week, you undoubtedly know that Japan was rocked a few days ago by an 8.9 magnitude earthquake (the 3rd largest in the past decade and top 10 overall – also check out the NYT’s before & after shots) and a subsequent tsunami that vastly compounded the ill effects of the disaster. Coming out of that incident, one of the most hyped “news” items has been the aftermath at the Fukushima nuclear power generation facility. It turns out (unsurprisingly) that much of this coverage has been faulty, inappropriately throwing around talk of “meltdowns” when, in fact, things are under control.

For a great, detailed description of the entire incident, check out Barry Brook’s post “Fukushima Nuclear Accident – a simple and accurate explanation” over on the Brave New Climate blog. It’s an excellent discussion of the accident, and it highlights several salient points that can be directly applied to information security and information risk management (also see this post, which corrects one inaccuracy in Brook’s piece – there is not, in fact, a “core catcher” installed – and provides even greater assurance that things are well in hand).

Specifically, there are five takeaway points to consider:
(more…)


This entry was posted on Monday, March 14th, 2011 at 3:16 pm by Ben Tomhave and is filed under news.

 

News flash: Those so-called “risk” labels/ratings included in pentest and vuln scan reports are NOT actually “risk” representations.

I was in attendance at the OWASP Summit 2011 a couple weeks back, and the topic of “risk metrics” and labels came up during one session. As a result, I led a break-out session on what risk really looks like in the macro sense, in accordance with formal methods, and where these various scan/test results really fit in. The session generated great conversation and highlighted for me the need to expand risk analysis training to a broader audience.

Below is a picture of the taxonomy of factors that make up a FAIR (Factor Analysis of Information Risk) risk analysis. Putting aside the discussion of how one generates the value ranges that go into each factor, let’s look at where pentest/scan results fall. Looking at the taxonomy below, note that there are two key halves: Loss Frequency and Loss Magnitude. As you peruse the factors that roll up to those halves, think about where your pentest/scan results might fit.

[Figure: FAIR taxonomy of risk factors]

In order to properly estimate “risk” based on the results of a pentest or vuln scan, you need to understand the business impact across a number of structured scenarios. Simply understanding the Loss Frequency side is not sufficient to estimate risk. That said, the findings are certainly valuable IF you process them accordingly. The main factor reflected by these test results will be “Resistance Strength,” which is to say that you will now have a better understanding of how much effort would be required to compromise your organization (keeping in mind that, given enough time and resources, compromise is inevitable).
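As a purely illustrative sketch (the severity-to-skill mapping below is hypothetical, not something defined by FAIR or any scanner), one way to turn a pile of pentest findings into a rough Resistance Strength estimate is to treat resistance as only as strong as the easiest confirmed path in:

```python
# Hypothetical mapping from finding severity to the attacker skill percentile
# needed to exploit it; these weights are illustrative, not defined by FAIR.
SEVERITY_TO_SKILL = {
    "critical": 10,   # trivially exploitable: nearly any attacker succeeds
    "high": 30,
    "medium": 60,
    "low": 85,
}

def resistance_strength(finding_severities: list[str]) -> int:
    """Resistance Strength is bounded by the weakest confirmed finding."""
    if not finding_severities:
        return 95     # no confirmed findings: resists all but the most capable attackers
    return min(SEVERITY_TO_SKILL[s] for s in finding_severities)

print(resistance_strength(["medium", "high", "low"]))  # 30 -- the 'high' finding dominates
```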

The next time you get a report from a vendor that talks about “risk,” please challenge their assertion. Ask them to explain how they estimated the financial impact of various weaknesses on your organization. Unless they interviewed your business management team to understand how a weakness could impact the business if exploited, I submit that they’re not providing you with an actual risk rating. Instead, what you’re getting is a snapshot reading of your Resistance Strength in a given context, along with some hints about what the opposing Threat Capability might be. You’re still left needing to estimate the other factors under Loss Frequency, and you almost certainly need to fill in the blanks under Loss Magnitude (something that can be accomplished independently by developing loss tables for key systems, and then either providing those to the 3rd parties performing the assessments or combining them with the results on your own).
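For what that “fill in the blanks” exercise might look like in practice, here is a rough Monte Carlo sketch (illustrative only, not the formal FAIR calculation or any vendor’s implementation) that combines assessment-derived ranges for Threat Event Frequency, Threat Capability, and Resistance Strength with a business-supplied loss table to produce an annualized loss estimate:

```python
# Rough Monte Carlo sketch: illustrative only, not the formal FAIR method.
import random

def simulate_annualized_loss(tef_range, tcap_range, rs_range, loss_range, trials=10_000):
    """Combine assessment-derived ranges with a business loss table into an average annual loss."""
    total = 0.0
    for _ in range(trials):
        tef = random.uniform(*tef_range)          # threat event frequency (attacks/year)
        tcap = random.uniform(*tcap_range)        # threat capability (percentile)
        rs = random.uniform(*rs_range)            # resistance strength (e.g., from pentest findings)
        vulnerable = tcap > rs                    # attack succeeds when capability beats resistance
        lef = tef if vulnerable else 0.0          # loss event frequency
        magnitude = random.uniform(*loss_range)   # per-event loss, from the business loss table
        total += lef * magnitude
    return total / trials

# Hypothetical inputs: 1-4 attacks/year, mid-tier attackers, modest controls, $50k-$500k per event.
print(simulate_annualized_loss((1, 4), (30, 80), (20, 60), (50_000, 500_000)))
```

The point of the exercise is not the specific numbers but the structure: the pentest informs only the resistance side, while the loss table has to come from the business.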

As an aside, it’s worth noting that some second-generation “GRC” platforms are starting to integrate risk analysis capabilities like FAIR, which can be leveraged to merge in scan/pentest results and generate a reasonable risk analysis.


This entry was posted on Tuesday, February 22nd, 2011 at 3:16 pm by Ben Tomhave and is filed under Tutorial Tuesday.