During security assessments, I always check that the client is performing security testing as part of their development process.

This is why: “Apple security blunder exposes Lion login passwords in clear text”

No need to go into details as to what happened here; it’s well-researched in the linked article. However, this is exactly the scenario that development security testing is meant to avoid. A seemingly innocent patch disables or circumvents an important security feature. The results are predictable.

It could be worse, though. Here’s the worst case: the problem isn’t detected. Because the security was included in the original version, and because nobody checked, it is assumed that the security is in place, and successive updates are made, with the security feature in question not working, but everyone assuming it does. And successive patches are built upon the circumvented security. By the time the bug is discovered, fixing it is a gargantuan task.

Fortunately, this incident was caught well before that point. It's still a major breach, though. So if you ever wonder whether security testing is really necessary during development, you can point to this incident and confidently say "Yes."
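What does that testing look like in practice? Here's a minimal sketch of the kind of regression test that catches this class of bug: assert that a security property – in this case, that passwords are never stored in the clear – still holds after every change. The `store_password` and `get_stored_record` functions are hypothetical stand-ins for whatever your system actually does; the point is that the test runs on every build, not just the first release.

```python
import hashlib
import os

# Hypothetical credential store: maps username -> (salt, hash).
_store = {}

def store_password(username, password):
    """Store a salted hash of the password -- never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    _store[username] = (salt, digest)

def get_stored_record(username):
    return _store[username]

# The regression test: if a later patch quietly switches to cleartext
# storage, this fails immediately instead of years down the road.
def test_password_not_stored_in_clear():
    store_password("alice", "hunter2")
    salt, digest = get_stored_record("alice")
    # The literal password must not appear anywhere in the stored record.
    assert b"hunter2" not in salt + digest

test_password_not_stored_in_clear()
```

A test like this costs a few minutes to write once, and it turns "everyone assumes the security is in place" into "the build verifies it on every patch."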

This entry was posted on Monday, May 7th, 2012 at 5:17 pm by Benjamin Hartley and is filed under data protection, risk management, software.


I’ve had some interesting experiences with two companies recently that I’d like to share. We all do business with companies online: we buy from them, we schedule appointments, we put in support requests, and so on. Today, I very seldom use the mail, and don’t shop in person very often. How these businesses treat customer security is interesting. Some places are very technically savvy and have robust, secure online transactions. Being realistic, though, I know that my dentist’s office does not employ a full-time sysadmin. They buy an off-the-shelf customer care solution and hire someone to install it on their website. Sometimes that’s good, sometimes that’s bad…

First was with my mechanic. I like my mechanic – they’ve saved me quite a bit in the past. But they’re notoriously bad about answering the phone. However, they are surprisingly up to date for such a shop. They have a website which allows you to schedule your appointments online, no need to call. That’s great!
Necessarily, this means you need an account on the website. Okay, that makes sense: they track your name, contact information, the kind of car you have, and the car's maintenance history, including mileage. While none of that is particularly incriminating or dangerous in itself, it's not the sort of information I'd like broadcast to the world, either. So it's good that this information is kept in an individual account, not available to others.
However, I admit that I couldn’t recall my password for that account. No problem, I put in the username and requested a password reset. The automated tool asked for my email, which I gave, and it sent me a new password.
Do you see the problem? It wasn’t asking me for my email address to confirm that I should be the recipient of that password. It was asking in order to know where to send the new password. There was no confirmation process; it just sent the password to the address I’d provided.
And that’s how I got into someone else’s account. My first clue was that I don’t own a Mitsubishi. No harm done – I didn’t even get the person’s contact information, I simply figured out my correct account name (I was off by one letter) and logged in properly. But that’s no security at all.

On the flip side, I wanted to get support for a piece of electronics I bought recently. I was looking for a driver for it, and couldn't find anything, so I thought I'd go ahead and contact their support team. In theory, this should be straightforward. In practice, not so much. You have to open an account with the manufacturer – for which you need to own an actual product. That's a bit of an issue: what if I were considering a purchase and wanted to know beforehand whether the driver existed? But I already owned this item, so I went to open the account (and braced myself for all the forthcoming spam, I'm sure). Part of the process involves specifying just which device you own. The item I had wasn't listed, so I made the best match: a similar item with a different model number. Shouldn't make a difference, right?
Oh, but it does. The item I selected is listed, for some reason, as out of warranty. And there I was stuck – I could not make inquiries about an item that was out of warranty.
I’m sure this system reduces needless support requests. In this case it also prevented a real request; I won’t be buying this company’s products in the future.

What can we learn here? Well, two companies, two lessons.

In the first case, make sure your system applies basic security. My mechanic has relatively trivial information on me, sure, but they have some information, and they're not securing it well enough. Confirming a user's identity before resetting a password has been best practice for longer than I can remember. If you're going to bother having individual user accounts, there's no excuse not to treat them with at least some security.
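For the record, the fix is neither expensive nor exotic. Here's a minimal sketch of a confirmed reset flow (the `accounts` store and `send_email` function are hypothetical stand-ins): generate a random, expiring token, send it only to the address already on file, and never let the requester supply the destination.

```python
import secrets
import time

# Hypothetical account store: username -> email address on file.
accounts = {"bhartley": "ben@example.com"}

# Outstanding reset tokens: token -> (username, expiry timestamp).
pending_resets = {}

def send_email(address, body):
    # Stand-in for a real mailer.
    print(f"To {address}: {body}")

def request_reset(username):
    """Send a reset link to the address on file -- not one the requester types in."""
    if username not in accounts:
        return  # Don't reveal whether the account exists.
    token = secrets.token_urlsafe(32)
    pending_resets[token] = (username, time.time() + 3600)  # valid for one hour
    send_email(accounts[username],
               f"Reset link: https://example.com/reset?token={token}")

def complete_reset(token, new_password):
    """Only someone who can read the account's email holds a valid token."""
    entry = pending_resets.pop(token, None)
    if entry is None or time.time() > entry[1]:
        raise ValueError("invalid or expired token")
    username = entry[0]
    return username  # the caller would now hash and store new_password

```

The crucial difference from my mechanic's site: the email address the requester types in is used for nothing. It only proves you control the mailbox already attached to the account.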

In the second case, your security shouldn't get in the way of your business. Sure, it'd be nice if every single contact were authenticated and properly routed, but if you have any reason to deal with the public, that's just not going to happen.

The overall lesson is that even if you’re a small company, your security has to match your needs. An off-the-shelf solution without any thought behind its application won’t do you any good.

This entry was posted on Friday, April 20th, 2012 at 1:23 pm by Benjamin Hartley and is filed under passwords, privacy, rants, risk management.


A service provider shall not be liable for monetary relief, or, except as provided in subsection (j), for injunctive or other equitable relief, for infringement of copyright by reason of the provider’s transmitting, routing, or providing connections for, material through a system or network controlled or operated by or for the service provider, or by reason of the intermediate and transient storage of that material in the course of such transmitting, routing, or providing connections – 17 USC § 512, from the Cornell University Law School

No business is an island. There’s no company that does not, to some extent, rely on other businesses. Business models assume that vendors will be able to assure a steady flow of goods, that retailers will sell goods and pay as contractually bound, that shippers will actually ship goods, etc. Our legal system is filled with assurances to that effect. And this is important, because it gives companies confidence to make such agreements. Knowing that business partners can in fact be bound and trusted to perform their duties, companies can more readily act to grow and increase their revenues. The key component here is confidence – a certainty that once a contract is signed, it will be followed.

That's what makes the MegaUpload case rather disturbing. There's no doubt that MegaUpload was hosting infringing content. But not all of the content was infringing – yet all of it was taken down. The hosting company, Carpathia, is now seeking court action that would allow it to release the existing data back to MegaUpload users.

However, in a way, the damage has already been done. Whatever the outcome of the case itself, one message has been sent clearly: your data can be held hostage by others’ data. That’s sure to have a chilling effect on the hosting industry for years to come.

This entry was posted on Thursday, April 12th, 2012 at 5:40 pm by Benjamin Hartley and is filed under data protection, rants, regulations, risk management, vendors.


Suppose you want to send a letter to your brother. And let’s suppose it’s got some, oh, maybe potentially embarrassing financial information – he owes you some money and you’re having trouble paying the bills.

Obviously, that's not the sort of thing you want to put on a postcard; you'd put that in an envelope. (Your brother is notoriously bad about checking his email, which is why you're mailing a letter at all.)

You want him to know that the letter is actually from you, so you sign it – you have a distinct signature that is very hard to forge. And, on top of that, you want him to know that nobody else read the letter, so you also sign across the fold of the envelope, so it can’t just be put in a new envelope.

So, you’ve done the basic security – it’s authenticated (with your signature), it’s not readable by third parties (because of the envelope) and it’s tamper-evident (because you signed the envelope, too). It’s not the most secure communication possible, but you’ve clearly done due diligence.

So what if I told you people were doing that almost 4000 years ago?

Sealing letters in clay envelopes was standard practice. Sometimes this was for privacy; other times, as with contracts, the text was written on both the inner tablet and the envelope, and both were marked with the personal seals of the signatories. That kept the text of the contract accessible while preserving an unalterable copy in case it was ever disputed.

People have known for millennia that secure communication is crucial to business. We've known the need for privacy, authentication, and tamper evidence for just as long. These aren't new ideas at all.
However, we seem to have a hard time applying them to modern technology. That's the only explanation I can find for why, just yesterday, someone asked me to email a scanned image of a check without any encryption.
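The modern equivalents of the seal and the signature across the fold are cheap and standard. Here's a minimal sketch of the authentication-and-tamper-evidence half, using nothing but Python's standard library. (Confidentiality – the envelope itself – would need an encryption library on top, and the hard-coded shared key is an assumption of the example, standing in for a real key exchange.)

```python
import hmac
import hashlib

# Assumed shared secret -- in real life this comes from a key exchange,
# not a hard-coded constant.
KEY = b"a-secret-shared-with-your-brother"

def seal(message: bytes) -> bytes:
    """Append an HMAC tag: the digital analogue of signing across the fold."""
    tag = hmac.new(KEY, message, hashlib.sha256).digest()
    return message + tag

def verify(sealed: bytes) -> bytes:
    """Check the tag; any tampering with the message breaks it."""
    message, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("seal broken: altered, or not from the key holder")
    return message

letter = seal(b"You owe me $200.")
assert verify(letter) == b"You owe me $200."

# Flip one byte and the seal no longer verifies.
tampered = bytes([letter[0] ^ 1]) + letter[1:]
try:
    verify(tampered)
except ValueError:
    print("tampering detected")
```

Four thousand years after clay envelopes, the equivalent is a dozen lines of code, and we still email checks in the clear.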

This entry was posted on Thursday, April 5th, 2012 at 5:05 pm by Benjamin Hartley and is filed under privacy.


A few years back, I was working as a tech writer for a company which made medical software. We were trying to get an important certification that we’d need to sell our product. And a crucial part of that was good documentation: we had to show how it worked, what it did, how it tracked everything, how it was secure, etc. Well, that’s what you have a tech writer for, so all is good.

It's important to note that I didn't have any existing documentation to work with. There was a wiki which held the developers' notes, but that's it. Nothing by way of formal, hand-it-to-an-outside-entity documentation.

Okay, that’s not too abnormal; tech writing is expensive, and many companies don’t bother with it until an auditor is breathing down their neck. Hardly ideal, but to be expected, and I did have time. So, I set to it.

Since there wasn’t any existing documentation to re-do, I based my organization around the expectations set by the certification. And, a good week before the deadline, I turned in the completed documentation, all 100-something pages of it.

And that’s when disaster struck. The auditors decided they wanted the documentation in a completely different format – they weren’t going to read our documentation, no. They wanted us to fill out a questionnaire. The questionnaire was very comprehensive, encompassing exactly as much material as my documentation covered. And I had less than a week to complete it. I told my boss “No problem.” And I gave him the completed questionnaire in 3 days.


This entry was posted on Tuesday, March 27th, 2012 at 6:29 pm by Benjamin Hartley and is filed under rants, Tutorial Tuesday, Uncategorized.


I recently had the pleasure of performing one of the best security assessments I’ve ever done. It was great: I didn’t find any gaps. Not a one.

To some people, it might come as a surprise that I’d consider that a good assessment. And I’ll admit, it made me a bit suspicious. Nothing? Seriously? Well, I had to look into why, and I’ll get to that in a moment. But let’s cover something else first.

I've been on both sides of the table for security audits. Being audited is Not Fun. You have someone coming in, looking over all your processes, and it's up to you to prove that you're actually doing what you say you're doing, often for reasons that seem terribly arcane or pointless. And the management directive is almost always "make sure we pass this," which is assuredly not the same thing as "make sure we are actually secure." It's a very adversarial relationship.

As the auditor, you're always looking for the places where they're trying to hoodwink you, trying to gloss over something, or just outright lying. You're always suspicious. If you're not when you start, you will be. Because the people you're auditing don't want to be secure – they want to pass the audit. Which is understandable: failure can mean losing their license to operate, losing a major contract (clearly one big enough to bring in an auditor!), and in extreme cases bringing down the company.

It doesn’t have to be that way. As a security analyst, my goal isn’t to find problems. It’s to locate any security gaps that may exist, and where appropriate offer remediation steps.

Aren’t those the same thing, though?

Well, no. As the old saying goes, "seek and ye shall find." I've met many auditors who took delight in writing overwhelmingly negative, scathing reports. They'd pounce on any excuse to fail a control. That might sound informative, at least, but in practice such reports aren't very useful: they give little real sense of an organization's security posture, because they're invariably negative.

The problem is that nobody is really looking at the true purpose of security audits and assessments. Organizations being audited just want to get through the audit. The auditors are trying to “catch” the organization. But security audits aren’t high school tests or witch hunts. The end goal isn’t the report. The end goal is an organization, system, or project with a good security posture and no known gaps.

That’s what made the assessment I did last week so unusual. You see, they were given the standards in advance. They knew exactly what I was looking for – and so they went out of their way to make sure I’d find it. They had purpose-built the space specifically to meet the standards. There was no gotcha, no hidden agenda, no posturing or hiding. I knew they’d set things up to make sure my assessment would be good – and that’s great. It’s the way it should be, and the result was a completely clean assessment.

Of course, there is a risk. Organizations may know what the standards are and then try to pretend to follow them, or look for loopholes. That's where the auditor really comes into play: recognizing when an organization is following the letter but not the spirit of the standard. But the most important thing to remember, for both the auditor and the auditee, is that the goal, ultimately, is security. It's not to play gotcha, and it's not to hide gaps. It's to find and close the gaps that exist.

This entry was posted on Thursday, March 22nd, 2012 at 3:07 pm by Benjamin Hartley and is filed under risk management, standards, Uncategorized.