
A few years ago, a friend of mine served in Afghanistan. It was, as he described it, a long and mostly dull duty. When not busy with soldierly duties, he wrote on his blog and took pictures, often of the rather picturesque – to those who didn’t have to traverse it – scenery. At one point, however, he was informed that these landscape pictures were, in fact, an operational security violation. Not the ones taken in-camp, but the gorgeous panoramas of Afghan mountains and valleys. The theory was that, using those pictures, insurgents could find their position. My friend’s response was succinct: “I think they already know about the mountains, sir.”

In a previous job, I was charged with creating the security documentation for a particular government system, including the disaster recovery plan. That plan necessarily had to include the power requirements for the system. However, with a certain amount of digging, I discovered that by the standards to which I would be held, the simple fact that the servers used either 110V or 220V power was considered “secure unclassified information” and my report would require rather cumbersome treatment. Mind, what put it over the top was not that the servers required 110V, or that the servers required 220V, but simply that the servers might require one or the other. Or, in other words, that the servers required electricity in the same fashion as every other standard server. The bleedingly, patently, absurdly obvious. But that fact was somehow important for security.

There is a certain tendency, with respect to security, to classify, render confidential, or otherwise obscure every piece of information. I cannot count how many times I have heard “we can’t tell you what kind of encryption we use – that would make it insecure!” or some other variant. Indeed, there is a certain value to hiding some seemingly obvious pieces of information – the number of servers, the ports being used, the location of a datacenter in a building. These are not without purpose. There is no sense in making an intruder’s job any easier, and great value in making it as trudgingly difficult and annoying for them as possible.

But this must be tempered with a modicum of sense. In risk assessment terms, this means examining a piece of information and determining what level of risk it exposes. There is no sense in restricting the fact that servers run off of electricity; an intruder knows that – it’s not something that takes much knowledge to figure out. There’s no sense in hiding the fact that a base which is in contact with the local population can see the mountains – the insurgents know that. These are obvious things.

And there’s an important psychological component there. By trying to secure patently obvious things, security by obscurity (already a bad idea) becomes security by absurdity. The very concept of security erodes. Yes, it’s easier to treat all information as secure, but the end users won’t view it that way. What they’ll see – correctly – is a security posture which has run amok and which is not connected to the reality of their work. And they’ll start ignoring it because it’s ridiculous. And then they’ll be ignoring actually sensible security; they’ve lost confidence in the directives and the purpose behind them. And then you have a problem.

The point is to maintain a real connection with the people who have to implement security directives. As I’ve said before, their job is not to keep your infrastructure secure – their job is, well, their job. To keep people following secure processes, they have to be invested. They have to be able to understand why they’re doing these things. You have to acknowledge that they know the mountains are there, and work within that reality.

2 thoughts on “I think they already know about the mountains, sir.”

  1. Carl C. says:

    Um… Not such a good example, the mountains. Yes, they know about the mountains, and that’s what makes pictures of them dangerous.

    By taking several pictures shot from roughly the same spot and correlating the geographic features they show (such as mountain tops) with the known locations of those landmarks on a map, it is possible to triangulate the approximate position from which the pictures were taken.

    In contrast, pictures taken inside the camp that don’t include geographic reference points are only useful once an attacker has already located and infiltrated the camp.

    I agree that “the server requires either 110V or 220V power” can be said of almost every piece of AC-powered electronic hardware on the planet, and that the type of encryption used shouldn’t be particularly sensitive (unless it’s one that’s known to be vulnerable). But there are legitimate reasons why some things that seem like they shouldn’t be sensitive really need to be treated as if they are.

    The real problem is with communicating WHY that information is sensitive: If your friend had been told HOW his pictures of the mountains could be used to determine his unit’s position, he might have understood why they were a threat and held on to them until the camp had moved.

    Keeping a company’s infrastructure secure IS everyone’s job (you don’t want them propping open the doors all night, do you?). The problem is that people aren’t told how the things they’re being asked to do improve security. Without that understanding, they’re more apt to just ignore them, thinking “Seriously, this can’t be that important.”

  2. Benjamin Hartley says:

    Well, in my friend’s case, while their location wasn’t a FOB, it wasn’t exactly a secret either – they were in regular communication with the locals, and had semi-regular engagements with insurgents. The notion that his pictures somehow reduced operational security doesn’t hold up well; you’ve got the reason, but taken in context it’s not a very good one.

    While in many environments it’s true that the listed job functions include “maintain security”, unless an individual is part of the actual security staff, that’s simply tacked on to their actual job functions – the work which requires their presence in the office. From a security perspective, having a door propped open is an obvious problem. From a productivity standpoint, propping open a back door in order to haul in a carload of office supplies can save quite a bit of time compared to carrying the boxes around the building. The security staff will choose to go to the front door and swipe in for each trip to the car; they aren’t accountable for the extra time it takes to unload.

    I think you’ve got the most important point, though – making sure that an explanation exists AND is communicated to those who are affected by the policy. Because they will indeed see it as “Seriously, this can’t be that important.” Beyond that, you need actual buy-in – not simply directives to follow policy, but a conviction on the part of the employee that it truly IS more important to maintain solid security than to make things more convenient (and more productive!). That’s not an easy step, of course.
