Thursday 16 August 2012

Badges of dishonour

Everyone likes a badge - especially if it's a badge of honour. Many of us brandish the CISSP badge. I don't - perhaps I should. My reasons for not doing so are many, but all centre around the simple fact that many people I know who wear this particular badge are, in intellectual terms, on a par with single-cell animals. I could say the same for the CISM qualification - there are individuals who hold this qualification who I wouldn't trust to dress themselves correctly.

I've always regarded any qualification you can scrape through via a week-long 'boot-camp' course as suspect. Boot camps boost short-term memory - what's learnt in this manner normally fades quickly. Even with refresher courses, I think these and similar qualifications test one simple capability - a decent memory. I'm very aware that many (indeed most) holders of these badges are upright, solid and reliable professionals. The badge is not, in my opinion, proof of that - it's what these people do and the changes they manage that are important. Give me experience and proven competence over a badge like these any time.

How then do you test whether someone is competent without spending some length of time working with them? The answer is not simple. Testing competence cannot be done via a multiple-choice tickbox. It can only come via the thorough examination of evidence, and by asking the person claiming competence some direct and tricky questions. The problem is that the person asking the questions, and judging the responses, has to be an expert - someone who is him or herself time-served and competent.

I've always been a keen student of initiatives to 'professionalise the profession', mostly because they are a source of deep amusement to me. However, the UK Government is now seeking to certify Information Assurance specialists using a number of Certification Bodies, or CBs. The CBs are the APM Group, the British Computer Society and the Institute of Information Security Professionals.

The APM Group is the first CB to go live (June 2012), having satisfied CESG (the UK Government body that deals with Information Assurance matters) that its assessment process is appropriate. What pleases me about their approach is that they use assessors who are themselves certified. Their process includes a review of the candidate's CV and of an Evidence Form that draws out experience and capability, backed up by interviews that test that evidence. You can't get that from a tickbox.

It looks as if the UK Government will, at some point, demand that many people involved in government Information Assurance get certified or be denied the chance to practise. If the certification is as rigorous as seems to be the case with the APM Group's approach, we might find ourselves with a model process for assuring our own profession. I'd sooner deal with someone who's been proved to be competent than with someone who's proved he can remember a list of words.

Friday 1 June 2012

Not at the flick of a switch

Complex systems don't behave in simple ways. You can't flick a switch and see an immediate, predictable, mechanical response. So often you hear someone complaining about a cold spell during the summer and making a fatuous statement such as 'well, what's all this global warming about then? Why is it cold and rainy?' The answer is simple: the fatuous idiot is failing to understand the difference between climate and weather.

  • Weather is tactical.
  • Climate is strategic.

In information security terms, weather is similar to an 'awareness programme'. Global warming is 'security culture change'.

You could look at almost any long-term trend and select a section of it that runs counter to the overall direction. That section seems to contradict the long-term picture. So it goes for climate change: if all you remember is a couple of cold summers, then you may well denigrate the current climate change theories. Given the mass of evidence, you would be wrong.
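The point is easy to demonstrate with a toy calculation. Here's a minimal sketch (in Python, with entirely invented numbers): a series that declines steadily over twenty years but is wrapped in noise, and a count of how many twelve-month windows within it appear to be rising.

import random

# Invented illustration: 240 'monthly' readings with a steady downward trend
# buried in noise (think incident counts or temperature anomalies).
random.seed(1)
series = [100 - 0.5 * month + random.gauss(0, 10) for month in range(240)]

def slope(values):
    # Least-squares slope of the values against their index.
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    cov = sum((i - mean_x) * (v - mean_y) for i, v in enumerate(values))
    var = sum((i - mean_x) ** 2 for i in range(n))
    return cov / var

print("Long-term slope:", round(slope(series), 2))  # clearly negative
rising = sum(1 for i in range(240 - 12) if slope(series[i:i + 12]) > 0)
print("12-month windows that appear to be rising:", rising)

With these numbers, a fair proportion of the short windows point the 'wrong' way even though the underlying trend is unambiguous - which is exactly the trap the cold-summer argument falls into.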

That's the trouble with people - they have only a short-term memory. Corporations have a memory span too - it degrades to almost zero after about 20 years, when all the wise old heads who remember the disaster, the terrible event or the remarkable success have retired. In terms of climate change, 20 years is a blip. In terms of security culture, it is just about significant. It takes years to build a culture - it will take years to change it.

Climate change has another feature - it can contain 'tipping points'. These are instances in time when the game changes - the system 'tips' into a new paradigm. For example, it may be the initiation of a new, self-sustaining trend, such as when sea temperatures rise to the point where the oceans can no longer hold CO2, causing a release that in turn increases temperatures. I think that security culture change may have similar characteristics. If we can spot tipping points that would work to our advantage, we need to work towards them.
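A tipping point is easy to caricature in the same spirit. The sketch below (again, made-up numbers) models a quantity that settles at a modest level under a small, steady push, but runs away once a threshold is crossed and a self-reinforcing feedback kicks in.

# Invented illustration of a tipping point: below the threshold, damping wins
# and the level settles; above it, a reinforcing feedback takes over.
def simulate(push, steps=50, threshold=5.0):
    level = 0.0
    for _ in range(steps):
        feedback = 0.3 * level if level > threshold else 0.0
        level += push + feedback - 0.2 * level  # steady push in, natural damping out
    return level

print("small push :", round(simulate(push=0.5), 1))  # settles around 2.5
print("larger push:", round(simulate(push=1.5), 1))  # tips, and keeps on climbing

Nothing about the model is real, but the shape of the behaviour is the point: small, persistent effort below the threshold looks like it's achieving little, and then the same effort suddenly achieves a great deal.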

However, complex systems need a broad approach to change. And we have to realise that there are some things we can change, and many we can't. Climate change provides a further example. It has many driving factors, such as:

  • Changes in the sun's output
  • Changes in the Earth's orbit
  • Natural disasters (volcanoes and so forth) that change the amount of solar radiation received
  • Cyclical changes in the oceans (El Niño, for example)

Even though we humans are pretty handy, there's very little we can do about these. In security terms, there's not a lot we can do about, for example:

  • The global economy
  • Wars
  • The ambitions of nation states
  • The capabilities of organised crime

What we can do needs to be identified. In climate terms, we can limit our output of CO2 via many routes (better public transport, more efficient cars, insulated homes, zero-emission power generation and so forth). In security terms, we need to do something similar. The first thing we should do is the same as the climate change lobby has done: convince people that changing the way we behave in regard to security is essential. We're some of the way down the track here, but really making a difference, and reaching a true tipping point, will take time, and we will suffer constant knock-backs along the way. People will think we're barriers - obstructive and unhelpful. We need to ensure people buy in through persuasion, example and solid evidence. We need the equivalent of eco-warriors, articulate scientists and presentable front-men (and women). We need facts - not horror stories. And we need to do it constantly - again and again. Incidents won't go away, but the long-term trends should go down. If they don't, I'm wrong. I don't think I am.

Tuesday 29 May 2012

Security Cultivation - a primer


The June 2008 Hannigan Report on Data Handling Procedures in UK Government required a number of actions. One that stands out for me states that “Departments should put in place plans to lead and foster a culture that values, protects and uses information for the public good, and monitor progress, as a minimum through standardised Civil Service-wide questions in their people surveys”. What is very apparent is that little has been done to address this fundamental issue.

The development of a security ‘culture’ intrigues me. There are a lot of people (mainly academics and consultants) seeking to attain the intellectual high-ground in this area. This intellectual understanding (if it is indeed so) has not been translated into reality in many parts of government.

One of the prescribed roles in government is that of the Senior Information Risk Owner, or SIRO. Each SIRO is tasked with four main deliverables, as follows:
  • An Information Risk Policy
  • An Information Risk Assessment
  • Advice on the Statement of Internal Controls
  • A Cultural Change Plan

The first three strike me as being straightforward. The final one is not. It causes me to ask the following questions.

What is ‘culture’? Why change it? How do you measure it so you know when to change it, and by how much?

So, it’s Wikipedia to the rescue with their definition(s) of culture:
  • Excellence of taste in the fine arts and humanities, also known as high culture
  • An integrated pattern of human knowledge, belief, and behaviour that depends upon the capacity for symbolic thought and social learning 
  • The set of shared attitudes, values, goals, and practices that characterises an institution, organisation or group

I reckon we’re looking at something that lurks between definitions 2 and 3. All the literature and studies I can find suggest that imposing culture does not work. Culture is not a ‘thing’ in itself - it is the result of many things happening at lower levels within an organisation. To change culture, you need to change the way people interact with each other. Another common thread in the literature is the use of terms I can only describe as ‘horticultural’ - ‘nurture’, plus the already mentioned ‘foster’ and ‘cultivate’. It’s perhaps no coincidence that the Latin root of the word ‘culture’ is ‘cultura’, which in turn stems from ‘colere’ - to cultivate.

This leads on to the inevitable development of a series of horticultural metaphors relating to culture. A gardener seeks to develop an environment wherein things he wants to grow actually do grow. He seeks to discourage or prevent things that he doesn’t want to grow from growing. He wants to keep pests out, to stop them destroying the things he wants to grow. He is trying to provide the right conditions for his plants to do their stuff. He can’t do their stuff for them.

Given that cultural imposition is ineffective (history has too many examples of attempted cultural suppression leading to fierce resistance and failure), if we want to change our organisational culture to one that has the characteristics we want, we have to provide the right conditions. We can (to some degree) secure ourselves against pests - an anti-bird net is a fine metaphor for a firewall, as is a slug pellet. Providing supportive conditions can be equated to providing feed (compost and minerals, for example).

I think that the prime ingredient of a sound security culture is the example set by senior managers. This probably falls under a ‘nurturing’ metaphor, but I know most metaphors fail to withstand close scrutiny and analysis, so I’m not taking it too far! That aside, the concept remains sound. Without the big players walking the talk, you will probably fail. People hate change, and if they see their seniors not doing what they themselves say people should do, they have the best possible reason for not doing it either.

If you want to develop a successful security culture, you need to ensure the top brass act appropriately. They need to know how they should behave. You have to identify those behaviours you consider most appropriate to the security culture you want, and then encourage people to behave that way.

This issue is often made more difficult because managing a cultural change initiative goes beyond the normal bounds associated with information security management. You need to integrate with your HR function, your corporate governance bodies, your trades unions (if you have them) and many others besides. You are also asking people to change, which is one of the hardest things anyone can attempt.

There are some simple tips that make this a little easier. You need to understand what it is you want, and articulate that understanding clearly so that other people understand it too. You need to communicate it well and try, wherever possible, to demonstrate that the change you are asking for brings benefits to those affected by it. You should also ensure a degree of continuity in the change process - if there are familiar elements in the ‘new’, it is likely to be more readily accepted. Unfettered radical change that misses this trick is very hard to accept, mainly because it will feel like imposition - and we know that rarely works. This issue is going to grow and grow. Start cultivating now.