I'm faced with a travel ban. The reason this ban exists is to reduce costs. Very laudable, but I know no one who travels on business on a whim. They travel to meet clients and colleagues. They are professional, trusted people.
To travel in this climate of austerity, I have to get approval from very, very senior people. I've just done some cost calculations. If the average cost of the people involved in getting the approval is about £150,000 per annum, I reckon they cost about £90 an hour (220 working days per year and a 7.5 hour day).
Most trips I do cost about £45. If it takes a cumulative 30 minutes to approve the deal (easily achieved), the cost is doubled.
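The arithmetic above can be sketched in a few lines. This is a rough back-of-envelope calculation using the post's own estimates (£150,000 average cost, 220 days, 7.5-hour days, £45 trips, 30 minutes of cumulative approval time), not real payroll figures:

```python
# Back-of-envelope cost of senior approval for a small travel claim.
# All figures are the post's own estimates.

SALARY = 150_000       # average annual cost of an approver (GBP)
WORKING_DAYS = 220     # working days per year
HOURS_PER_DAY = 7.5    # hours in a working day

# Hourly cost of the people doing the approving (~GBP 90/hour)
hourly_rate = SALARY / (WORKING_DAYS * HOURS_PER_DAY)

trip_cost = 45          # typical trip cost (GBP)
approval_minutes = 30   # cumulative senior time spent on the approval

# Hidden overhead added by the approval process
approval_cost = hourly_rate * approval_minutes / 60
total = trip_cost + approval_cost

print(f"Hourly rate of approvers: £{hourly_rate:.2f}")
print(f"Approval overhead:        £{approval_cost:.2f}")
print(f"True cost of a £{trip_cost} trip:  £{total:.2f}")
```

The overhead alone exceeds the trip cost, which is the post's point: the control costs more than the thing it controls.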
There are other factors. I'm now not trusted to run my own small travel budget. I can't react to a sudden client request for a meeting as approval always takes twice the time you think it will. Therefore the quality of service I provide is reduced accordingly, and I'm a bit grumpy about not being trusted.
The result? Reduced service, reduced morale and double the cost. Well done....
Security is neither art nor science - it's alchemy...
Friday 29 November 2013
Sunday 18 August 2013
Swatting the angry wasps
If you've ever watched a group of 6 year old kids playing football (or soccer if you're North American), you will understand what I mean straight away. They all crowd around the ball, screaming incoherently, all trying to play. Frankly, they're rubbish. They are like a swarm of angry wasps, except that they're slightly less intelligent.
Sometimes you will see, standing off at the side, a clever one. They are 'in space' as the football pundit parlance goes. Give them the ball, and they'll have time to do something constructive. They don't get the ball often, because the angry wasps are keeping it to themselves.
This is the perfect analogy for many organisations. So often, a new initiative is launched, and all the dimwits crowd round it, screaming incoherently. They kick, shove, scream - and are normally ineffective. Sometimes, you have to get yourselves into 'space'. In football parlance, what you need is someone who can 'put their foot on the ball'. These are guys who stop the madness (normally only for short periods), look around, and then pass the ball to someone in space. The greatest footballers are normally those who can put their foot on the ball, look around, and then do something constructive with it.
We security types often behave like 6 year old children. We follow the latest edict or trend, crowd round the ball and scream. It's not just technical issues that get treated like this. It could be something like sales. The cry goes up - 'We need to improve our pipeline' - and we all drop everything. This is where the problem starts. We can't sell without products, and so often, all our effort goes into sales. We can't get better products or services without helping our people improve. Training and education is so often the first item to be struck off the budget - especially when 'we need to improve our pipeline'. We also have to deliver our Business As Usual (BAU) services. This can be forgotten. If your BAU is rubbish, people find out about it pretty quickly. Then they go and tell other people. This makes your sales effort harder.
In the short term, by concentrating solely on sales, we may have successes - improving the pipeline. We will not have improved our products or our people. You cannot sustain sales without improving your people and products. The three pillars of BAU delivery, service/people (or product) improvement and increasing your revenue cannot exist in isolation. They are totally symbiotic. There are other such magic triangles, such as Good, Quick and Cheap. You can only get two of the three at any one time. If it's cheap and quick, it won't be any good. If it's good and quick, it won't be cheap.
In my triangle, you need all three. If you fail to manage one element, in time, you will fail. Remember, sometimes you should put your foot on the ball. My favourite footballers were such men. Graeme Souness was famous for taking the time to stop, look up, and then do something devastating with the ball. The late, great Billy Bremner was another of the same type. What is also notable about the two of them is the fact that they were amongst the most aggressive, no-nonsense players you could imagine. Bremner in particular took no prisoners.
Don't act like a 6 year old kid. Act like a midfield maestro but remember to deliver the occasional kicking when required.
Friday 9 August 2013
Whatʼs in a word? Cyber Security and reality
There are a number of terms currently being used by security practitioners that really annoy me. Threat vector. Threat landscape. The worst is Cybersecurity. What a wonderful word. Its real beauty is that it means whatever you want it to. It is now shortened to ʻcyberʼ - and is used and misused across the world by serious professionals, semi-literate journalists, snake-oil merchants and associated charlatans alike.
Having said this, it has undoubtedly grabbed a lot of attention. Where IT Security and Information Security failed (pretty spectacularly to be honest) - Cybersecurity has flourished. Board members are concerned about ʻcyberʼ. Governments run scared of ʻcyberterroristsʼ. ʻCybercriminalsʼ wait everywhere, desperate to desecrate ʻthe Gridʼ - the basic utilities we all think we need to survive.
Is this real, or is this hype? What has actually changed? The answer is simple. In terms of the basic threats we face, nothing has changed. In terms of risk, the picture is very different.
Letʼs start with a bit of deconstruction. What does ʻcyberʼ mean? The root of the word has become obscured. It derives from the ancient Greek κυβερνήτης (kybernetes) - a steersman or pilot, someone ʻexpert in directionʼ. It can also mean ʻrudderʼ. The root suggests remote control. It did not mean security. It does now.
Now I know that words evolve constantly. You canʼt decide that once defined, a word stays the same in meaning ad infinitum. The word ʻjargonʼ used to mean the “chattering of birds” (from the Old French gargun). It doesnʼt mean that now, even though the ancient definition can be applied to many of the self-aggrandizing security talking heads I sometimes have to deal with. Perhaps this is why my initial deep annoyance with the term ʻcyberʼ is beginning to mellow. It may be a total aberration of the original term, but it has generated something - that being a growing awareness of the risks we all face. It may not be down to the word itself, but its increased use coincides with a real change in the way information risks are perceived.
Western societyʼs reliance on the Internet and dependency on connected systems to manage power, water, traffic, financial services, mineral prospecting, food transport logistics, medical procedures, emergency response services etc etc - make us very vulnerable. Very vulnerable indeed. This is the stuff that grabs attention. Not ʻphishingʼ attacks on individuals to gain banking system logon details. Such things are, on a global scale, an irritation. Nor the defacing of websites. Such defacements normally reflect highly emotional social issues (gay marriage, women clergy, animal rights, privacy matters and so forth) rather than life-threatening circumstances. The real deal is ʻlife and limbʼ, and we have now reached a situation wherein truly critical systems are exposed to remote attack.
This situation is exacerbated by the manner in which these systems are closely connected. Disruption to electric power supplies will disrupt most other systems. Compromising the water supply affects everyone - deeply. Transport, logistics and food supply are closely interlinked. In many western countries, a light dusting of snow can cause basic systems, such as the railway network, to grind to a halt. Itʼs not difficult to extrapolate from this and understand how a targeted attack on utilities could cause significant collateral damage. There are a number of subplots to this. We should, for all sorts of reasons, look to use local resources rather than having them brought in from a distance. Shortened supply chains tend to be more resilient, and easier to repair than lengthy ones. They also generate less carbon. But we donʼt always look to local solutions, and the lack of true operational resilience in western societies will cause real problems if it is not addressed.
Which brings me back to ʻcyberʼ. My deep annoyance at the term is not an isolated instance. Iʼve spoken to lots of other people (from all walks of life) who feel the same. Jargon and hyped terminology are often, for good reason, treated with skepticism and disdain. This is happening to the word ʻcyberʼ and will continue to happen unless we intercede. It is essential that we ensure people understand the true scale of the risks we face. People donʼt look under the bonnet (or under the hood if youʼre American) except when things go wrong. A glimpse under the bonnet of our interconnected society suggests to me that we need to make sure it is capable of withstanding calculated, targeted, malevolent attacks. A scattergun approach when discussing such risks will reduce the overall effectiveness of our communications. We need to keep the snake-oil merchants at bay whilst passing on our message.
So - what is our message? What is ʻcyberʼ? Is it ʻIT Securityʼ? Does ʻcyberʼ enhance or replace ʻInformation Securityʼ? Is there an alternative? Suggestions on this are very welcome! We are at a juncture that will, if we manage things well, help set up resilient systems across society. The longer we allow things to drift, and let the charlatans muddy the waters, the less capable our society will be to manage systemic failures when they happen.
Thursday 16 August 2012
Badges of dishonour
Everyone likes a badge - especially if it's a badge of honour. Many of us brandish the CISSP badge. I don't - perhaps I should. My reasons for not doing so are many, but all centre around the simple fact that many people I know who wear this particular badge are, in intellectual terms, on a par with single-cell animals. I could say the same for the CISM qualification - there are individuals who hold this qualification who I wouldn't trust to dress themselves correctly.
I've always regarded any qualification you can scrape through via a week-long 'boot-camp' course to be suspect. Boot camps boost short term memory - what's learnt in this manner normally fades quickly. Even with refresher courses, I think these and similar qualifications lend themselves to one simple capability - a decent memory. I'm very aware that many (indeed most) holders of these badges are upright, solid and reliable professionals. The badge is not, in my opinion, proof of that - it's what these people do and the changes they manage that are important. Give me experience and proven competence over a badge like these anytime.
How then do you test if someone is competent without spending some length of time working with them? The answer is not simple. Testing competence cannot be done via a multiple choice tickbox. It can only come via the thorough examination of evidence, and asking the person claiming competence some direct and tricky questions. The problem is, the person asking the questions, and judging the responses, has to be an expert - someone who is him or herself time-served and competent.
I've always been a keen student on initiatives to 'professionalise the profession', mostly because they are a source of deep amusement to me. However, the UK Government is now seeking to certify Information Assurance specialists using a number of Certification Bodies, or CBs. The CBs are the APM Group, the British Computer Society and the Institute of Information Security Professionals.
The APM Group is the first CB to go live (June 2012) having satisfied CESG (the UK Government body that deals with Information Assurance matters) that their assessment process is appropriate. What pleases me about their approach is that they use experts who themselves are certified. They have an assessment process that includes review of CVs, review of an Evidence Form that draws out experience and capability - backed up by interviews that test the evidence. You can't get that from a tickbox.
It looks as if the UK Government will, at some point, demand that many people involved in government Information Assurance get certified or be denied the chance to practise. If the certification is as rigorous as the APM Group approach appears to be, we might find ourselves with a model process for the professional assurance of our own discipline. I'd sooner deal with someone who's been proved to be competent than with someone who's proved he can remember a list of words.
Friday 1 June 2012
Not at the flick of a switch
Complex systems don't behave in simple ways. You can't flick a switch and see an immediate, predictable, mechanical response. So often you hear someone complaining about a cold spell during the summer and making a fatuous statement such as 'Well, what's all this global warming about then? Why is it cold and rainy?'. The answer is simple: the fatuous idiot is failing to understand the difference between climate and weather.
- Weather is tactical.
- Climate is strategic.
In information security terms, weather is similar to an 'awareness programme'. Global warming is 'security culture change'.
You could look at many long-term trends and select a section that runs counter to the overall trend. Such a section seems to contradict the long-term picture. So it goes for climate change. If all you remember is a couple of cold summers, then you may well denigrate the current climate change theories. Given the mass of evidence, you would be wrong.
That's the trouble with people - they have only a short-term memory. Corporations have a memory span too - it degrades to almost zero after about 20 years, when all the wise old heads who remember the disaster, the terrible event, or the remarkable success have retired. In terms of climate change, 20 years is a blip. In terms of security culture, it is just about significant. It takes years to build a culture - it will take years to change it.
Climate change has another feature - it can contain 'tipping points'. These are instances in time when the game changes - the system 'tips' into a new paradigm. For example, it may be the initiation of a new, self-sustaining trend, such as when sea temperatures rise to the point where the ocean can no longer hold its CO2; the resulting release in turn increases temperatures further. I think that security culture change may have similar characteristics. If we can spot the tipping points that work to our advantage, we need to work towards them.
However, complex systems need a broad approach to change. And we have to realise that there are some things we can change, and many we can't. Climate change provides a further example. It has many driving factors, such as:
- Changes in the sun's output
- Changes in the Earth's orbit
- Natural disasters (volcanoes and so forth) that change the amount of solar radiation received
- Cyclical changes in the oceans (El Nino for example)
Even though we humans are pretty handy, there's very little we can do about these. In security terms, there's not a lot we can do about, for example:
- The global economy
- Wars
- The ambitions of nation states
- The ability of organised crime
What we can do needs to be identified. In climate terms, we can limit our output of CO2 via many routes (better public transport, more efficient cars, insulated homes, zero-emission power generation and so forth). In security terms, we need to do something similar. The first thing we should do is the same as has been done by the climate change lobby: convince people that changing the way we behave in regard to security is essential. We're some of the way down the track here, but really making a difference, and reaching a true tipping point, will take time, and we will suffer constant knock-backs. People will think we're barriers - obstructive and unhelpful. We need to ensure people buy in through persuasion, example and solid evidence. We need the equivalent of eco-warriors, articulate scientists, presentable front-men (and women). We need facts - not horror stories. We need to do it constantly - again and again. Incidents won't go away, but the long-term trends should go down. If they don't, I'm wrong. I don't think I am.
Tuesday 29 May 2012
Security Cultivation - a primer
The June 2008 Hannigan Report on Data Handling Procedures in UK Government required a number of actions. One that stands out for me states that “Departments should put in place plans to lead and foster a culture that values, protects and uses information for the public good, and monitor progress, as a minimum through standardised Civil Service-wide questions in their people surveys”. What is very apparent is that little has been done to address this fundamental issue.
The development of a security ‘culture’ intrigues me. There are a lot of people (mainly academics and consultants) seeking to attain the intellectual high-ground in this area. This intellectual understanding (if it is indeed so) has not been translated into reality in many parts of government.
One of the prescribed roles in government is that of the Senior Information Risk Owner, or SIRO. Each SIRO is tasked with four main deliverables, as follows:
- An Information Risk Policy
- An Information Risk Assessment
- Advice on the Statement of Internal Controls
- A Cultural Change Plan
The first three strike me as being straightforward. The final one is not. It causes me to ask the following questions.
What is ‘culture’? Why change it? How do you measure it so you know when to change it, and by how much?
So, it’s Wikipedia to the rescue with their definition(s) of culture:
- Excellence of taste in the fine arts and humanities, also known as high culture
- An integrated pattern of human knowledge, belief, and behaviour that depends upon the capacity for symbolic thought and social learning
- The set of shared attitudes, values, goals, and practices that characterises an institution, organisation or group
I reckon we’re looking at something that lurks between definitions 2 and 3. All the literature and studies I can find suggest that imposing culture does not work. Culture is not a ‘thing’ in itself - it is the result of many things happening at lower levels within an organisation. To change culture, you need to change the way people interact with each other. What is also a common thread in the literature is the use of terms I can only describe as ‘horticultural’. Examples include the already mentioned ‘foster’, along with ‘nurture’ and ‘cultivate’. It’s perhaps no coincidence that the Latin root of the word ‘culture’ is ‘cultura’, which in itself stems from the word ‘colere’ - to cultivate.
This leads on to the inevitable development of a series of horticultural metaphors relating to culture. A gardener seeks to develop an environment wherein things he wants to grow actually do grow. He seeks to discourage or prevent things that he doesn’t want to grow from growing. He wants to keep pests out, to stop them destroying the things he wants to grow. He is trying to provide the right conditions for his plants to do their stuff. He can’t do their stuff for them.
Given that cultural imposition is ineffective (history has too many examples of attempted cultural suppression leading to fierce resistance and failure), if we want to change our organisational culture to one that has the characteristics we want, we have to provide the right conditions. We can (to some degree) secure ourselves from pests - an anti-bird net is a fine metaphor for a firewall, as is a slug pellet. Providing safe growing conditions can be equated to providing feed (compost and minerals, for example).
I think that the prime ingredient for a sound security culture is the example set by senior managers. This can probably fall into a ‘nurturing’ metaphor, but I know most metaphors fail to withstand close scrutiny and analysis, so I’m not taking it too far! This aside, the concept remains sound. Without the big players walking the talk, you will probably fail. People hate change, and if they see their seniors not doing what they themselves say people should do, they have the best reason for not doing it as well.
If you want to develop a successful security culture, you need to ensure the top brass act appropriately. They need to know how they should behave. You have to identify those behaviours you consider most appropriate to the security culture you want, and then encourage people to behave that way.
This issue is often made more difficult because managing a cultural change initiative goes beyond the normal bounds associated with information security management. You need to integrate with your HR function, your corporate governance bodies, your trades unions (if you have them) and many others besides. You are also asking people to change, which is one of the hardest things anyone can attempt.
There are some simple tips that make this a little easier. You need to understand what it is you want. You need to articulate this understanding clearly so that other people understand what you want. You need to communicate it clearly, and try, wherever possible, to demonstrate that the change you are asking for brings benefits to those affected by it. You should also ensure a degree of continuity in the change process - if there are elements of the ‘new’ that are familiar, it is likely to be more readily accepted. Unfettered radical change that misses this trick is very hard to accept - mainly because it will feel like imposition, and we know that rarely works. This issue is going to grow and grow. Start cultivating now.