Wednesday, January 19, 2011

Information Security Lessons from the Deepwater Horizon Report

The Economist, a weekly newspaper, ran an article on the Deepwater Horizon report issued last week by the President's commission. The article prompted me to reflect on our approach to information security.
Such self-policing should be an adjunct, not an alternative, to better regulation. The commission wants the regulatory reforms already put into place to be beefed up with the creation of a new, fully independent safety regulator. And it wants that regulator to take a new approach. After the loss in 1980 of the Alexander Kielland, a Norwegian rig, and the explosion in 1988 of Britain’s Piper Alpha platform, which between them claimed almost 300 lives, regulators put a new responsibility on operating companies to go beyond meeting existing standards and demonstrate that in the round their plans had minimised all the relevant risks: to make a positive “safety case” for their proposal.
The commission is keen on this kind of approach, which automatically keeps up with the ever more extreme technology being used—deepwater and ultradeepwater drilling has developed at staggering speed—in a way that setting standards in advance cannot. Such safety cases, it notes, should include well-developed plans for what to do if things go wrong, plans that were signally lacking in the case of Deepwater Horizon.
Have you ever read an article about the state of information security that didn't refer to an arms race with hackers or the challenge of maintaining security amid the flood of new technologies? Policies, standards, and precisely defined IT controls clearly have their places, but I like the emphasis on security planning and on demonstrating that relevant risks have been minimized. When policies and standards are too thin or too narrow, effective governance is very difficult. When they are taken too far in the other direction, their failures are obvious. But where is the sweet spot in the middle?

How many IT projects have you seen where management took 100% of their security requirements straight from policies and standards without appearing to give a moment's consideration to the risks unique to their particular systems, environment, and business use? When such detailed policies and standards didn't exist, how many project managers complained loudly that the security review function was getting in the way and not helping?

My preference has always been for guidelines over standards, for practical reasons. The biggest difference is that the administrative expense of tracking known exceptions to standards rarely seems worth the benefit that tracking provides. Guidelines can fall behind and progress can continue, albeit at the cost of reduced security. When standards fall behind, they interfere with progress and erode the trust that the security function has built with management. Moreover, standards are almost guaranteed to be somewhat outdated by the time they are socialized, approved, published, and adopted.

From detailed configuration guidelines to reusable design patterns, we need to enable strong security architectures by providing examples of what good security can look like. However, compliance is not security. Our communications must always reinforce the idea that management should think first in terms of risk, and that where formal guidance has not yet caught up with a new technology, it is management's responsibility to demonstrate to their auditors, accreditors, and security managers that their planned use of it is safe.
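
To make "detailed configuration guideline" concrete, here is a minimal sketch in Python. The recommended sshd settings and the advisory warn-rather-than-block behavior are my own assumptions for the example, not an established standard:

# Illustrative sketch: a configuration guideline expressed as a reusable check.
# The recommended values below are assumptions for the example, not a mandate.
RECOMMENDED_SSHD_SETTINGS = {
    "PermitRootLogin": "no",
    "PasswordAuthentication": "no",
    "Protocol": "2",
}

def audit_sshd_config(path="/etc/ssh/sshd_config"):
    """Compare an sshd_config file against the guideline and report deviations."""
    observed = {}
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line and not line.startswith("#"):
                parts = line.split(None, 1)
                if len(parts) == 2:
                    observed[parts[0]] = parts[1]
    deviations = [(key, want, observed.get(key))
                  for key, want in RECOMMENDED_SSHD_SETTINGS.items()
                  if observed.get(key) != want]
    # A guideline records the deviation as input to a risk discussion;
    # a hard standard would fail the build or block the deployment here.
    for key, want, actual in deviations:
        print("GUIDELINE DEVIATION: %s is %r, recommended %r" % (key, actual, want))
    return deviations

The point of the sketch is the last comment: the same check can back either a guideline or a standard; the difference is whether a deviation starts a risk conversation or stops the project.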

Tuesday, July 13, 2010

Promise Theory and Blog Update

A new essay on the infrastructure topic that won the poll is coming soon.  In the meantime, I wanted to share this interesting idea: Promise Theory.  I came across it while reading Mark Burgess's USENIX presentation on Cfengine, an open-source configuration management system.

Mark Burgess developed the idea:
"The Promise theory describes policy governed services, in a framework of completely autonomous agents, which assist one another by voluntary cooperation alone."

Thursday, May 27, 2010

Essay: Architecture and the Supply of Talent


Undervaluation of Writing

I agreed with almost everything Ben Tomhave stated about the Undervaluation of Writing (skills for information security professionals).  For context, a for-profit technical school removed the essay-writing requirements that he had contributed to its curriculum.  He closed with the following:
Being able to effectively communicate ideas in writing is vitally important, not just to infosec professionals, but to humanity as a whole. Programs that undervalue and deemphasize these types of skills are dangerous in that they output students who are simply not adequately prepared for the modern work environment.
As I read the post I was conscious of the economic forces conspiring against Mr. Tomhave, but my sympathy with his position did not turn until I finished his closing statements.  Instead of seeing the undervaluation of writing as a risk and waste, I now see an opportunity to refactor. 

Technical Talent Scarcity

The current practice of information security management is still as much art as science.  An organization's culture and internal politics matter as much as, or more than, the technical know-how of its security team when it comes to the security bottom line.  We need exceptionally talented security professionals who possess both the soft and technical skills required to navigate these organizational politics and influence positive security outcomes.  Most technical schools do not produce such talent; most universities don't either.  To understand why, we need to consider the market and look at the economic drivers.

Supply and Demand

Demand for qualified security professionals has exceeded supply for the last decade.  If security professionals only required soft skills, then we would have simply recruited bright salespeople and underemployed English majors and the demand would have been satisfied.  But a used car salesperson isn't going to know how to secure an enterprise even if she manages to convince the CEO to prioritize security spending.  Instead, the talent shortage has been predominantly about the lack of security-specific training and education.  In response, a number of boutique security training organizations were the first to fill the void.  Universities began developing curricula as well, but these programs take a while to get going and even longer to produce their first graduates.  This is exactly the kind of opportunity that for-profit technical schools look for.

As the only profit-seeking entities answerable to public shareholders in a crowded market, these technical schools are motivated to produce graduates quickly and inexpensively.  This means that they teach the required security skills and nothing else.  If the soft skills really are required for these graduates to be successful, the technical schools have three options.  They can (1) raise the admission criteria, (2) build the skills into their curricula, or (3) let the students pursue them on their own.  Because these schools compete primarily with community colleges for students, the first option would increase recruiting costs, and the disrupted supply would reduce output.  The second option, teaching the skills, would reduce production capacity and increase incremental production costs.  The third option has no adverse impact on profit, which makes it the rational choice for the for-profit technical schools.

Quality and Skill

From one perspective, the output of the technical schools is of low quality.  I use the term quality in the sense that these schools produce low-quality versions of the highly skilled information security professionals who are in demand.  And when it comes to the mainstream institutions of higher education, high-quality graduates who are capable of developing high skill are the most we can expect.  The only significant source of talent that is both high quality and high skill is the pool of individuals who are already practicing and able to evidence their competence through the strength of their achievements (this last bit about evaluating competence was mentioned in my previous post and will be handled in a future essay).  We have to figure out how to get more value from the existing supply -- no other source will provide enough.

Refactoring

Where then, you might ask, is the value of training people to use vulnerability scanning tools when they are not even able to communicate above a high school level?  If we refactor our operational processes and security operations centers to create a first-level role that can be successfully performed by the graduates of these technical schools, then we can free up our top talent to spend more time on our most important issues.  And with more precise job descriptions (requirements), the technical schools should be able to provide us with high-quality, low-skilled workers.  To be clear, the vast majority of these individuals are not our future high-quality, high-skilled security thought leaders -- the measures of quality for a low-skilled job are not the same as those for a high-skilled job.  Important quality measures at this level are attention to detail and a clean background check rather than presentation skills and creativity.

In a future post I want to explore this idea further.  Consider the evolution of the PC support technician over the last twenty years and you can see that a similar transition, enabled by ITIL processes and smarter operating systems and hardware, has already occurred.  What do we spend time on today that could be performed well enough by someone less qualified if they had the right processes and newer technology?

Tuesday, May 18, 2010

Are You a Security Architect?

Next week SANS is hosting its first-ever Security Architecture Summit.  According to the NewsBites entry, most people holding the title aren't qualified.
...only about 30% of the people holding those titles have substantial security architecture or engineering knowledge. The rest do not know the key questions that seasoned security architects and engineers ask, they cannot do quick and reliable risk assessments, they do not have models of successful designs nor do they have the examples of failures nor the rest of the body of knowledge that defines an engineer or architect.
The summit is being organized by security leaders from SANS, Cisco, NSA, Eli Lilly, the US Department of State, and Black Hills Information Security.  With such a limited composition, I am curious to see what approach they take and how they scope the effort.  Though they use the term Enterprise Security Architecture three times in the summit overview, the very first panel debate is to discuss what exactly is meant by it.  But that isn't stopping an effort to create a new credential:
The bar for holding those titles is now rising. A consortium of organizations where security architecture matters (you can guess which ones they are) is meeting the last week in May to provide a foundation for the missing body of knowledge and to begin the national consensus building project that will lead to a trusted designation as a security engineer or architect.
Will it be a new certification?  A degree?  Is SABSA participating?