Thursday, November 8, 2018
Top-N List "Frameworks" and Why They Will Fail You
Lately, I’ve noticed a trend of prioritizing security program activities using Top-N security issue lists under the guise of using them as ‘a framework’. While possibly a useful input to decision making, Top-N lists amount to someone else’s risk assessment, one that doesn’t necessarily address your business objectives, operating model, technical environment, and industry specific issues. They are also rarely the type of all-encompassing catalog or taxonomy of considerations that are denoted by “framework.”
I see the overuse of generalized external risk assessments (Top-N “frameworks”) as the result of two issues. First, the widespread poor state of basic cybersecurity hygiene, itself the result of cybersecurity and information technology technical debt, the kind of debt that now requires a defensible approach to being only partially paid down. Second, the failure to use risk assessment in the context of business decision making, supported by communicating technical risks in business risk terms. Not coincidentally, the second issue is the root of the first.
Wednesday, June 13, 2018
Everything Old is New Again, or Why Firewalls Were Always Supplemental
Those I have spoken to who entered the security field in the last 15 years have often expressed the notion of cybersecurity as a network-first endeavor: firewalls, DMZs, perimeter controls, and so on. Those who have been in the field for 30 years or more will likely remember that firewalls were created as a backstop for host security, at a time when protecting hosts from the network was scarcely on the radar of most OS designers.
As the network-first approach continues to show the limits of its usefulness in environments where network boundaries are becoming less definable and where advanced persistent threats are common, it makes sense to advocate for a shift back to the host-centric mindset.
I suspect that this will be widely viewed as cost prohibitive, burdensome, unrealistic, and so on. Most good ideas are characterized this way until effective approaches and efficient methods are developed.
Thursday, November 16, 2017
The Trap of Risk Assessment Tools
Humans understand and respond to narrative innately. Impactful events are explained as stories, successful calls to action are delivered as descriptions of a desired or undesired future, and people make decisions every day through explanations.
We do this because narratives are a natural form for communicating information and insight. Narratives are therefore also a natural form for risk communication. With regard to risk, narratives help listeners visualize and develop internalized models of risk, which in turn represent the truth in a way that numbers alone won't (and can't) for most individuals and most situations.
Much of the current cybersecurity and enterprise risk management world is premised on the idea (or ideal) that risks can be meaningfully summarized in ordinal form, such as “very high likelihood” or “moderate impact.” These English ordinals are then often converted to numbers. Sometimes math is performed, and a magic risk number is derived. Some models even use dollars or dollar ranges in their outputs. However, focusing on getting to the “right number,” and reporting that is focused on numbers, denies the reader the benefit of context. It makes it hard for decision makers to object to the assumptions embodied in the inputs and the calculations. Numbers or dollars alone convey objectivity and authority, even when that is not warranted. The work of “getting to the numbers” behind closed doors only exacerbates this issue: in the best case it systematizes the subtle biases of the assessors, and in the worst case it distorts the model to fit preconceived notions about organizational priorities or to conform to personal risk worldviews. Those using numbers alone to represent risk should proceed with caution.
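To make the reduction concrete, here is a minimal sketch of the ordinal-to-number scoring pattern described above. The labels, numeric mappings, and the multiplication itself are illustrative assumptions, not a reference to any particular methodology or tool:

# A minimal sketch of the ordinal-to-number scoring pattern critiqued above.
# The labels, numeric mappings, and the formula are illustrative assumptions,
# not a reference to any particular methodology or standard.

LIKELIHOOD = {"very low": 1, "low": 2, "moderate": 3, "high": 4, "very high": 5}
IMPACT = {"very low": 1, "low": 2, "moderate": 3, "high": 4, "very high": 5}

def risk_score(likelihood_label: str, impact_label: str) -> int:
    """Convert two English ordinals to numbers and multiply them."""
    return LIKELIHOOD[likelihood_label] * IMPACT[impact_label]

# "Very high likelihood" and "moderate impact" collapse to a single integer.
score = risk_score("very high", "moderate")
print(f"risk score: {score}")  # prints: risk score: 15

# The score carries none of the context behind either label: which threat,
# which asset, which controls were assumed in place, or why the assessor
# chose "very high" over "high". Two very different scenarios can land on
# the same number, which is exactly the reduction discussed above.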
In contrast, risk assessment communication that focuses on narrative and embraces dialogue allows for discoveries and insights by the decision makers, provides an opportunity to question assumptions, and enables sharing and alignment of perspectives. Risk decisions, everyday and tough ones alike, are best born of discussion, and will be for the foreseeable future.
If you are responsible for risk assessment at your organization, don’t fall prey to a tool’s promise, or to the unquestioned illusion of objectivity and certainty that comes from numbers. If you are performing risk assessment where the focus is populating fields in a spreadsheet or application, no matter how advanced it is, you run the risk that the one thing everyone needs to consider gets lost through reductionism. Do the numbers if required, but wrap them in an informative discussion, and design and facilitate that discussion to convey risks in business terms that support business decision making.
Tuesday, October 7, 2014
Risk assessment vs risk analysis, and why it doesn't matter
Over the last decade I have witnessed heated debates about the terms risk assessment and risk analysis. In most cases, the outcome of these debates is not a richer understanding of the risk domain but rather a fruitless exercise in politics and getting (or not getting) along. This got me to thinking about the circumstances under which these and other risk definitions are important and those under which they are not.
On Audience
We could speak the truest sentence ever spoken by using exactly the correct words, but doing so with a non-native speaker visiting a foreign land would be futile. This may sound absurd if we think of foreign tourists, but I have seen security and risk people do this often enough with non-practitioners to cringe. Shouting ‘risk analysis’ over and over is no more effective than shouting ‘go 1.2 miles west’ at a tourist over and over. Hanging your communication hat on others' understanding of your specialized vocabulary is a sure-fire way for your audience to get lost.
I propose that when dealing with audiences who are not risk practitioners you should do as you would with a non-native speaker: don’t expect them to know the nuances of a particular word or phrase and base everything you’re saying on that understanding. Instead, use a greater variety of words, use examples, draw pictures, and use gestures. Keep doing that until it’s apparent that everyone in the room gets it and wants you to move on to the discussion and decisions at hand.
Of course, when communicating with risk peers in your sub-specialty, it is acceptable and necessary to use the terms and concepts appropriate to that sub-specialty.
On Authority
After I drafted this article, I happened to pick up the July 2014 issue of the Risk Analysis Journal. It contains the special series, “Foundational Issues In Risk Analysis”. The first paragraph of the first article, “Foundational Issues in Risk Assessment and Risk Management”, states, in part, “Lack of consensus on even basic terminology and principles, lack of proper scientific support, and justification of many definitions and perspectives lead to an unacceptable situation for operatively managing risk with confidence and success.” This statement comes from authors who are researchers in the field, one of whom is an area editor for the Risk Analysis Journal; in short, knowledgeable people. If that is the situation for a field that had its beginnings in the 1980s, how likely and how important is it that your organization develops the perfect definition for these terms? Probably not very, on either count.
What I have seen work reasonably well is to settle on working terms collectively, under the leadership of the highest-level risk management function in your organization. Yes, that means the terms and principles they propose, and that are ultimately adopted, will not account for the nuances of your specialized risk area, but the alternative is that parts of the organization won’t effectively communicate with one another. That is worse, overall, than being stymied in your effort to translate the details of your specialty into business concerns.
Summary
Pick basic and simple definitions and move forward. In a few years, your organization just might iterate enough to arrive at rigorous and thorough definitions and, more importantly, to achieve an organization-wide understanding. Who knows? The field could settle on formal definitions for basic terms that work across organizations and sub-specialties at about the same time.
Wednesday, August 6, 2014
Top 5 meta-findings from 12 years of security risk assessments in healthcare
My background: I have performed over 150 security risk assessments over the last 12 years, for organizations large and small, and for scopes ranging from an entire enterprise to a single application, system, or vendor. Some of these assessments were completed within a day; some took months.
I’m writing this post in the hopes that:
* it can serve as a useful starting point for dialog within your organization about these issues
* enough people will read this that the prevalence of these findings will decrease over time
* my work performing risk assessment becomes more interesting and challenging over time
* I can remove all these meta-findings from my list 15 years from now
Risk assessments can contain all manner of findings, from high-level policy issues to detailed technical issues. Correcting the meta-findings that follow would significantly improve the effective management of all information security risks:
1. The "risk assessments” performed to date are actually compliance or control assessments. The organization (1) hasn’t complied with the HIPAA Security Rule and Meaningful Use requirements to perform risk assessment, and (2) has skipped the step that forms the fundamental basis for planning, thereby missing opportunities to efficiently use the organization's resources to appropriately and effectively protect patient and/or member data.
2. About 1/3 of the activities that are either universally important to effective security programs or needed to address the organization’s unique environment were overlooked because the consideration started and ended with an interpretation of the HIPAA Security Rule, limited to the more directly worded CFR § 164.308 through 164.312. Specifically, the HIPAA Security Rule was misconstrued and misinterpreted because the entire preamble and CFR § 164.306 (a)(1) through (3) were skipped in the rush to quickly “be compliant.” 1
3. IT, Information Security, Facilities, Materials/Procurement, HR, Audit, and Compliance have distinct perspectives about information security, and these perspectives have not been harmonized, formalized, and agreed to. The organization as a whole lacks a uniform and coordinated approach and is missing a well-considered set of roles and responsibilities.
4. A large portion of the technical issues the organization is experiencing results from processes or architectures that do not exist, are poorly designed or implemented, or are supported by understaffed functions. Technical tools intended to support security are under-utilized or improperly utilized. Much time is spent chasing the specific technical issues that result. The focus should instead be on identifying and correcting the organizational systems, business processes, personal incentives, and misaligned accountabilities that create and perpetuate the technical issues.
5. Employed physicians, nurses, and staff are not supporting security activities and policies because no one has explained, in the language of their professions, how their personal and individual missions can be put in jeopardy. Leaders, physicians with privileges, and sponsoring organizations have decision-making influence on business goals and risks, but in that process information security risks are given too little weight because they are explained in technical terms rather than in business terms.
In future posts, I will tackle some of these issues and provide recommendations for addressing them in your organization.
1 For those not familiar, CFR § 164.306 establishes "risks to ePHI" (not compliance) as the basis for all decision making related to security under the HIPAA Security Rule.