We live in a digital world. The extent to which our lives are exposed to this 'digitization' is increasing exponentially. Whether we like it or not, and whether or not we are involved in information technology, our lives are becoming more and more dependent on 'digits'. Health records, tax records, savings and investment records, records of buying habits; practically everything that affects our lives, including how we are governed, is going digital.
As our lives become increasingly dependent on information systems, the Internet being the most prominent example, there is growing concern about the security and reliability of this fragile infrastructure. [1]
In this digital world, whatever business we are in, we cannot afford to ignore the impact of information security. Historically, computer security was left in the hands of "computer security experts," chiefly technologists whose technical understanding qualified them to shoulder the responsibility of keeping computers and their valuable information safe. With stand-alone computers, the key security issue was how to protect data from being lost, corrupted, or stolen. [1]
But today information security is not just a technological problem, although technology is an important component. For a business it is like any other problem of managing risk and the cost associated with it. As in any domain, security experts can find solutions to address most of the risks (except those of cosmic proportions, such as a tsunami or a starburst) that a person or organization faces. It is a question of the resources you can put behind the security risk and the degree of abstinence or isolation you are willing to suffer. It therefore becomes a managerial issue of identifying the risk areas, their probabilities, their impact, and the cost-benefit ratio of mitigation.
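This cost-benefit framing can be sketched as a toy calculation using annualized loss expectancy (impact multiplied by annual probability of occurrence), a standard simplification in risk management. All figures and function names below are illustrative assumptions, not prescriptions.

```python
# Toy risk-mitigation cost-benefit sketch (illustrative figures only).
# Annualized Loss Expectancy (ALE) = single-loss impact x annual probability.

def ale(impact, annual_probability):
    """Expected annual loss from one risk."""
    return impact * annual_probability

def mitigation_worthwhile(impact, p_before, p_after, annual_cost):
    """A control pays for itself if the ALE it removes exceeds its annual cost."""
    saving = ale(impact, p_before) - ale(impact, p_after)
    return saving > annual_cost, saving

# Hypothetical example: a breach costing 500,000 with a 10% annual chance;
# a control costing 20,000 per year cuts that chance to 2%.
worthwhile, saving = mitigation_worthwhile(500_000, 0.10, 0.02, 20_000)
print(worthwhile, saving)  # True 40000.0
```

The point of the sketch is the managerial trade-off, not the arithmetic: the hard part in practice is estimating the probabilities and impacts, which, as argued below, are not amenable to straightforward computation.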
Organizations optimize themselves to minimize their risk, and understanding those motivations is key to understanding computer security today. However, none of the above elements of risk management is amenable to straightforward computation; each depends heavily on human idiosyncrasies, mental make-up, domain knowledge, and so on.
So when we look at information security management we have to use a larger framework: one that takes into account business compulsions, the nature of people, and the economics of incentives.
The span of managerial responses ranges from apathy, resulting from ignorance or indifference, to paranoia, resulting from ignorance or spinelessness. This post surveys that spectrum broadly to provoke some thought; I do not expect it to be prescriptive or comprehensive.
At one end, some business managers are unable to see risk in the right perspective. Risk triggers of the 'fight or flight' kind are an elementary component of any living organism. But many of the risks modern man is exposed to do not require such a response. This means there is an evolutionary advantage to being able to hold off the reflexive fight-or-flight response while you work out a more sophisticated analysis of the situation and your options for dealing with it. Human beings have a completely different pathway for analyzing risk: the neocortex, a more advanced part of the brain that developed relatively recently and appears only in mammals. It is intelligent and analytic. It can reason. It can make more nuanced trade-offs. It is also much slower. And it is hard for the neocortex to contradict the primal response from the amygdala. [2]
Psychologist Daniel Gilbert has explained this conflict brilliantly: "The brain is a beautifully engineered get-out-of-the-way machine that constantly scans the environment for things out of whose way it should right now get. That's what brains did for several hundred million years—and then, just a few million years ago, the mammalian brain learned a new trick: to predict the timing and location of dangers before they actually happened.
Our ability to duck that which is not yet coming is one of the brain’s most stunning innovations, and we wouldn’t have dental floss or 401(k) plans without it. But this innovation is in the early stages of development. The application that allows us to respond to visible baseballs is ancient and reliable, but the add-on utility that allows us to respond to threats that loom in an unseen future is still in beta testing.”
The above is compounded by what psychologists term the 'optimism bias': we often think that accidents happen only to the other fellow, and end up taking extreme risks. [2] Therefore, unless the manager consciously holds off the primitive response and analyzes the risk, the response may often be far from optimal.
The situation is compounded further when it comes to information security and software products. Most business managers are used to the incentive structure of producing physical goods. If Honda produces a car with a systemic flaw, it is liable; but Microsoft can produce an operating system with multiple systemic flaws per week and not be liable. Software companies have managed to institute a framework that denies them liability for faulty products. [3]
Many business managers do not appreciate this different paradigm of the digital world when they make decisions about information security. With corporate assets increasingly becoming digital, this is a critical issue.
We also see behavior at the other end of the spectrum. The security engineering community, like the environmental science community, has built-in incentives to overstate the problem. This could be a firewall vendor struggling to meet quarterly sales targets, a professor trying to mine the 'cyber-terrorism' industry for grants, or an information security division lobbying for more funds and more power.
I still remember some of the overselling that happened during the Y2K compliance drive. Our consultant wanted certificates from practically every vendor who supplied any electronic goods, certifying that their products were Y2K compliant.
The business manager gets taken in by the scaremongers of digital fraud. In the name of information security he ends up overspending on every latest gadget and every vocal consultant. The human mind tends to overreact to recent and highly visible events, and given the volume of sound bites on information security we are exposed to, we naturally end up overreacting. I still remember our Government grounding the entire A320 fleet of Indian Airlines for a long time after the accident in Bengaluru.
What we need to develop is a balance between these two extremes. As Andrew Odlyzko noted in a paper titled Economics, Psychology, and Sociology of Security, “The natural resilience of human society suggests yet again the natural analogies between biological defense systems and technological ones. An immune system does not provide absolute protection in the face of constantly evolving adversaries, but it provides adequate defense most of the time. In a society composed of people who are unsuited to formally secure systems, the best we can hope to do is to provide “speed bumps” that will reduce the threat of cyber attacks to that we face from more traditional sources.” [4]
In a digital world this balanced view has to be continuously re-balanced, because the environment changes far more quickly than in conventional technology areas. You hardly get any time to relax in the comfort of whatever equilibrium you seem to have achieved.
Summary
The key thoughts from the above discussion can be summarized as below:
1. Information security is not just about technology; it is about the managerial choices that are exercised.
2. Whether a gadget or a process, it should justify its merit. The idea is not to be foolproof, but to find the appropriate balance.
3. Managing this risk should be an inherent part of the total organizational process, not the functional responsibility of an expert group.
4. It is not a one-time or periodic activity. It is a continuous game of 'cops and robbers'.
5. Keep in mind that security should not, and need not, always compromise convenience; when it must, make the trade-off as bearable as possible.
References
1. Kevin J. Soo Hoo, "How Much Is Enough? A Risk Management Approach to Computer Security"
2. Bruce Schneier, "The Psychology of Security", January 21, 2008
3. William Yurcik and David Doss (Illinois State University, Department of Applied Computer Science), "CyberInsurance: A Market Solution to the Internet Security Market Failure"
4. Andrew Odlyzko, "Economics, Psychology, and Sociology of Security"
5. Ross Anderson and Tyler Moore, "Information Security Economics – and Beyond"
6. Ross Anderson, "Why Information Security is Hard – an Economic Perspective"
The references above include sources for specific observations as well as articles that provided ideas for this talk.
This post is extracted from a talk I delivered at a conference held at IIMA.