Anton Chuvakin discusses the known approaches to choosing the level of security for your organization, risk assessment, and finding the balance between effective security practices and the existing budget.

Scary stories of companies gutted by worm attacks, virus outbreaks, malicious hacker intrusions and other information risks are plentiful. Rising exploitation of existing vulnerabilities, combined with the appearance of new ones, presents a new and powerful threat to Internet users (see, for example, the 2001 CSI/FBI Survey, a standard cyber crime statistical reference). The security industry often takes the position that companies should promptly design and implement more and more security, raise its priority within the business and train all computer users in using the multiple layers of enterprise defense. Similar advice is given to small offices, home offices and individuals using the Internet from home. A firewall, or at least a filtering router, is now considered a standard network protective measure for any company, and personal firewalls are recommended for all home users on broadband Internet connections. However, companies will often choose to comply with only a reduced set of these requirements due to business needs.

So far, this sounds like the beginning of a typical security sales pitch. In fact, we are about to investigate the rarely addressed issue of choosing an appropriate level of security for an organization. All organizations are different: they have different needs (including security needs), different resources (human, financial and technological) and, on the human side of the problem, different perceptions of "how much security is enough". This article addresses common approaches to information technology risk management and discusses some of the issues that can simplify the assessment and mitigation of these risks. It summarizes existing knowledge in the area and gives some useful hints on IT risks. It should be noted that by enforcing security we mean maintaining the classic CIA triad: Confidentiality, Integrity, Availability.

Many experts claim that the average company has "too little" security. However, if you ask the CTO, you will sometimes hear that they have "just enough": apparently, their business processes were going along just fine with whatever level of security they had. As someone said, selling security to management is sometimes like asking to pay real money for hypothetical (unlike fire, flood or earthquake) risks. Often, a company chooses to implement security only after a major incident strikes it or its competitors, just as emergency planning was brought to light after September 11.

What are the known approaches to choosing the level of security? Here are a few of the common ones discussed in the security literature.

  • "No security" used to be a viable option for smaller companies, which considered their information assets to be of little interest to attackers. Believing that "we are too small to be attacked" has proved to be inherently flawed logic. The popularity of random IP address scanning tools employed by "script kiddies", the mass deployment of DDoS zombies and the recent rise of IIS worms make this option a perilous choice. Whoever owns a computer does have something of interest to malicious hackers: hard drive space for hosting a "warez" site, processor power for cracking passwords from other systems, or a network connection for attacking third parties. In addition, while the threat of "due diligence" lawsuits has not materialized yet, it is often quoted as a motivation for implementing basic security measures.

  • "As much as management allows" is usually simply too little, too late. After the next well-publicized bug story or security incident, the company decides to implement a certain security measure, such as deploying a firewall. It might or might not help overall enterprise security, since technology safeguards implemented without the foundation of a solid security policy are rarely effective. For example, having a firewall but allowing most of the potentially dangerous protocols through due to "company needs" does not lower the risk by any significant amount.

  • "Qualitative risk assessment", while favored by many in the security industry, is a rather controversial method due to the high degree of uncertainty associated with some IT risks. Several techniques have been developed for setting the appropriate level of security spending based on risk. One way is to find the risk level by multiplying the threat frequency by the loss amount, and then compare the resulting figure with the value of the protected asset. This is sometimes referred to as the Annual Loss Expectancy method and goes back to FIPS publication 65 from 1979. The problem is that for digital risks both the loss frequency and the loss amount are often simply unknown: for some risk factors not enough statistical data has accumulated, some incidents are usually not reported to the authorities, while other risks are too new and unpredictable. Whatever the current state of digital risk assessment, it will be needed when insurance for digital risks finally catches up, since that is how insurance companies operate.

  • "Best practices" is another often-mentioned method of managing a company's information technology risks. It avoids the uncertainty of a formal risk analysis by relying on a commonly accepted baseline for security protection. Projects such as CASPR (Commonly Accepted Security Practices & Recommendations) and GASSP (Generally Accepted System Security Principles) aim to develop a common set of guides covering all areas of security. For example, CASPR organizes its guides under the following topics: Operating System and System Security, Network and Telecom Security, Access Control and Authentication, Infosecurity Management, Disaster Recovery and Business Continuity, Infosecurity Auditing and Assessment, Logging and Monitoring, Application Security, Application and System Development, Computer Operations, Investigations and Forensics, and Cryptography. By summarizing the methods and techniques for handling technology risks from the majority of industry players, documented best practices can effectively protect the company that adopts them from attacks and liability suits. Implementing this approach requires the minimum amount of analysis and planning. The British standard BS 7799, which recently became the international standard ISO 17799, is another example of a best practices guide; a good overview and a short resource list are available on the web. It should be noted that using best practices provides better protection from liability lawsuits than from actual network intrusions. Other disadvantages of this approach are its independence from actual risks, which can lead to both over-securing and under-securing, the lack of information collection on new threats, and difficulties in measuring the efficiency of security procedures.

  • "Scenario analysis approaches" involve the creation of various scenarios in which computer security can be compromised. After a threat scenario is created, the appropriate mitigation procedure should be developed, deployed and tested. Scenarios can also be used to demonstrate how vulnerable the company is to certain threat factors: for example, penetration testing can show that the company is exposed to insider risks and that a methodology to handle them should be built. This approach requires brainstorming as many risks as possible, done by internal security staff and outside security consultants. Unfortunately, not all hazards can ever be brought to light, which limits the effectiveness of the scenario-based approach. If an important, high-probability threat factor is missed, the financial loss can be severe. In brief, under this approach the security program addresses only those risks for which loss scenarios have been developed.

  • Cost-benefit analysis is an attempt to base the choice of security safeguards on the asset cost alone. Clearly, deploying a $10,000 safeguard to protect $1,000 worth of information on a server is not a wise decision no matter what the risk is. Cost-benefit analysis allows drawing a line that limits the price of projected security technologies. The cost of the protected assets can be used to determine the security requirements, which are then compared with the organization's technological and human resources. This method does not involve any risk assessment, so the question of whether the asset is actually at risk is never asked. However, it certainly provides a useful way to start the security infrastructure design.

  • Insuring all risks might become a promising option for companies that cannot afford to design a full-blown security architecture in the near future. It may not be the most effective option, since it simply transfers the assessment work to the insurance company. Most likely, some combination of managed risk and insured risk will eventually become the standard.
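The Annual Loss Expectancy calculation mentioned under qualitative risk assessment can be sketched in a few lines of Python. All figures here are invented for illustration; real ARO and SLE numbers are exactly the statistics that are hard to obtain for digital risks.

```python
# Annual Loss Expectancy (ALE) sketch. All figures are hypothetical
# illustrations, not real incident statistics.

def annual_loss_expectancy(aro: float, sle: float) -> float:
    """ALE = ARO (expected incidents per year) x SLE (loss per incident)."""
    return aro * sle

# Assumed figures for a hypothetical web-server compromise:
aro = 0.5         # one incident expected every two years
sle = 20_000.0    # estimated cleanup and downtime cost per incident

ale = annual_loss_expectancy(aro, sle)
print(f"ALE: ${ale:,.0f} per year")            # ALE: $10,000 per year

# A safeguard is arguably justified only if its annual cost is below
# the ALE reduction it provides.
safeguard_annual_cost = 4_000.0
print("Safeguard justified:", safeguard_annual_cost < ale)   # True
```

The arithmetic is trivial; the hard part, as noted above, is defending the ARO and SLE estimates.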
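The asset-cost screen from the cost-benefit entry can also be written down directly; the dollar amounts below reuse the $10,000-safeguard-for-$1,000-asset example.

```python
# Cost-benefit screen: never spend more on a safeguard than the value
# of the asset it protects (note that no risk assessment is involved).

def passes_cost_screen(safeguard_cost: float, asset_value: float) -> bool:
    """Reject any safeguard priced above the protected asset's value."""
    return safeguard_cost <= asset_value

# A $10,000 safeguard for $1,000 of information fails the screen outright:
print(passes_cost_screen(10_000, 1_000))   # False
print(passes_cost_screen(500, 1_000))      # True
```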

Overall, knowledge of the above methods can help you increase the safety of doing business. Risk assessment is not an esoteric process. Let us apply all of the above methods to a simple IT risk. To those in the know, the example might look simplistic, but it serves as a useful illustration: the site's security administrator is asked to let instant messenger (such as ICQ, AIM or MSN Messenger) communication through the firewall.

The case of no security is trivial: the protocol is allowed and no second thought is given to the issue. Qualitative risk assessment is complicated: there is no statistical data on IM risks to the company and only anecdotal evidence is available. Thus, it is impossible to evaluate the probability and the amount of loss, unless one chooses to trust a risk evaluation based on insufficient data. The best practices approach calls for checking the industry guidelines on allowing IM communication through; most of the guides advise against it unless a clear business need for IM use is established. The scenario-based approach looks for all ways to abuse IM and makes provisions for responding to each of them. The security team might want to demonstrate how IM can be used as an attack vector and then create appropriate safeguards where possible; if some ways of causing loss via IM cannot be handled by security procedures, the software is not allowed. Cost-benefit analysis might determine that the savings from fewer phone calls will outweigh the endangered assets in the engineering department, while for the accounting department the risks might outweigh the benefits. Thus, IM will be allowed in one case and banned in the other.
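The cost-benefit branch of the IM example boils down to a simple comparison. The department names match the example above, but the dollar figures are purely hypothetical:

```python
# Per-department IM decision based on cost-benefit alone. The savings
# and exposure figures are invented for illustration.

def im_decision(annual_savings: float, expected_exposure: float) -> str:
    """Allow IM only where projected savings exceed expected loss exposure."""
    return "allow" if annual_savings > expected_exposure else "ban"

departments = {
    # dept: (savings from fewer phone calls, value of assets put at risk)
    "engineering": (12_000, 3_000),
    "accounting": (2_000, 15_000),
}

for dept, (savings, exposure) in departments.items():
    print(f"{dept}: {im_decision(savings, exposure)} IM")
# engineering: allow IM
# accounting: ban IM
```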

In addition, several important points related to risk strategies should be made.

  • Whatever method of handling risks is implemented, the security policy should relate resources to the corresponding protection measures. Thus, the policy is written after all of the organization's resources are evaluated and the risk assessment (if required by the chosen risk mitigation model) is completed, to establish the guidelines to be followed by the company. The policy should also be reviewed regularly to reflect changes in the enterprise and security environment. After an appropriate level of security is carefully chosen and fixed by the written policy, the problem of convincing users to follow it remains.

  • Another important requirement is to keep current with all security-related information, since new vulnerabilities are found every day. Malware attacks happen with no advance warning and can be initiated by strangers or by disgruntled employees seeking revenge. Malicious hackers need to know only a single vulnerability to get through your defenses, while you have to defend against all of them. This requires following changes in security technology, the IT industry and even the hacker underground. And once you know the enemy, you must still implement a defense strategy.

  • A security awareness program for employees should be a crucial part of any infosec policy. Unless employees are trained in using the deployed security technologies, the effectiveness of those controls will fall apart. In addition, people responsible for critical computing resources should complete more rigorous infosecurity training to understand all of the security implications of their job functions. Enterprise security education should be kept up to date with technology and current best practices. One possible way to ensure an adequate level of cybersecurity knowledge is periodic tests and quizzes based on the current policies and procedures accepted by management.

Finding the balance between effective security practices and the existing budget for security is important for every environment. The above guidelines should send the reader in the right direction.

When a security program is designed, the unfortunate side effects of security measures should also be considered. Overall, most security measures hinder usability to some extent; this is a fact, and so far no workaround has been developed. We divide all "complications" from security into impact on information systems and impact on employees. Some of the computer-related issues are:

  • Antivirus software might degrade the performance of a desktop system, especially on older computers

  • Content filtering may remove non-malicious attachments

  • Firewalls may slow down network communication

  • False positives from IDS and anti-virus heuristic engines create network overhead and undermine employees' confidence in such security software

Employee related security challenges include:

  • A complicated password policy with no user education results in employees writing passwords on the infamous Post-It notes

  • The need to log in to many systems to perform daily job duties results in employees selecting the same password, or several easily guessable passwords, for multiple systems

  • The previous need, combined with short system login timeouts, causes productivity loss due to a constant need to log in

  • Restrictive physical security may result in employees sharing access cards

  • Video surveillance, with no justification, may lower employee morale and degrade trust within an organization

  • Complex remote access procedures may result in some employees installing "black" modems for home access

  • Encrypted email, such as PGP email, is hard to use without some understanding of public key cryptography

Addressing those issues is essential after the security program is designed.

Ideally, security should be user-independent (the security of the system should not depend on end-user decisions), user-transparent (it does not prevent or hinder any authorized action of the user), effective (it stops all unauthorized actions of a legitimate user and all actions of an intruder) and cost-effective (it does not cost more than the protected assets). Security measures should also be flexible enough to reflect the fast-paced and somewhat chaotic environment of the modern infosec threat landscape.

To conclude, too much security can be as much of a problem in some cases as too little. Restrictive and unjustified security measures, especially those not based on a security policy, can lower the productivity of a business's human components and the performance of its technology components. Implementing effective security requires careful design, and a needs analysis and a detailed risk analysis should be done first. Such assessments are then followed by an implementation plan in which organizational communication, policy, maintenance plans, training and deployment are considered, to name a few.

About the Author

Anton Chuvakin, Ph.D. is a Senior Security Analyst with netForensics, a security information management company that provides real-time forensics software solutions. His infosecurity expertise includes network security, firewalling, UNIX hardening, security administration and more.