Showing posts with label Risk Factors. Show all posts

Friday, May 21, 2010

Why have there been so many Natural Catastrophes of late?

The Freakonomics blog reports that Foreign Policy magazine has responded to concerns that we are living in particularly harrowing times, experiencing more than our share of natural disasters. According to FP, we are not. What we actually have is a heightened awareness that these events are occurring, thanks to rapid and prolonged media coverage.

Based on U.S. Geological Survey records dating back to 1900, the Earth experiences 16 major earthquakes per year on average, where a major quake is one of magnitude 7.0 or more. There were only 6 major quakes in 1986 but 32 in 1943. And this year? 6 so far, which puts us on pace for a roughly average year, certainly not an extreme one. There has, however, been an increase in the loss of life from earthquakes (650,000 people last decade) due to the expansion of urban sites into fault zones, another factor which increases media coverage of these tragedies.
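A quick extrapolation of the year-to-date count (my arithmetic, not FP's):

```python
# Project the year-to-date major quake count (magnitude >= 7.0) to a full
# year. Figures from the post: 16/year long-run average, 6 majors by May 21.
AVERAGE_PER_YEAR = 16
observed = 6
day_of_year = 141  # May 21 in a non-leap year

projected = observed * 365 / day_of_year
print(round(projected, 1))  # 15.5
```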

Tuesday, February 23, 2010

Major Risks in the IT Industry

Researchers in the Actuarial and Insurance department at the University of Wisconsin conducted a study of risk terminology in 2007. The study compared notions and definitions of various risk terms across several sectors and industries, including Information Technology (IT). The IT respondents listed the following major risks for their industry:

[Image: major risks listed by IT industry respondents]

The researchers noted that the IT sector had the largest number of risks. By way of comparison, the major risks for the energy industry looked like this:

[Image: major risks listed by energy industry respondents]

Notice that IT Failure risk is on the list but very much towards the bottom.

Wednesday, February 4, 2009

Financial Cyber Risk Guide from ANSI

In October last year ANSI released a new guide addressing the financial impact of cyber risks. From the title you might expect lengthy calculations for costing cyber risks, but in fact the document is largely a set of questions designed to create a dialogue around cyber risks. This is not a consolation prize. I have written a short summary of the document, which you can read on Scribd below. You can also read a quick review from the Security4all blog.

ANSI approach to the financial impact of cyber risk

Tuesday, December 2, 2008

One More Risk Profile Graph

I recently posted a collection of risk graphs that I found through Google image search. There was one graph that I wanted to include but could not find again until this morning. It was produced as part of a Dutch study on work-related stress in the police force. The study took the approach of identifying the main risk factors in workers' psychological profiles that impact work-related stress. The risk profile below shows a tornado graph derived from interviewing several thousand workers in 1999 and then again in 2005.

[Image: tornado graph of risk factors for work-related stress]

Actions were taken to reduce the most significant risk factors (rated as unfavourable on the right), which included work satisfaction, intention to leave the job, relations at work, feedback and quantitative job demands (overwork?). On the other hand, some already favourable risk factors were improved further.

The graph is neither colourful nor visually striking (easy to fix) yet I like the representation in terms of risk factors. In fact I now believe that identifying and rating the main contributing risk factors is one of the best approaches to risk analysis. I see risk factors as the basic variables in a risk model that need not be instantiated further. One could attempt to quantify and combine the risk factors above, but in my experience this exercise would prove difficult to justify and likely to be of little additional value beyond identification of the risk factors themselves.

I recently posted on risk factors for identifying malware, based on a patent application for risk-based scanning by Kaspersky. Though many people disagreed that the patent would be useful, again it was the risk factor decomposition that interested me. In many instances of IT risk, the mere process of identifying and rating risk factors will bring the most value.


Monday, November 17, 2008

Examples of Risk Profile Graphs

I was looking through the current set of risk presentations on SlideShare and found an interesting one called Creating Risk Profile Graphs, adapted from an article on managing project risks by Mike Griffiths over at the Leading Answers blog. The approach taken is to define and rate project risk factors and then plot their development over time. The graph below shows the profile of 10 technology risks over 4 months, charted in Excel as "stacked" area graphs. There is another view of the same risk factors here.

[Image: stacked area graph of 10 technology risk factors over 4 months]

The risk factors measured include JDBC driver performance, calling Oracle stored procedures via a web service, and legacy system stability. The project manager rates each risk factor monthly for severity and probability to produce a numerical measure of the risk factor. Plotted over time we see 4 persistent project risks (the steel and light blue, tan and pink plots).
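The scoring behind such a profile can be sketched in a few lines. The factor names echo the article, but the monthly ratings below are invented for illustration:

```python
# Each month the project manager rates every risk factor for probability
# and severity (say on 1-5 scales); the plotted value is their product.
# (probability, severity) pairs per month -- illustrative values only.
monthly_ratings = {
    "JDBC driver performance": [(3, 4), (3, 3), (2, 3), (2, 2)],
    "Legacy system stability": [(4, 4), (4, 4), (3, 4), (3, 4)],
}

risk_profiles = {
    factor: [p * s for p, s in months]
    for factor, months in monthly_ratings.items()
}
print(risk_profiles["JDBC driver performance"])  # [12, 9, 6, 4] -- a declining risk
```

Stacking these per-factor series over time is what produces the area graph above.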

I was not sure if this was a good way to create a risk profile, so I did a Google image search to see what else was out there. The results were very interesting, and reflect the varying approaches risk managers use to present their results.

This first risk graph really leapt off the search results page, if for no other reason than the prominent pink silhouette of Dilbert's manager in the middle. I must confess that the meaning of the graph was not apparent, but luckily it's part of an article called Risk Management for Dummies, where it is explained as a depiction of the peaks and troughs in the risk exposure pipeline. The graph is tracking the cost of risks over time.

[Image: risk exposure pipeline graph]

The next profile is a common, if not traditional, format for risks, taken from the integrated risk management guide of the Canadian Fisheries and Oceans department. This is also called a risk rating table, risk map or risk matrix. Likelihood is represented on the horizontal axis and impact (severity) on the vertical axis.

[Image: 5 x 5 risk rating table]

In the first two graphs these two dimensions were combined into a single risk measure (vertical) and then plotted over time (horizontal). The graph above has no representation of time; risks are rated against their current status. The red, yellow and green regions in the table correspond to high, medium and low risk respectively, and the exact choice of the boundaries is flexible but normally fixed across a company. The table is 5 x 5, but it is also common to have 3 x 3, 4 x 4, 6 x 4, and so on. Below is a 3 x 3 variant of the risk rating table with the high risk region moved to the upper left corner rather than the traditional upper right corner. Note that there is an additional qualitative region and that the regions do not respect the table format. This one is also about fish and oceans, and also from those Canadians, but part of another guide.

[Image: 3 x 3 risk rating table variant]
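The colour lookup in such a rating table can be sketched as a simple function. The score boundaries below are illustrative only, since as noted each company fixes its own:

```python
# Map a (likelihood, impact) pair, each rated 1-5, to a colour region of
# a 5 x 5 risk rating table. Boundaries are made-up example values.
def rate(likelihood: int, impact: int) -> str:
    score = likelihood * impact
    if score >= 15:
        return "red"      # high risk
    if score >= 6:
        return "yellow"   # medium risk
    return "green"        # low risk

print(rate(5, 5))  # red
print(rate(2, 4))  # yellow
print(rate(1, 3))  # green
```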

The risk rating or profile table below is quite close to the format used in my company; it is taken from a recent annual report of a healthcare company. Again there are the two dimensions of likelihood and severity. There are 6 risks in the table, each represented by a current rating and a target rating after mitigation actions have been performed (the arrows show the improvement of each risk). Note that the mitigation of risk 6 yields no improvement, a reasonably common occurrence in practice.

[Image: risk profile table with current and target ratings]

Note also that the red, yellow and green have been replaced by different shades of a single colour, blue. The colour red has such a strong association with danger that using it in a risk profile is often counterproductive, since all attention is drawn to the red region; shades of blue still convey the relative profiles of the risks without that distraction.

The next risk profile is more graphic than graph, communicating that concentrating your portfolio in shares is high risk while short-term deposits are low risk. It is taken from an investment article in a newspaper. This graph(ic) is not very sophisticated, but it does get its main point across.

[Image: investment risk graphic]

The final profile is more quantitative. The graph rates probability against result (impact), which can be positive (upside), neutral or negative (downside). It is from a post on the risk of global warming. In this case the upside seems rather limited, while the negative impact tail extends much further, so the profile of this risk is dominated by its downside. A fat downside tail like this is hard to capture in a rating table, for example.

[Image: probability vs result profile for global warming risk]
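A toy calculation shows why a fat downside tail dominates such a profile. All numbers below are invented:

```python
# Outcomes in arbitrary "impact" units with their probabilities. The upside
# is bounded while the downside has a long tail, so the expected result is
# strongly negative even though most probability mass sits near zero.
outcomes = {
    +2:   0.20,   # modest upside
    0:    0.50,   # neutral
    -2:   0.20,   # modest downside
    -20:  0.08,   # severe downside
    -100: 0.02,   # fat-tail catastrophe
}

expected = sum(result * p for result, p in outcomes.items())
print(expected)  # -3.6: the low-probability tail dominates
```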


Wednesday, October 22, 2008

Risk Factors for AV Scanning

If you work in a large company then you are probably familiar with the all-too-regular process of your desktop AV software performing a scheduled full disk scan. This may happen a few times a month, and during the scan (which may last a few hours) you typically experience quite sluggish performance. You may also get the sinking feeling that most files are being needlessly scanned (again). Kaspersky, a security software vendor whose products include AV, gets that feeling as well. Treating all files on your desktop as equally likely to contain malware is wasteful, but without any criteria to discern less-likely from more-likely malware candidates, the current regime remains. Kaspersky observes that users are only willing to wait a few seconds at the outside for AV to perform its checks, and typically AV scans are limited to what can be done in these windows of user-defined expectations.

Kaspersky has decided to make AV scanning more efficient not by making it faster but by doing less, as determined by risk-based criteria, and they were recently issued a US patent for this approach. If you have been looking for a simple example to highlight the difference between traditional security and risk-based security, your search is over.

Be warned that the patent is repetitive and not very clearly written, in keeping with the style of such documents. Patents are lawyers' solution to the legal optimisation problem of being both vague (to support the broadest claims) and specific (giving details in an embodiment that demonstrates the invention). Also towards the end, beginning with the paragraph describing FIG. 2, you can read the convoluted legalese required to define a "computer" and a "network".

Trading Risk against Speed

The purpose of the patent is "balancing relatively quick (but less thorough) anti-malware checks with more thorough, but also more time-consuming, anti-malware checks". The basic approach is to employ different scanning strategies for known files that have been previously scanned, and new files whose status is unknown to the AV software. Known files will receive a quick signature-based scan, while unknown files may be subject to more detailed scans.

When unknown executable files are first launched, a risk assessment is performed to determine the appropriate level of scanning. The risk assessment evaluates a collection of risk factors to produce a metric that determines whether the file is rated as having a high, medium or low risk of containing malware. This rating in turn determines the thoroughness of the scan to be performed on the file. Options for a more detailed anti-malware scan include heuristic analysis, emulating file execution (in an isolated environment, stepping through particular instructions) or statistical analysis of instruction patterns. Beyond local checks, the AV software may also consult online scanning services for additional information. Employing these more sophisticated scanning methods increases the rate of detection at a cost of additional processing time.

Risk Factors

The patent provides some example risk factors for the purpose of satisfying the embodiment requirement of the invention. While the risk factors are intended only as examples, they are interesting nonetheless.

Online Status

The patent mentions that it can take between 15 minutes and 2 hours to update local AV databases when new malware appears, so it is suggested to contact an AV server directly to obtain the latest information available. If the executable is sitting on a blacklist then it is likely to be malware (that's why it's on the list), and if it's on a whitelist then the likelihood of malware being present is low.

File Origin

If the origin of the file is a storage medium such as a CD or DVD, then it is less likely to contain malware than if the software was distributed over the internet. Email attachments are always suspicious. For downloaded files, the URL source of the download should be considered to determine whether the origin is a suspicious web site or P2P network.

File Compression

Malware is now commonly compressed (packed) to thwart signature-based methods of virus detection. Packed files should be treated as more suspicious than unpacked (plain) files (see here for a general discussion of how malware authors use packing to hide their payloads).
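As an aside, one common practical heuristic for spotting packed files (a standard technique, not one the patent specifies) is byte entropy: compressed or encrypted data looks nearly random, while plain executables do not.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: 0.0 for constant data, up to 8.0."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_packed(data: bytes) -> bool:
    # 7.0 bits/byte is a rough rule-of-thumb threshold, not a patent value
    return byte_entropy(data) > 7.0

print(looks_packed(b"MZ" + b"\x00" * 4096))  # False: mostly zero padding
print(looks_packed(os.urandom(4096)))        # True: random bytes, ~8 bits/byte
```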

File Location

The current location and/or path to the file can also be considered, since some executable files install themselves in particular directories, especially those that are infrequently used. For example, the Temporary Internet Files folder is higher risk than the My Documents folder.

File Size

Relatively small executable files are more suspicious than large ones, since propagating malware does not want to draw attention to itself by transferring large files. Sending a large number of emails with a relatively small attachment is much more practical. Kaspersky states that files sent out in this manner are on the order of 50-100 kilobytes (which, if packed, reduces to something on the order of 20-50 kilobytes).

Installer File

Malware often propagates by sending out small installer files that, when executed, trigger a process of downloading a much larger malware payload from a web server or file server on the Internet.

Digital Signature

Files that are digitally signed are less likely to contain malware than unsigned files.

Surprisingly the patent also mentions the possibility of alerting the user with a popup that gives them the option to skip or minimize the scan of the unknown file. The patent states that "as yet a further option, the user can manually choose to run some of the anti-virus scans in the background after the new software has been launched, but not necessarily the entire spectrum of available technologies, which obviously increases the risk that a virus can infect the computer" (italics added).

Risk supporting Business

Kaspersky's idea is to vary the sophistication of malware scanning based on well-defined risk factors. Presumably they have a sufficiently large data set on these risk factors to facilitate a transformation into hard (numeric) decision criteria. The patent does not describe how the various risk factors will be combined to produce a risk decision, but much tuning will be required.
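A sketch of what such a combination might look like. The weights, thresholds and boolean encoding of the factors below are entirely my guess, not from the patent:

```python
# Hypothetical weighted-sum combination of the patent's example risk
# factors. All weights and thresholds are invented and would need the
# kind of tuning (and data) discussed above.
WEIGHTS = {
    "on_blacklist":      10,
    "downloaded":         3,
    "packed":             3,
    "suspicious_folder":  2,
    "small_executable":   2,
    "digitally_signed":  -4,   # signed files are less likely to be malware
}

def scan_level(factors: dict) -> str:
    score = sum(w for name, w in WEIGHTS.items() if factors.get(name))
    if score >= 8:
        return "full"       # high risk: emulation, statistical analysis, online lookup
    if score >= 4:
        return "heuristic"  # medium risk: heuristic analysis
    return "signature"      # low risk: quick signature-only scan

print(scan_level({"on_blacklist": True}))                # full
print(scan_level({"downloaded": True, "packed": True}))  # heuristic
print(scan_level({"digitally_signed": True}))            # signature
```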

Note that the purpose of the patent is not to reduce the risk of being infected by malware but to provide (security) risk support for the decision to improve user response times by reducing the overall effort for malware scanning. And this is clearly the task of the IT Security Risk function - balancing business and security requirements using risk analysis.

But to be clear, we don't get something for nothing: the likelihood of being infected by malware will actually increase, simply because less scanning will be done and the risk factors will not correlate perfectly with the presence of malware. And this is the next important responsibility of the IT Security Risk function - to make the business aware of the residual risk in following the Kaspersky approach and to get acceptance for this risk. Hopefully Kaspersky will provide some data to help here.

I posed the question on LinkedIn whether people would support deploying this type of AV, and the resounding answer was no.
