Risk
Risk is the potential of gaining or losing something of value.[1] Values (such as physical health, social status, emotional well-being or financial wealth) can be gained or lost when taking risk resulting from a given action or inaction, foreseen or unforeseen. Risk can also be defined as the intentional interaction with uncertainty. Uncertainty is a potential, unpredictable, and uncontrollable outcome; risk is a consequence of action taken in spite of uncertainty.[2]
Risk perception is the subjective judgment people make about the severity and probability of a risk, and may vary person to person. Any human endeavor carries some risk, but some are much riskier than others.[3]
Contents
- 1 Definitions
- 2 Practice areas
- 2.1 Economic risk
- 2.2 Health
- 2.3 Health, safety, and environment
- 2.4 Information technology and information security
- 2.5 Insurance
- 2.6 Business and management
- 2.7 In human services
- 2.8 High reliability organizations (HROs)
- 2.9 Finance
- 2.10 Security
- 2.11 Human factors
- 2.12 Psychology of risk taking
- 2.13 Maintenance
- 3 Risk assessment and analysis
- 4 Anxiety, risk and decision making
- 5 Risk in auditing
- 6 List of related books
- 7 See also
- 8 References
- 9 Bibliography
- 10 External links
Definitions
The Oxford English Dictionary cites the earliest use of the word in English (in the spelling of risque) as of 1621, and the spelling as risk from 1655. It defines risk as:
(Exposure to) the possibility of loss, injury, or other adverse or unwelcome circumstance; a chance or situation involving such a possibility.[4]
- Risk is an uncertain event or condition that, if it occurs, has an effect on at least one [project] objective. (This definition, using project terminology, is easily made universal by removing references to projects).[5]
- The probability of something happening multiplied by the resulting cost or benefit if it does. (This concept is more properly known as the 'Expectation Value' or 'Risk Factor' and is used to compare levels of risk)
- The probability or threat of quantifiable damage, injury, liability, loss, or any other negative occurrence that is caused by external or internal vulnerabilities, and that may be avoided through preemptive action.
- Finance: The possibility that an actual return on an investment will be lower than the expected return.
- Insurance: A situation where the probability of a variable (such as burning down of a building) is known but when a mode of occurrence or the actual value of the occurrence (whether the fire will occur at a particular property) is not.(Reference needed) A risk is not an uncertainty (where neither the probability nor the mode of occurrence is known), a peril (cause of loss), or a hazard (something that makes the occurrence of a peril more likely or more severe).
- Securities trading: The probability of a loss or drop in value. Trading risk is divided into two general categories: (1) Systematic risk affects all securities in the same class and is linked to the overall capital-market system and therefore cannot be eliminated by diversification. Also called market risk. (2) Non-systematic risk is any risk that isn't market-related. Also called non-market risk, extra-market risk or diversifiable risk.
- Workplace: Product of the consequence and probability of a hazardous event or phenomenon. For example, the risk of developing cancer is estimated as the incremental probability of developing cancer over a lifetime as a result of exposure to potential carcinogens (cancer-causing substances).
International Organization for Standardization
The ISO 31000 (2009) / ISO Guide 73:2002 definition of risk is the 'effect of uncertainty on objectives'. In this definition, uncertainties include events (which may or may not happen) and uncertainties caused by ambiguity or a lack of information. It also includes both negative and positive impacts on objectives. Many definitions of risk exist in common usage; however, this definition was developed by an international committee representing over 30 countries and is based on the input of several thousand subject matter experts.
Other
Very different approaches to risk management are taken in different fields, e.g. "Risk is the unwanted subset of a set of uncertain outcomes" (Cornelius Keating).
- Risk can be seen as relating to the probability of uncertain future events.[6] For example, according to factor analysis of information risk, risk is:[6] the probable frequency and probable magnitude of future loss. In computer science this definition is used by The Open Group.[7]
- OHSAS (Occupational Health & Safety Advisory Services) defines risk as the combination of the probability of a hazard resulting in an adverse event, and the severity of the event.[8]
- In information security, risk is defined as "the potential that a given threat will exploit vulnerabilities of an asset or group of assets and thereby cause harm to the organization".[9]
- Financial risk is often defined as the unpredictable variability or volatility of returns, and this would include both potential better-than-expected and worse-than-expected returns. References to negative risk below should be read as also applying to positive impacts or opportunity (e.g. for "loss" read "loss or gain") unless the context precludes this interpretation.
The related terms "threat" and "hazard" are often used to mean something that could cause harm.
Practice areas
Risk is ubiquitous in all areas of life and risk management is something that we all must do, whether we are managing a major organization or simply crossing the road. When describing risk, however, it is convenient to consider that risk practitioners operate in some specific practice areas.
Economic risk
Economic risks can be manifested in lower incomes or higher expenditures than expected. The causes can be many, for instance: a rise in the price of raw materials, missed deadlines for construction of a new operating facility, disruptions in a production process, the emergence of a serious competitor in the market, the loss of key personnel, a change of political regime, or natural disasters.
Health
Risks in personal health may be reduced by primary prevention actions that decrease early causes of illness or by secondary prevention actions after a person has clearly measured clinical signs or symptoms recognized as risk factors. Tertiary prevention reduces the negative impact of an already established disease by restoring function and reducing disease-related complications. Ethical medical practice requires careful discussion of risk factors with individual patients to obtain informed consent for secondary and tertiary prevention efforts, whereas public health efforts in primary prevention require education of the entire population at risk. In each case, careful communication about risk factors, likely outcomes and certainty must distinguish between causal events that must be decreased and associated events that may be merely consequences rather than causes.
In epidemiology, the lifetime risk of an effect is the cumulative incidence, also called incidence proportion over an entire lifetime.[10]
Health, safety, and environment
Health, safety, and environment (HSE) are separate practice areas; however, they are often linked. The separation typically reflects organizational management structures, but there are strong links among the disciplines. One of the strongest links is that a single risk event may have impacts in all three areas, albeit over differing timescales. For example, the uncontrolled release of radiation or a toxic chemical may have immediate short-term safety consequences, more protracted health impacts, and much longer-term environmental impacts. Events such as Chernobyl, for example, caused immediate deaths, and in the longer term, deaths from cancers, and left a lasting environmental impact leading to birth defects, impacts on wildlife, etc.
Over time, a form of risk analysis called environmental risk analysis has developed. Environmental risk analysis is a field of study that attempts to understand events and activities that bring risk to human health or the environment.[11]
Human health and environmental risk is the likelihood of an adverse outcome (See adverse outcome pathway). As such, risk is a function of hazard and exposure. Hazard is the intrinsic danger or harm that is posed, e.g. the toxicity of a chemical compound. Exposure is the likely contact with that hazard. Therefore, the risk of even a very hazardous substance approaches zero as the exposure nears zero, given a person's (or other organism's) biological makeup, activities and location (See exposome).[12]
Information technology and information security
Main article: IT risk
Information technology risk, or IT risk, IT-related risk, is a risk related to information technology. This relatively new term was developed as a result of an increasing awareness that information security is simply one facet of a multitude of risks that are relevant to IT and the real-world processes it supports.
The increasing dependence of modern society on information and computer networks (in both private and public sectors, including the military)[13][14][15] has led to new terms like IT risk and Cyberwarfare.
Main articles: Information assurance and Information security
Information security means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction.[16] Information security grew out of the practices and procedures of computer security.
Information security has grown into information assurance (IA), i.e. the practice of managing risks related to the use, processing, storage, and transmission of information or data and the systems and processes used for those purposes. While focused dominantly on information in digital form, the full range of IA encompasses not only digital but also analog or physical form. Information assurance is interdisciplinary and draws from multiple fields, including accounting, fraud examination, forensic science, management science, systems engineering, security engineering, and criminology, in addition to computer science.
So, IT risk is narrowly focused on computer security, while information security extends to risks related to other forms of information (paper, microfilm). Information assurance risks include those related to the consistency of the business information stored in IT systems with the information stored by other means, and the relevant business consequences.
Insurance
Insurance is a risk treatment option which involves risk sharing. It can be considered as a form of contingent capital and is akin to purchasing an option in which the buyer pays a small premium to be protected from a potential large loss.
Insurance risk is often taken by insurance companies, who then bear a pool of risks including market risk, credit risk, operational risk, interest rate risk, mortality risk, longevity risks, etc.[17]
Business and management
Means of assessing risk vary widely between professions. Indeed, they may define these professions; for example, a doctor manages medical risk, while a civil engineer manages risk of structural failure. A professional code of ethics is usually focused on risk assessment and mitigation (by the professional on behalf of client, public, society or life in general).
In the workplace, incidental and inherent risks exist. Incidental risks are those that occur naturally in the business but are not part of the core of the business. Inherent risks have a negative effect on the operating profit of the business.
In human services
The experience of many people who rely on human services for support is that 'risk' is often used as a reason to prevent them from gaining further independence or fully accessing the community, and that these services are often unnecessarily risk averse.[18] "People's autonomy used to be compromised by institution walls, now it's too often our risk management practices", according to John O'Brien.[19] Michael Fischer and Ewan Ferlie (2013) find that contradictions between formal risk controls and the role of subjective factors in human services (such as the role of emotions and ideology) can undermine service values, so producing tensions and even intractable and 'heated' conflict.[20]
High reliability organizations (HROs)
A high reliability organization (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity. Most studies of HROs involve areas such as nuclear aircraft carriers, air traffic control, aerospace and nuclear power stations. Organizations such as these share in common the ability to consistently operate safely in complex, interconnected environments where a single failure in one component could lead to catastrophe. Essentially, they are organizations which appear to operate 'in spite' of an enormous range of risks.
Some of these industries manage risk in a highly quantified and enumerated way. These include the nuclear power and aircraft industries, where the possible failure of a complex series of engineered systems could result in highly undesirable outcomes. The usual measure of risk for a class of events is then: R = probability of the event × the severity of the consequence.
The total risk is then the sum of the individual class-risks; see below.[citation needed]
In the nuclear industry, consequence is often measured in terms of off-site radiological release, and this is often banded into five or six decade-wide bands.[clarification needed]
The risks are evaluated using fault tree/event tree techniques (see safety engineering). Where these risks are low, they are normally considered to be "broadly acceptable". A higher level of risk (typically up to 10 to 100 times what is considered broadly acceptable) has to be justified against the costs of reducing it further and the possible benefits that make it tolerable—these risks are described as "Tolerable if ALARP". Risks beyond this level are classified as "intolerable".
The level of risk deemed broadly acceptable has been considered by regulatory bodies in various countries—an early attempt by UK government regulator and academic F. R. Farmer used the example of hill-walking and similar activities, which have definable risks that people appear to find acceptable. This resulted in the so-called Farmer Curve of acceptable probability of an event versus its consequence.
The technique as a whole is usually referred to as probabilistic risk assessment (PRA) (or probabilistic safety assessment, PSA). See WASH-1400 for an example of this approach.
Finance
Main article: Financial risk
In finance, risk is the chance that the return achieved on an investment will be different from that expected, and also takes into account the size of the difference. This includes the possibility of losing some or all of the original investment. In a view advocated by Damodaran, risk includes not only "downside risk" but also "upside risk" (returns that exceed expectations).[21] Some regard the standard deviation of the historical returns or average returns of a specific investment as providing some historical measure of risk; see modern portfolio theory. Financial risk may be market-dependent, determined by numerous market factors, or operational, resulting from fraudulent behavior (e.g. Bernard Madoff). Recent studies suggest that endocrine levels may play a role in risk-taking in financial decision-making.[22][23]
A fundamental idea in finance is the relationship between risk and return (see modern portfolio theory). The greater the potential return one might seek, the greater the risk that one generally assumes. A free market reflects this principle in the pricing of an instrument: strong demand for a safer instrument drives its price higher (and its return correspondingly lower), while weak demand for a riskier instrument drives its price lower (and its potential return thereby higher). For example, a US Treasury bond is considered to be one of the safest investments. In comparison to an investment or speculative grade corporate bond, US Treasury notes and bonds yield lower rates of return. The reason for this is that a corporation is more likely to default on debt than the U.S. government. Because the risk of investing in a corporate bond is higher, investors are offered a correspondingly higher rate of return.
A popular risk measure is value-at-risk (VaR).
There are different types of VaR: long-term VaR, marginal VaR, factor VaR and shock VaR. The latter is used in measuring risk during extreme market-stress conditions.
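As an illustration (not part of the original article), the following is a minimal sketch of a one-day historical-simulation VaR estimate; the return series and the 95% confidence level are hypothetical assumptions chosen for the example.

```python
import numpy as np

def historical_var(returns, confidence=0.95):
    """Historical-simulation value-at-risk.

    Returns the loss threshold (as a positive fraction of portfolio value)
    that past returns exceeded only with probability (1 - confidence).
    """
    returns = np.asarray(returns)
    # The (1 - confidence) quantile of returns marks the worst outcomes;
    # VaR is reported as a positive loss figure, hence the sign flip.
    return -np.percentile(returns, 100 * (1 - confidence))

# Hypothetical daily returns of a portfolio
daily_returns = np.random.default_rng(0).normal(0.0005, 0.01, 1000)
print(f"95% one-day VaR: {historical_var(daily_returns):.2%} of portfolio value")
```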
In finance, risk has no single definition.
Artzner et al.[24] write "we call risk the investor's future net worth". In Novak [25] "risk is a possibility of an undesirable event".
In financial markets, one may need to measure credit risk, information timing and source risk, probability model risk, and legal risk if there are regulatory or civil actions taken as a result of "investor's regret".
It is not always obvious if financial instruments are "hedging" (purchasing/selling a financial instrument specifically to reduce or cancel out the risk in another investment) or "speculation" (increasing measurable risk and exposing the investor to catastrophic loss in pursuit of very high windfalls that increase expected value).
Some people may be "risk seeking", i.e. their utility function's second derivative is positive. Such an individual willingly pays a premium to assume risk (e.g. buys a lottery ticket). Knowing one's risk appetite in conjunction with one's financial well-being is important.
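To make the convexity condition concrete, a small worked example (the utility function and payoffs are illustrative, not from the article): with a convex utility, a fair 50/50 gamble between 0 and 200 is preferred to receiving its expected value of 100 for certain.

```latex
% Convex utility (u'' > 0) implies preference for a fair gamble over its expected value
u(x) = x^{2}, \qquad u''(x) = 2 > 0
\mathbb{E}[u(\text{gamble})] = \tfrac{1}{2}\,u(0) + \tfrac{1}{2}\,u(200) = 20\,000
  \;>\; u\!\left(\mathbb{E}[\text{gamble}]\right) = u(100) = 10\,000
```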
Security
Security risk management involves protection of assets from harm caused by deliberate acts. A more detailed definition is: "A security risk is any event that could result in the compromise of organizational assets i.e. the unauthorized use, loss, damage, disclosure or modification of organizational assets for the profit, personal interest or political interests of individuals, groups or other entities constitutes a compromise of the asset, and includes the risk of harm to people. Compromise of organizational assets may adversely affect the enterprise, its business units and their clients. As such, consideration of security risk is a vital component of risk management."[26]
Human factors
Main articles: Decision theory and Prospect theory
One of the growing areas of focus in risk management is the field of human factors, where behavioral and organizational psychology underpin our understanding of risk-based decision making. This field considers questions such as "how do we make risk based decisions?" and "why are we irrationally more scared of sharks and terrorists than we are of motor vehicles and medications?"
In decision theory, regret (and anticipation of regret) can play a significant part in decision-making, distinct from risk aversion[27] (preferring the status quo in case one becomes worse off).
Framing[28] is a fundamental problem with all forms of risk assessment. In particular, because of bounded rationality (our brains get overloaded, so we take mental shortcuts), the risk of extreme events is discounted because the probability is too low to evaluate intuitively. As an example, one of the leading causes of death is road accidents caused by drunk driving – partly because any given driver frames the problem by largely or totally ignoring the risk of a serious or fatal accident.
For instance, an extremely disturbing event (an attack by hijacking, or moral hazards) may be ignored in analysis despite the fact it has occurred and has a nonzero probability. Or, an event that everyone agrees is inevitable may be ruled out of analysis due to greed or an unwillingness to admit that it is believed to be inevitable. These human tendencies for error and wishful thinking often affect even the most rigorous applications of the scientific method and are a major concern of the philosophy of science.
All decision-making under uncertainty must consider cognitive bias, cultural bias, and notational bias. No group of people assessing risk is immune to "groupthink": acceptance of obviously wrong answers simply because it is socially painful to disagree or because there are conflicts of interest.
Framing involves other information that affects the outcome of a risky decision. The right prefrontal cortex has been shown to take a more global perspective[29] while greater left prefrontal activity relates to local or focal processing.[30]
From the Theory of Leaky Modules[31] McElroy and Seta proposed that they could predictably alter the framing effect by the selective manipulation of regional prefrontal activity with finger tapping or monaural listening.[32] The result was as expected. Rightward tapping or listening had the effect of narrowing attention such that the frame was ignored. This is a practical way of manipulating regional cortical activation to affect risky decisions, especially because directed tapping or listening is easily done.
Psychology of risk taking
A growing area of research has been to examine various psychological aspects of risk taking. Researchers typically run randomized experiments with a treatment and control group to ascertain the effect of different psychological factors that may be associated with risk taking. Such studies show, for example, that positive and negative feedback about past risk taking can affect future risk taking. In one experiment, people who were led to believe they were very competent at decision making saw more opportunities in a risky choice and took more risks, while those led to believe they were not very competent saw more threats and took fewer risks.[33]
Maintenance
The concept of risk-based maintenance is an advanced form of reliability-centered maintenance. In the chemical industry, the consequences of failure are as important as the probability of failure. Therefore, the selection of maintenance policies should be based on risk rather than on reliability alone. Risk-based maintenance methodology acts as a tool for maintenance planning and decision making to reduce the probability of failure and its consequences. In risk-based maintenance decision making, maintenance resources can be utilized optimally based on the risk class (high, medium, or low) of equipment or machines, to achieve tolerable risk criteria.[34]
Risk assessment and analysis
Main articles: Risk assessment and Operational risk management
Since risk assessment and management are essential in security management, the two are tightly related. Security assessment methodologies like CRAMM contain risk assessment modules as an important part of the first steps of the methodology. On the other hand, risk assessment methodologies like Mehari evolved to become security assessment methodologies. An ISO standard on risk management (principles and guidelines on implementation) was published under code ISO 31000 on 13 November 2009.
Quantitative analysis
There are many formal methods used to "measure" risk.
Often the probability of a negative event is estimated by using the frequency of past similar events. Probabilities for rare failures may be difficult to estimate. This makes risk assessment difficult in hazardous industries such as nuclear energy, where failures are rare while their harmful consequences are severe.
Statistical methods may also require the use of a cost function, which in turn may require the calculation of the cost of loss of a human life. This is a difficult problem. One approach is to ask what people are willing to pay to insure against death[35] or radiological release (e.g. GBq of radio-iodine),[citation needed] but as the answers depend very strongly on the circumstances it is not clear that this approach is effective.
Risk is often measured as the expected value of an undesirable outcome. This combines the probabilities of various possible events and some assessment of the corresponding harm into a single value. See also Expected utility. The simplest case is a binary possibility of Accident or No accident. The associated formula for calculating risk is then: risk = (probability of the accident occurring) × (expected loss in case of the accident).
Situations are sometimes more complex than the simple binary possibility case. In a situation with several possible accidents, total risk is the sum of the risks for each different accident, provided that the outcomes are comparable: total risk = Σ (probability of accident i) × (expected loss from accident i), summed over all accidents i.
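As a minimal illustration of the summation above (the event classes, probabilities and loss values are hypothetical, not taken from the article):

```python
# Expected-value risk: each accident class contributes probability x loss,
# and the total risk is the sum over comparable classes (here, cost in dollars per year).
accident_classes = {
    "minor equipment damage": {"probability": 1e-2, "loss": 50_000},
    "major fire":             {"probability": 1e-4, "loss": 5_000_000},
    "catastrophic failure":   {"probability": 1e-6, "loss": 500_000_000},
}

class_risks = {name: c["probability"] * c["loss"] for name, c in accident_classes.items()}
total_risk = sum(class_risks.values())

for name, r in class_risks.items():
    print(f"{name}: expected loss ${r:,.0f} per year")
print(f"Total risk: ${total_risk:,.0f} per year")
```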
One of the first major uses of this concept was for the planning of the Delta Works in 1953, a flood protection program in the Netherlands, with the aid of the mathematician David van Dantzig.[36] The kind of risk analysis pioneered there has become common today in fields like nuclear power, aerospace and the chemical industry.
In statistical decision theory, the risk function is defined as the expected value of a given loss function as a function of the decision rule used to make decisions in the face of uncertainty.
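In standard notation (added here for clarity): for a loss function L, parameter θ, data X with density f(x | θ), and decision rule δ, the risk function is the expected loss incurred by the rule.

```latex
% Risk function in statistical decision theory: expected loss of decision rule \delta
R(\theta, \delta) = \mathbb{E}_{\theta}\!\left[ L\big(\theta, \delta(X)\big) \right]
                  = \int L\big(\theta, \delta(x)\big)\, f(x \mid \theta)\, dx
```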
Fear as intuitive risk assessment
People may rely on their fear and hesitation to keep them out of the most profoundly unknown circumstances. Fear is a response to perceived danger. Risk could be said to be the way we collectively measure and share this "true fear"—a fusion of rational doubt, irrational fear, and a set of unquantified biases from our own experience.
The field of behavioral finance focuses on human risk-aversion, asymmetric regret, and other ways that human financial behavior varies from what analysts call "rational". Risk in that case is the degree of uncertainty associated with a return on an asset. Recognizing and respecting the irrational influences on human decision making may do much to reduce disasters caused by naive risk assessments that presume rationality but in fact merely fuse many shared biases.
Anxiety, risk and decision making
Fear, anxiety and risk
According to one set of definitions, fear is a fleeting emotion ascribed to a particular object, while anxiety is a trait of fear (this is referring to "trait anxiety", as distinct from how the term "anxiety" is generally used) that lasts longer and is not attributed to a specific stimulus (these particular definitions are not used by all authors cited on this page).[37] Some studies show a link between anxious behavior and risk (the chance that an outcome will have an unfavorable result).[38] Joseph Forgas introduced valence-based research where emotions are grouped as either positive or negative (Lerner and Keltner, 2000). Positive emotions, such as happiness, are believed to lead to more optimistic risk assessments, while negative emotions, such as anger, lead to more pessimistic ones. As an emotion with a negative valence, fear, and therefore anxiety, has long been associated with negative risk perceptions. Under the more recent appraisal tendency framework of Jennifer Lerner et al., which refutes Forgas’ notion of valence and promotes the idea that specific emotions have distinctive influences on judgments, fear is still related to pessimistic expectations.[39]
Psychologists have demonstrated that increases in anxiety and increases in risk perception are related, and people who are habituated to anxiety experience this awareness of risk more intensely than normal individuals.[40] In decision-making, anxiety promotes the use of biases and quick thinking to evaluate risk. This is referred to as affect-as-information according to Clore, 1983. However, the accuracy of these risk perceptions when making choices is not known.[41]
Consequences of anxiety
Experimental studies show that brief surges in anxiety are correlated with surges in general risk perception.[41] Anxiety exists when the presence of threat is perceived (Maner and Schmidt, 2006).[40] As risk perception increases, it stays related to the particular source impacting the mood change as opposed to spreading to unrelated risk factors.[41] This increased awareness of a threat is significantly more emphasized in people who are conditioned to anxiety.[42] For example, anxious individuals who are predisposed to generating reasons for negative results tend to exhibit pessimism.[42] Also, findings suggest that the perception of a lack of control and a lower inclination to participate in risky decision-making (across various behavioral circumstances) is associated with individuals experiencing relatively high levels of trait anxiety.[40] In the previous instance, there is supporting clinical research that links emotional evaluation (of control), the anxiety that is felt and the option of risk avoidance.[40]
There are various views presented that anxious/fearful emotions cause people to access involuntary responses and judgments when making decisions that involve risk. Joshua A. Hemmerich et al. probe deeper into anxiety and its impact on choices by exploring "risk-as-feelings", which are quick, automatic, and natural reactions to danger that are based on emotions. This notion is supported by an experiment that engages physicians in a simulated perilous surgical procedure. It was demonstrated that a measurable amount of the participants' anxiety about patient outcomes was related to previous (experimentally created) regret and worry and ultimately caused the physicians to be led by their feelings over any information or guidelines provided during the mock surgery. Additionally, their emotional levels, adjusted along with the simulated patient status, suggest that anxiety level and the respective decision made are correlated with the type of bad outcome that was experienced in the earlier part of the experiment.[43] Similarly, another view of anxiety and decision-making is dispositional anxiety, where emotional states, or moods, are cognitive and provide information about future pitfalls and rewards (Maner and Schmidt, 2006). When experiencing anxiety, individuals draw from personal judgments referred to as pessimistic outcome appraisals. These emotions promote biases for risk avoidance and promote risk tolerance in decision-making.[42]
Dread risk
It is common for people to dread some risks but not others: They tend to be very afraid of epidemic diseases, nuclear power plant failures, and plane accidents but are relatively unconcerned about some highly frequent and deadly events, such as traffic crashes, household accidents, and medical errors. One key distinction of dreadful risks seems to be their potential for catastrophic consequences,[44] threatening to kill a large number of people within a short period of time.[45] For example, immediately after the September 11 attacks, many Americans were afraid to fly and took their car instead, a decision that led to a significant increase in the number of fatal crashes in the time period following the 9/11 event compared with the same time period before the attacks.[46][47]
Different hypotheses have been proposed to explain why people fear dread risks. First, the psychometric paradigm[44] suggests that high lack of control, high catastrophic potential, and severe consequences account for the increased risk perception and anxiety associated with dread risks. Second, because people estimate the frequency of a risk by recalling instances of its occurrence from their social circle or the media, they may overvalue relatively rare but dramatic risks because of their overpresence and undervalue frequent, less dramatic risks.[47] Third, according to the preparedness hypothesis, people are prone to fear events that have been particularly threatening to survival in human evolutionary history.[48] Given that in most of human evolutionary history people lived in relatively small groups, rarely exceeding 100 people,[49] a dread risk, which kills many people at once, could potentially wipe out one’s whole group. Indeed, research found[50] that people’s fear peaks for risks killing around 100 people but does not increase if larger groups are killed. Fourth, fearing dread risks can be an ecologically rational strategy.[51] Besides killing a large number of people at a single point in time, dread risks reduce the number of children and young adults who would have potentially produced offspring. Accordingly, people are more concerned about risks killing younger, and hence more fertile, groups.[52]
Anxiety and judgmental accuracy
The relationship between higher levels of risk perception and "judgmental accuracy" in anxious individuals remains unclear (Joseph I. Constans, 2001). There is a chance that "judgmental accuracy" is correlated with heightened anxiety. Constans conducted a study to examine how worry propensity (and current mood and trait anxiety) might influence college students’ estimation of their performance on an upcoming exam, and the study found that worry propensity predicted subjective risk bias (errors in their risk assessments), even after variance attributable to current mood and trait anxiety had been removed.[41] Another experiment suggests that trait anxiety is associated with pessimistic risk appraisals (heightened perceptions of the probability and degree of suffering associated with a negative experience), while controlling for depression.[40]
Risk in auditing
The audit risk model expresses the risk of an auditor providing an inappropriate opinion of a commercial entity's financial statements. It can be analytically expressed as AR = IR × CR × DR, where AR is audit risk, IR is inherent risk, CR is control risk, and DR is detection risk.
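As a worked illustration (the factor values below are hypothetical), the model is often rearranged to find the detection risk an auditor can accept for a chosen target level of audit risk:

```latex
% Rearranging the audit risk model AR = IR \times CR \times DR to solve for detection risk.
% Hypothetical values: target audit risk 5\%, inherent risk 80\%, control risk 50\%.
DR = \frac{AR}{IR \times CR} = \frac{0.05}{0.8 \times 0.5} = 0.125
```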
Risk and uncertainty
In his seminal work Risk, Uncertainty, and Profit, Frank Knight (1921) established the distinction between risk and uncertainty:
Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated. The term "risk," as loosely used in everyday speech and in economic discussion, really covers two things which, functionally at least, in their causal relations to the phenomena of economic organization, are categorically different. ... The essential fact is that "risk" means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomenon depending on which of the two is really present and operating. ... It will appear that a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We ... accordingly restrict the term "uncertainty" to cases of the non-quantitive type.[53]
Thus, Knightian uncertainty is immeasurable, not possible to calculate, while in the Knightian sense risk is measurable.
Another distinction between risk and uncertainty is proposed by Douglas Hubbard:[54][55]
- Uncertainty: The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.
- Measurement of uncertainty: A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years"
- Risk: A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.
- Measurement of risk: A set of possibilities each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry with a loss of $12 million in exploratory drilling costs".
Risk attitude, appetite and tolerance
The terms attitude, appetite and tolerance are often used similarly to describe an organization's or individual's attitude towards risk taking. Risk averse, risk neutral and risk seeking are examples of the terms that may be used to describe a risk attitude. Risk tolerance looks at acceptable/unacceptable deviations from what is expected. Risk appetite looks at how much risk one is willing to accept. There can still be deviations that are within a risk appetite. For example, recent research finds that insured individuals are significantly likely to divest from risky asset holdings in response to a decline in health, controlling for variables such as income, age, and out-of-pocket medical expenses.[56]
Gambling is a risk-increasing investment, wherein money on hand is risked for a possible large return, but with the possibility of losing it all. Purchasing a lottery ticket is a very risky investment with a high chance of no return and a small chance of a very high return. In contrast, putting money in a bank at a defined rate of interest is a risk-averse action that gives a guaranteed return of a small gain and precludes other investments with possibly higher gain. The possibility of getting no return on an investment is also known as the rate of ruin.
Risk as a vector quantity
Hubbard also argues that defining risk as the product of impact and probability presumes (probably incorrectly) that the decision makers are risk neutral.[55] Only for a risk neutral person is the "certain monetary equivalent" exactly equal to the probability of the loss times the amount of the loss. For example, a risk neutral person would consider a 20% chance of winning $1 million exactly equal to $200,000 (or a 20% chance of losing $1 million to be exactly equal to losing $200,000). However, most decision makers are not actually risk neutral and would not consider these equivalent choices. This gave rise to Prospect theory and Cumulative prospect theory. Hubbard proposes instead that risk is a kind of "vector quantity" that does not collapse the probability and magnitude of a risk by presuming anything about the risk tolerance of the decision maker. Risks are simply described as a set or function of possible loss amounts, each associated with specific probabilities. This array cannot be collapsed into a single value until the risk tolerance of the decision maker is quantified.
Risk can be both negative and positive, but people tend to focus on the negative side, because some outcomes can be dangerous, such as putting their own or someone else’s life at risk. Risks concern people because they believe such risks will have a negative effect on their future.
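A minimal sketch of this "vector" view (the loss amounts, probabilities, and the exponential risk-tolerance parameter below are illustrative assumptions, not from the article): the risk is kept as a set of (probability, loss) pairs and collapses to a single figure only once a decision maker's risk tolerance is supplied.

```python
import math

# A risk kept as a "vector": possible loss amounts with their probabilities,
# not collapsed into a single expected value up front.
risk = [
    (0.90, 0),          # 90% chance of no loss
    (0.08, 100_000),    # 8% chance of a $100k loss
    (0.02, 1_000_000),  # 2% chance of a $1M loss
]

def expected_loss(outcomes):
    """Collapse by expectation -- implicitly assumes a risk-neutral decision maker."""
    return sum(p * loss for p, loss in outcomes)

def certainty_equivalent_loss(outcomes, risk_tolerance):
    """Collapse using an exponential utility with the given risk tolerance (in $).

    A smaller risk_tolerance means more risk aversion, so the certainty-equivalent
    loss exceeds the plain expected loss.
    """
    expected_utility = sum(p * (-math.exp(loss / risk_tolerance)) for p, loss in outcomes)
    return risk_tolerance * math.log(-expected_utility)

print(f"Expected loss (risk neutral): ${expected_loss(risk):,.0f}")
print(f"Certainty-equivalent loss (tolerance $500k): ${certainty_equivalent_loss(risk, 500_000):,.0f}")
```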
List of related books
This is a list of books about risk issues.
Title | Author(s) | Year |
---|---|---|
Acceptable risk | Baruch Fischhoff, Sarah Lichtenstein, Paul Slovic, Steven L. Derby, and Ralph Keeney | 1984 |
Against the Gods: The Remarkable Story of Risk | Peter L. Bernstein | 1996 |
At risk: Natural hazards, people's vulnerability and disasters | Piers Blaikie, Terry Cannon, Ian Davis, and Ben Wisner | 1994 |
Building Safer Communities. Risk Governance, Spatial Planning and Responses to Natural Hazards | Urbano Fra Paleo | 2009 |
Dangerous earth: An introduction to geologic hazards | Barbara W. Murck, Brian J. Skinner, Stephen C. Porter | 1998 |
Disasters and democracy | Rutherford H. Platt | 1999 |
Earth shock: Hurricanes, volcanoes, earthquakes, tornadoes and other forces of nature | W. Andrew Robinson | 1993 |
Human System Response to Disaster: An Inventory of Sociological Findings | Thomas E. Drabek | 1986 |
Judgment under uncertainty: heuristics and biases | Daniel Kahneman, Paul Slovic, and Amos Tversky | 1982 |
Mapping vulnerability: disasters, development, and people | Greg Bankoff, Georg Frerks, and Dorothea Hilhorst | 2004 |
Man and Society in Calamity: The Effects of War, Revolution, Famine, Pestilence upon Human Mind, Behavior, Social Organization and Cultural Life | Pitirim Sorokin | 1942 |
Mitigation of hazardous comets and asteroids | Michael J.S. Belton, Thomas H. Morgan, Nalin H. Samarasinha, Donald K. Yeomans | 2005 |
Natural disaster hotspots: a global risk analysis | Maxx Dilley | 2005 |
Natural hazard mitigation: Recasting disaster policy and planning | David Godschalk, Timothy Beatley, Philip Berke, David Brower, and Edward J. Kaiser | 1999 |
Natural hazards: Earth’s processes as hazards, disasters, and catastrophes | Edward A. Keller, and Robert H. Blodgett | 2006 |
Normal accidents. Living with high-risk technologies | Charles Perrow | 1984 |
Paying the price: The status and role of insurance against natural disasters in the United States | Howard Kunreuther, and Richard J. Roth | 1998 |
Planning for earthquakes: Risks, politics, and policy | Philip R. Berke, and Timothy Beatley | 1992 |
Practical Project Risk Management: The ATOM Methodology | David Hillson and Peter Simon | 2012 |
Reduction and predictability of natural disasters | John B. Rundle, William Klein, Don L. Turcotte | 1996 |
Regions of risk: A geographical introduction to disasters | Kenneth Hewitt | 1997 |
Risk analysis: a quantitative guide | David Vose | 2008 |
Risk: An introduction (ISBN 978-0-415-49089-4) | Bernardus Ale | 2009 |
Risk and culture: An essay on the selection of technical and environmental dangers | Mary Douglas, and Aaron Wildavsky | 1982 |
Socially Responsible Engineering: Justice in Risk Management (ISBN 978-0-471-78707-5) | Daniel A. Vallero, and P. Aarne Vesilind | 2006 |
Swimming with Crocodiles: The Culture of Extreme Drinking | Marjana Martinic and Fiona Measham (eds.) | 2008 |
The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA | Diane Vaughan | 1997 |
The environment as hazard | Ian Burton, Robert Kates, and Gilbert F. White | 1978 |
The social amplification of risk | Nick Pidgeon, Roger E. Kasperson, and Paul Slovic | 2003 |
What is a disaster? New answers to old questions | Ronald W. Perry, and Enrico Quarantelli | 2005 |
Floods: From Risk to Opportunity (IAHS Red Book Series) | Ali Chavoshian, and Kuniyoshi Takeuchi | 2013 |
The Risk Factor: Why Every Organization Needs Big Bets, Bold Characters, and the Occasional Spectacular Failure | Deborah Perry Piscione | 2014 |
See also
- Ambiguity aversion
- Benefit shortfall
- Civil defense
- Countermeasure
- Early case assessment
- Event chain methodology
- Fuel price risk management
- Global Risk Forum GRF Davos
- Hazard (risk)
- Identity resolution
- Information assurance
- Inherent risk
- Inherent risk (accounting)
- International Risk Governance Council
- ISO/PAS 28000
- Life-critical system
- Loss aversion
- Preventive maintenance
- Probabilistic risk assessment
- Reputational risk
- Reliability engineering
- Risk analysis
- Risk compensation
- Risk management
- Risk-neutral measure
- Risk register
- Sampling risk
- Vulnerability
This page was last modified on 3 July 2016, at 15:21.
https://en.wikipedia.org/wiki/Risk
From Wikipedia, the free encyclopedia
From Wikipedia, the free encyclopedia
Risk is the potential
of losing something of value. Values (such as physical health, social status,
emotional well being or financial wealth) can be gained or lost when taking
risk resulting from a given action, activity and/or inaction, foreseen or
unforeseen. Risk can also be defined as the intentional interaction with
uncertainty. Risk perception is the subjective judgment people make about the
severity and/or probability of a risk, and may vary person to person. Any human
endeavor carries some risk, but some are much riskier than others [1].
Definitions
The Oxford English
Dictionary cites the earliest use of the word in English (in the spelling of
risque from its Arabic original "رزق"
) which mean working to gain income gain and profit ( see Wikipedia Arabic
meaning) as of 1621, and the spelling as risk from 1655. It defines risk as:
(Exposure to) the possibility of loss,
injury, or other adverse or unwelcome circumstance; a chance or situation
involving such a possibility.[2]
# Risk is an uncertain
event or condition that, if it occurs, has an effect on at least one [project]
objective. (This definition, using project terminology, is easily made
universal by removing references to projects). [3]
The probability of something happening
multiplied by the resulting cost or benefit if it does. (This concept is more
properly known as the 'Expectation Value' or 'Risk Factor' and is used to
compare levels of risk)
The
probability or threat of quantifiable damage, injury, liability, loss, or any
other negative occurrence that is caused by external or internal
vulnerabilities, and that may be avoided through preemptive action.
Finance: The possibility that an actual
return on an investment will be lower than the expected return.
Insurance: A situation where the
probability of a variable (such as burning down of a building) is known but
when a mode of occurrence or the actual value of the occurrence (whether the
fire will occur at a particular property) is not.(Reference needed) A risk is
not an uncertainty (where neither the probability nor the mode of occurrence is
known), a peril (cause of loss), or a hazard (something that makes the
occurrence of a peril more likely or more severe).
Securities trading: The probability of a
loss or drop in value. Trading risk is divided into two general categories: (1)
Systematic risk affects all securities in the same class and is linked to the
overall capital-market system and therefore cannot be eliminated by
diversification. Also called market risk. (2) Non-systematic risk is any risk
that isn't market-related. Also called non-market risk, extra-market risk or
diversifiable risk.
Workplace: Product of the consequence and
probability of a hazardous event or phenomenon. For example, the risk of
developing cancer is estimated as the incremental probability of developing
cancer over a lifetime as a result of exposure to potential carcinogens
(cancer-causing substances).
International
Organization for Standardization
The ISO 31000 (2009) /
ISO Guide 73:2002 definition of risk is the 'effect of uncertainty on
objectives'. In this definition, uncertainties include events (which may or may
not happen) and uncertainties caused by ambiguity or a lack of information. It
also includes both negative and positive impacts on objectives. Many
definitions of risk exist in common usage, however this definition was
developed by an international committee representing over 30 countries and is
based on the input of several thousand subject matter experts.
Other
Very different
approaches to risk management are taken in different fields, e.g. "Risk is
the unwanted subset of a set of uncertain outcomes" (Cornelius Keating).
Risk can be seen as relating to the
probability of uncertain future events.[4] For example, according to factor
analysis of information risk, risk is:[4] the probable frequency and probable
magnitude of future loss. In computer science this definition is used by The
Open Group.[5]
OHSAS (Occupational Health & Safety
Advisory Services) defines risk as the combination of the probability of a
hazard resulting in an adverse event, times the severity of the event.[6]
In information security risk is defined as
"the potential that a given threat will exploit vulnerabilities of an
asset or group of assets and thereby cause harm to the organization".[7]
Financial risk is often defined as the
unpredictable variability or volatility of returns, and this would include both
potential better-than-expected and worse-than-expected returns. References to
negative risk below should be read as also applying to positive impacts or
opportunity (e.g. for "loss" read "loss or gain") unless
the context precludes this interpretation.
The related terms "threat" and
"hazard" are often used to mean something that could cause harm.
Practice areas
Risk is ubiquitous in
all areas of life and risk management is something that we all must do, whether
we are managing a major organization or simply crossing the road. When
describing risk however, it is convenient to consider that risk practitioners
operate in some specific practice areas.
Economic risk
Economic risks can be
manifested in lower incomes or higher expenditures than expected. The causes
can be many, for instance, the hike in the price for raw materials, the lapsing
of deadlines for construction of a new operating facility, disruptions in a
production process, emergence of a serious competitor on the market, the loss
of key personnel, the change of a political regime, or natural disasters.[8]
Health
Risks in personal
health may be reduced by primary prevention actions that decrease early causes
of illness or by secondary prevention actions after a person has clearly measured
clinical signs or symptoms recognized as risk factors. Tertiary prevention
reduces the negative impact of an already established disease by restoring
function and reducing disease-related complications. Ethical medical practice
requires careful discussion of risk factors with individual patients to obtain
informed consent for secondary and tertiary prevention efforts, whereas public
health efforts in primary prevention require education of the entire population
at risk. In each case, careful communication about risk factors, likely
outcomes and certainty must distinguish between causal events that must be
decreased and associated events that may be merely consequences rather than
causes.
In epidemiology, the
lifetime risk of an effect is the cumulative incidence, also called incidence
proportion over an entire lifetime.[9]
Health, safety, and
environment
Health, safety, and
environment (HSE) are separate practice areas; however, they are often linked.
The reason for this is typically to do with organizational management
structures; however, there are strong links among these disciplines. One of the
strongest links between these is that a single risk event may have impacts in
all three areas, albeit over differing timescales. For example, the uncontrolled
release of radiation or a toxic chemical may have immediate short-term safety
consequences, more protracted health impacts, and much longer-term
environmental impacts. Events such as Chernobyl, for example, caused immediate
deaths, and in the longer term, deaths from cancers, and left a lasting
environmental impact leading to birth defects, impacts on wildlife, etc.
Over time, a form of
risk analysis called environmental risk analysis has developed. Environmental
risk analysis is a field of study that attempts to understand events and
activities that bring risk to human health or the environment.[10]
Information technology
and information security
Main article: IT risk
Information technology
risk, or IT risk, IT-related risk, is a risk related to information technology.
This relatively new term due to an increasing awareness that information
security is simply one facet of a multitude of risks that are relevant to IT
and the real world processes it supports.
The increasing
dependencies of modern society on information and computers networks (both in
private and public sectors, including military)[11][12][13] has led to a new
terms like IT risk and Cyberwarfare.
Main articles:
Information assurance and Information security
Information security
means protecting information and information systems from unauthorized access,
use, disclosure, disruption, modification, perusal, inspection, recording or
destruction.[14] Information security grew out of practices and procedures of
computer security. Information security
has grown to information assurance (IA) i.e. is the practice of managing risks
related to the use, processing, storage, and transmission of information or
data and the systems and processes used for those purposes. While focused
dominantly on information in digital form, the full range of IA encompasses not
only digital but also analog or physical form. Information assurance
is interdisciplinary and draws from multiple fields, including accounting,
fraud examination, forensic science, management science, systems engineering,
security engineering, and criminology, in addition to computer science.
So, IT risk is narrowly
focused on computer security, while information security extends on risks
related to other forms of information (paper, microfilm). Information assurance
risks include the ones related to the consistency of the business information
stored in IT systems and the one stored on other means and the relevant
business consequences.
Insurance
Insurance is a risk
treatment option which involves risk sharing. It can be considered as a form of
contingent capital and is akin to purchasing an option in which the buyer pays
a small premium to be protected from a potential large loss.
Insurance risk is often
taken by insurance companies, who then bear a pool of risks including market
risk, credit risk, operational risk, interest rate risk, mortality risk,
longevity risks, etc.[15]
Business and management
Means of assessing risk
vary widely between professions. Indeed, they may define these professions; for
example, a doctor manages medical risk, while a civil engineer manages risk of
structural failure. A professional code of ethics is usually focused on risk
assessment and mitigation (by the professional on behalf of client, public,
society or life in general).
In the workplace,
incidental and inherent risks exist. Incidental risks are those that occur
naturally in the business but are not part of the core of the business.
Inherent risks have a negative effect on the operating profit of the business.
In human services
The experience of many
people who rely on human services for support is that 'risk' is often used as a
reason to prevent them from gaining further independence or fully accessing the
community, and that these services are often unnecessarily risk averse.[16]
"People's autonomy used to be compromised by institution walls, now it's
too often our risk management practices" John O'Brien.[17] Michael Fischer
and Ewan Ferlie (2013) find that contradictions between formal risk controls
and the role of subjective factors in human services (such as the role of
emotions and ideology) can undermine service values, so producing tensions and
even intractable and 'heated' conflict.[18]
High reliability
organizations (HROs)
A high reliability
organization (HRO) is an organization that has succeeded in avoiding
catastrophes in an environment where normal accidents can be expected due to
risk factors and complexity. Most studies of HROs involve areas such as nuclear
aircraft carriers, air traffic control, aerospace and nuclear power stations.
Organizations such as these share in common the ability to consistently operate
safely in complex, interconnected environments where a single failure in one
component could lead to catastrophe. Essentially, they are organizations which
appear to operate 'in spite' of an enormous range of risks.
Some of these
industries manage risk in a highly quantified and enumerated way. These include
the nuclear power and aircraft industries, where the possible failure of a
complex series of engineered systems could result in highly undesirable
outcomes. The usual measure of risk for a class of events is then: R =
probability of the event × the severity of the consequence.
The total risk is then
the sum of the individual class-risks; see below.[citation needed]
In the nuclear
industry, consequence is often measured in terms of off-site radiological
release, and this is often banded into five or six decade-wide
bands.[clarification needed]
The risks are evaluated
using fault tree/event tree techniques (see safety engineering). Where these
risks are low, they are normally considered to be "broadly
acceptable". A higher level of risk (typically up to 10 to 100 times what
is considered broadly acceptable) has to be justified against the costs of
reducing it further and the possible benefits that make it tolerable—these
risks are described as "Tolerable if ALARP". Risks beyond this level
are classified as "intolerable".
The level of risk
deemed broadly acceptable has been considered by regulatory bodies in various
countries—an early attempt by UK government regulator and academic F. R. Farmer
used the example of hill-walking and similar activities, which have definable
risks that people appear to find acceptable. This resulted in the so-called
Farmer Curve of acceptable probability of an event versus its consequence.
The technique as a
whole is usually referred to as probabilistic risk assessment (PRA) (or
probabilistic safety assessment, PSA). See WASH-1400 for an example of this
approach.
Finance
Main article: Financial
risk
In finance, risk is the
chance that the return achieved on an investment will be different from that
expected, and also takes into account the size of the difference. This includes
the possibility of losing some or all of the original investment. In a view
advocated by Damodaran, risk includes not only "downside risk" but
also "upside risk" (returns that exceed expectations).[19] Some
regard the standard deviation of the historical returns or average returns of a
specific investment as providing some historical measure of risk; see modern
portfolio theory. Financial risk may be market-dependent, determined by
numerous market factors, or operational, resulting from fraudulent behavior
(e.g. Bernard Madoff). Recent studies suggest that endocrine levels may play a
role in risk-taking in financial decision-making.[20][21]
A fundamental idea in
finance is the relationship between risk and return (see modern portfolio
theory). The greater the potential return one might seek, the greater the risk
that one generally assumes. A free market reflects this principle in the
pricing of an instrument: strong demand for a safer instrument drives its price
higher (and its return correspondingly lower) while weak demand for a riskier instrument
drives its price lower (and its potential return thereby higher). For example,
a US Treasury bond is considered to be one of the safest investments. In
comparison with an investment-grade or speculative-grade corporate bond, US Treasury
notes and bonds yield lower rates of return. The reason for this is that a
corporation is more likely to default on debt than the U.S. government. Because
the risk of investing in a corporate bond is higher, investors are offered a
correspondingly higher rate of return.
A popular risk
measurement is value-at-risk (VaR). There are different types of VaR: long-term
VaR, marginal VaR, factor VaR and shock VaR. The last of these is used to
measure risk under conditions of extreme market stress.
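The named VaR variants are not defined here, but the underlying idea can be sketched with a plain historical-simulation VaR (a Python sketch with invented loss figures; real VaR models use far richer data and methods): the 99% one-day VaR is the daily loss that past losses exceeded only about 1% of the time.

    # Minimal historical-simulation VaR sketch; the daily loss figures (in $ millions)
    # are hypothetical, and ten observations are far too few for a real 99% estimate.
    def historical_var(losses, confidence=0.99):
        """Return the loss not exceeded with the given confidence level."""
        ordered = sorted(losses)                    # smallest to largest loss
        index = int(confidence * len(ordered)) - 1  # crude index of the VaR quantile
        return ordered[max(index, 0)]

    daily_losses = [-1.2, 0.4, 2.5, 0.1, -0.8, 3.9, 0.7, 1.1, 0.3, 5.2]
    print("99% one-day VaR:", historical_var(daily_losses), "million")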
In finance, risk has no
single definition. In particular, it is not always obvious if financial
instruments are "hedging" (purchasing/selling a financial instrument
specifically to reduce or cancel out the risk in another investment) or
"speculation" (increasing measurable risk and exposing the investor
to catastrophic loss in pursuit of very high windfalls that increase expected
value). Some people may be "risk seeking", i.e. their utility
function's second derivative is positive. Such an individual willingly pays a
premium to assume risk (e.g. buys a lottery ticket). In financial markets, one
may need to measure credit risk, information timing and source risk,
probability model risk, and legal risk if there are regulatory or civil actions
taken as a result of "investor's regret". Knowing one's risk appetite
in conjunction with one's financial well-being is important.
Security
(Image caption: an "AT YOUR OWN RISK" sign, a popular form of risk labeling.)
Security risk
management involves protection of assets from harm caused by deliberate acts. A
more detailed definition is: "A security risk is any event that could
result in the compromise of organizational assets i.e. the unauthorized use,
loss, damage, disclosure or modification of organizational assets for the
profit, personal interest or political interests of individuals, groups or
other entities constitutes a compromise of the asset, and includes the risk of
harm to people. Compromise of organizational assets may adversely affect the
enterprise, its business units and their clients. As such, consideration of
security risk is a vital component of risk management." [22]
Human factors
Main articles: Decision theory and Prospect theory
One of the growing
areas of focus in risk management is the field of human factors, where
behavioral and organizational psychology underpin our understanding of
risk-based decision making. This field considers questions such as "how do we
make risk-based decisions?" and "why are we irrationally more scared of
sharks and terrorists than we are of motor vehicles and medications?"
In decision theory,
regret (and anticipation of regret) can play a significant part in
decision-making, distinct from risk aversion (preferring the status quo in case
one becomes worse off).
Framing[23] is a
fundamental problem with all forms of risk assessment. In particular, because
of bounded rationality (our brains get overloaded, so we take mental
shortcuts), the risk of extreme events is discounted because the probability is
too low to evaluate intuitively. As an example, one of the leading causes of
death is road accidents caused by drunk driving – partly because any given
driver frames the problem by largely or totally ignoring the risk of a serious
or fatal accident.
For instance, an
extremely disturbing event (an attack by hijacking, or moral hazards) may be
ignored in analysis despite the fact it has occurred and has a nonzero
probability. Or, an event that everyone agrees is inevitable may be ruled out
of analysis due to greed or an unwillingness to admit that it is believed to be
inevitable. These human tendencies for error and wishful thinking often affect
even the most rigorous applications of the scientific method and are a major
concern of the philosophy of science.
All decision-making
under uncertainty must consider cognitive bias, cultural bias, and notational
bias: no group of people assessing risk is immune to "groupthink", the
acceptance of obviously wrong answers simply because it is socially painful to
disagree or because of conflicts of interest.
Framing involves other
information that affects the outcome of a risky decision. The right prefrontal
cortex has been shown to take a more global perspective,[24] while greater left
prefrontal activity relates to local or focal processing.[25]
Building on the Theory of
Leaky Modules,[26] McElroy and Seta proposed that they could predictably alter
the framing effect by selectively manipulating regional prefrontal activity
with finger tapping or monaural listening.[27] The result was as expected:
rightward tapping or listening narrowed attention such that the frame was
ignored. This is a practical way of manipulating regional cortical activation
to affect risky decisions, especially because directed tapping or listening is
easily done.
Risk assessment and analysis
Main articles: Risk assessment and Operational risk management
Because risk assessment
and management are essential to security management, the two are tightly related.
Security assessment methodologies like CRAMM contain risk assessment modules as
an important part of the first steps of the methodology. On the other hand,
risk assessment methodologies like Mehari evolved to become security assessment
methodologies. An ISO standard on risk management (Principles and guidelines on
implementation) was published under code ISO 31000 on 13 November 2009.
Quantitative analysis
As risk carries so many
different meanings, there are many formal methods used to assess or to
"measure" risk. Some of the quantitative definitions of risk are
well grounded in statistical theory and lead naturally to statistical estimates,
but some are more subjective. For example, in many cases a critical factor is
human decision making.
Even when statistical
estimates are available, in many cases risk is associated with rare failures of
some kind, and data may be sparse. Often, the probability of a negative event
is estimated by using the frequency of past similar events or by event tree
methods, but probabilities for rare failures may be difficult to estimate if an
event tree cannot be formulated. This makes risk assessment difficult in hazardous
industries, for example nuclear energy, where failures are rare but their
harmful consequences are numerous and severe.
Statistical methods may
also require the use of a cost function, which in turn may require the
calculation of the cost of loss of a human life. This is a difficult problem.
One approach is to ask what people are willing to pay to insure against
death[28] or radiological release (e.g. GBq of radio-iodine),[citation needed]
but as the answers depend very strongly on the circumstances it is not clear
that this approach is effective.
The statistical notion
of risk is often modeled as the expected value of an undesirable outcome. This
combines the probabilities of various possible events and some assessment of
the corresponding harm into a single value. See also Expected utility. The
simplest case is a binary possibility of Accident or No accident. The
associated formula for calculating risk is then:
\text{Risk} = (\text{probability of the accident occurring}) \times (\text{expected loss in case of the accident})
For example, if
performing activity X has a probability of 0.01 of suffering an accident of A,
with a loss of 1000, then total risk is a loss of 10, the product of 0.01 and
1000.
Situations are sometimes
more complex than the simple binary possibility case. In a situation with
several possible accidents, total risk is the sum of the risks for each
different accident, provided that the outcomes are comparable:
\text{Risk} = \sum_{\text{all accidents}} (\text{probability of the accident occurring}) \times (\text{expected loss in case of the accident})
For example, if
performing activity X has a probability of 0.01 of suffering an accident of A,
with a loss of 1000, and a probability of 0.000001 of suffering an accident of
type B, with a loss of 2,000,000, then total loss expectancy is 12, which is
equal to a loss of 10 from an accident of type A and 2 from an accident of type
B.
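The two worked examples above can be reproduced in a few lines (a Python sketch; the probabilities and losses are exactly those given in the text):

    # Risk as expected loss: the sum over accident classes of probability x loss.
    def total_risk(accidents):
        return sum(probability * loss for probability, loss in accidents)

    # Activity X with accident type A only, as in the first example.
    print(round(total_risk([(0.01, 1000)]), 6))                       # 10.0

    # Activity X with accident types A and B, as in the second example.
    print(round(total_risk([(0.01, 1000), (0.000001, 2000000)]), 6))  # 12.0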
One of the first major
uses of this concept was for the planning of the Delta Works in 1953, a flood
protection program in the Netherlands, with the aid of the mathematician David
van Dantzig.[29] The kind of risk analysis pioneered there has become common
today in fields like nuclear power, aerospace and the chemical industry.
In statistical decision
theory, the risk function is defined as the expected value of a given loss
function as a function of the decision rule used to make decisions in the face
of uncertainty.
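In symbols (a standard textbook formulation, stated here as an assumption rather than taken from this article's sources), with θ the unknown state of nature, X the data observed under distribution P_θ, δ the decision rule and L the loss function:
R(\theta, \delta) = \mathbb{E}_{X \sim P_{\theta}}\left[ L\left(\theta, \delta(X)\right) \right]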
Fear as intuitive risk assessment
People may rely on their
fear and hesitation to keep them out of the most profoundly unknown
circumstances. Fear is a response to perceived danger. Risk could be said to be
the way we collectively measure and share this "true fear"—a fusion
of rational doubt, irrational fear, and a set of unquantified biases from our
own experience.
The field of behavioral
finance focuses on human risk-aversion, asymmetric regret, and other ways that
human financial behavior varies from what analysts call "rational".
Risk in that case is the degree of uncertainty associated with a return on an
asset. Recognizing and respecting the irrational influences on human decision
making may do much to reduce disasters caused by naive risk assessments that
presume rationality but in fact merely fuse many shared biases.
Anxiety, risk and decision making
Fear, anxiety and risk
While fear is a
fleeting emotion ascribed to a particular object, anxiety is a trait of fear
that lasts longer and is not attributed to a specific stimulus.[30] Studies
show a link between anxious behavior and risk, the chance that an outcome will
have an unfavorable result.[31] Joseph Forgas introduced valence-based research,
in which emotions are grouped as either positive or negative (Lerner and Keltner,
2000). Positive emotions, such as happiness, are believed to produce more
optimistic risk assessments, while negative emotions, such as anger, produce
more pessimistic ones. As an emotion with a negative valence, fear, and
therefore anxiety, has long been associated with negative risk perceptions.
Under the more recent appraisal tendency framework of Jennifer Lerner et al.,
which refutes Forgas’ notion of valence and promotes the idea that specific
emotions have distinctive influences on judgments, fear is still related to
pessimistic expectations.[32]
Psychologists have
demonstrated that increases in anxiety and increases in risk perception are
related and people who are habituated to anxiety experience this awareness of
risk more intensely than normal individuals.[33] In decision-making, anxiety
promotes the use of biases and quick thinking to evaluate risk, an effect
referred to as affect-as-information (Clore, 1983). However, the
accuracy of these risk perceptions when making choices is not known.[34]
Consequences of anxiety
Experimental studies
show that brief surges in anxiety are correlated with surges in general risk
perception.[34] Anxiety exists when the presence of threat is perceived (Maner
and Schmidt, 2006).[33] As risk perception increases, it stays related to the particular
source impacting the mood change as opposed to spreading to unrelated risk
factors.[34] This increased awareness of a threat is overemphasized in people
who are conditioned to anxiety.[35] For example, anxious individuals who are
predisposed to generating reasons for negative results tend to exhibit
pessimism.[35] Also, findings suggest that the perception of a lack of control
and a lower inclination to participate in risky decision-making (across various
behavioral circumstances) are associated with individuals experiencing
relatively high levels of trait anxiety.[33] In the previous instance, there is
supporting clinical research that links emotional evaluation (of control), the
anxiety that is felt and the option of risk avoidance.[33]
Various views hold that anxious emotions cause people to access involuntary
responses and judgments when making decisions that involve risk. Joshua A.
Hemmerich et al. probe deeper into anxiety and its impact on choices by
exploring “risk-as-feelings”: quick, automatic, and natural reactions to danger
that are based on emotions. This notion is supported by an experiment that
engaged physicians in a simulated perilous surgical procedure. It demonstrated
that anxiety about patient outcomes was related to previous regret and worry
and ultimately caused the physicians to be led by their feelings rather than by
the information or guidelines provided during the mock surgery. Additionally,
the way the physicians' emotional levels tracked the simulated patient's status
suggests that the anxiety, and the decision it drives, are specific to the type
of bad outcome.[36] Similarly, another view of anxiety and decision-making is
dispositional anxiety, in which emotional states, or moods, are cognitive and
provide information about future pitfalls and rewards (Maner and Schmidt,
2006). When experiencing anxiety, individuals draw on personal judgments
referred to as pessimistic outcome appraisals. These emotions promote biases
toward risk avoidance in decision-making.[35]
Dread risk
It is common for people
to dread some risks but not others: They tend to be very afraid of epidemic
diseases, nuclear power plant failures, and plane accidents but are relatively
unconcerned about some highly frequent and deadly events, such as traffic
crashes, household accidents, and medical errors. One key distinction of
dreadful risks seems to be their potential for catastrophic consequences,[37]
threatening to kill a large number of people within a short period of time.[38]
For example, immediately after the September 11 attacks, many Americans were
afraid to fly and took their car instead, a decision that led to a significant
increase in the number of fatal crashes in the time period following the 9/11
event compared with the same time period before the attacks.[39][40]
Different hypotheses
have been proposed to explain why people fear dread risks. First, the
psychometric paradigm [37] suggests that high lack of control, high
catastrophic potential, and severe consequences account for the increased risk
perception and anxiety associated with dread risks. Second, because people
estimate the frequency of a risk by recalling instances of its occurrence from
their social circle or the media, they may overvalue relatively rare but
dramatic risks, because such instances are over-represented in what comes to mind, and undervalue frequent, less
dramatic risks.[40] Third, according to the preparedness hypothesis, people are
prone to fear events that have been particularly threatening to survival in
human evolutionary history.[41] Given that in most of human evolutionary
history people lived in relatively small groups, rarely exceeding 100
people,[42] a dread risk, which kills many people at once, could potentially
wipe out one’s whole group. Indeed, research found[43] that people’s fear peaks
for risks killing around 100 people but does not increase if larger groups are
killed. Fourth, fearing dread risks can be an ecologically rational strategy.[44]
Besides killing a large number of people at a single point in time, dread risks
reduce the number of children and young adults who would have potentially
produced offspring. Accordingly, people are more concerned about risks killing
younger, and hence more fertile, groups.[45]
Anxiety and judgmental accuracy
It remains unclear whether
higher levels of risk perception in anxious individuals result in decreased
“judgmental accuracy” (Joseph I. Constans, 2001). It is possible that
“judgmental accuracy” is correlated with heightened anxiety. However, Constans
conducted a study in which college students' anxiety (and worry) about an
upcoming exam was associated with errors in their estimates of their own
performance, i.e. in their risk assessments.[34] Moreover, it is noted that
with high levels of anxiety that are not attributed to anything in particular,
the probability and degree of suffering associated with a negative experience
are misjudged.[33]
Risk in auditing
The audit risk model
expresses the risk of an auditor providing an inappropriate opinion of a
commercial entity's financial statements. It can be analytically expressed as:
AR = IR × CR × DR
where AR is audit risk, IR is inherent risk, CR is control risk, and DR is detection risk.
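As a worked illustration (the component values below are purely hypothetical and not drawn from any auditing standard), the multiplicative model can be evaluated directly:

    # Audit risk model AR = IR x CR x DR; the component assessments are hypothetical.
    inherent_risk = 0.80   # IR: susceptibility of the statements to material misstatement
    control_risk = 0.50    # CR: risk that internal controls fail to catch the misstatement
    detection_risk = 0.10  # DR: risk that audit procedures fail to detect it

    audit_risk = inherent_risk * control_risk * detection_risk
    print(f"audit risk AR = {audit_risk:.2f}")   # 0.04, i.e. a 4% chance of an inappropriate opinion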
Risk versus uncertainty
In his seminal work
Risk, Uncertainty, and Profit, Frank Knight (1921) established the distinction
between risk and uncertainty.
... Uncertainty must be taken in a sense
radically distinct from the familiar notion of Risk, from which it has never
been properly separated. The term "risk," as loosely used in everyday
speech and in economic discussion, really covers two things which, functionally
at least, in their causal relations to the phenomena of economic organization,
are categorically different. ... The essential fact is that "risk"
means in some cases a quantity susceptible of measurement, while at other times
it is something distinctly not of this character; and there are far-reaching
and crucial differences in the bearings of the phenomenon depending on which of
the two is really present and operating. ... It will appear that a measurable
uncertainty, or "risk" proper, as we shall use the term, is so far
different from an unmeasurable one that it is not in effect an uncertainty at
all. We ... accordingly restrict the term "uncertainty" to cases of
the non-quantitive type.[46]
Thus, Knightian
uncertainty is immeasurable, not possible to calculate, while in the Knightian
sense risk is measurable.
Another distinction
between risk and uncertainty is proposed by Douglas Hubbard:[47][48]
Uncertainty: The lack of complete
certainty, that is, the existence of more than one possibility. The
"true" outcome/state/result/value is not known.
Measurement of uncertainty: A set of
probabilities assigned to a set of possibilities. Example: "There is a 60%
chance this market will double in five years"
Risk: A state of uncertainty where some of
the possibilities involve a loss, catastrophe, or other undesirable outcome.
Measurement of risk: A set of possibilities
each with quantified probabilities and quantified losses. Example: "There
is a 40% chance the proposed oil well will be dry with a loss of $12 million in
exploratory drilling costs".
In this sense, one may
have uncertainty without risk but not risk without uncertainty. We can be
uncertain about the winner of a contest, but unless we have some personal stake
in it, we have no risk. If we bet money on the outcome of the contest, then we
have a risk. In both cases there is more than one possible outcome. The measure of
uncertainty refers only to the probabilities assigned to outcomes, while the
measure of risk requires both probabilities for outcomes and losses quantified
for outcomes.
Risk attitude, appetite and tolerance
The terms attitude,
appetite and tolerance are often used similarly to describe an organization's
or individual's attitude towards risk taking. Risk averse, risk neutral and
risk seeking are examples of the terms that may be used to describe a risk
attitude. Risk tolerance looks at acceptable/unacceptable deviations from what
is expected. Risk appetite looks at how much risk one is willing to accept.
There can still be deviations that are within a risk appetite. For example,
recent research finds that insured individuals are significantly more likely to divest
from risky asset holdings in response to a decline in health, controlling for
variables such as income, age, and out-of-pocket medical expenses.[49]
Gambling is a
risk-increasing investment, wherein money on hand is risked for a possible
large return, but with the possibility of losing it all. Purchasing a lottery
ticket is a very risky investment with a high chance of no return and a small
chance of a very high return. In contrast, putting money in a bank at a defined
rate of interest is a risk-averse action that gives a guaranteed return of a
small gain and precludes other investments with possibly higher gain. The
possibility of getting no return on an investment is also known as the rate of
ruin.
Risk as a vector quantity
Hubbard also argues
that defining risk as the product of impact and probability presumes (probably
incorrectly) that the decision makers are risk neutral.[48] Only for a risk
neutral person is the "certain monetary equivalent" exactly equal to
the probability of the loss times the amount of the loss. For example, a risk
neutral person would consider 20% chance of winning $1 million exactly equal to
$200,000 (or a 20% chance of losing $1 million to be exactly equal to losing
$200,000). However, most decision makers are not actually risk neutral and
would not consider these equivalent choices. This gave rise to Prospect theory
and Cumulative prospect theory. Hubbard proposes instead that risk is a kind of
"vector quantity" that does not collapse the probability and magnitude
of a risk by presuming anything about the risk tolerance of the decision maker.
Risks are simply described as a set or function of possible loss amounts, each
associated with a specific probability. This array cannot be collapsed into a
single value until the risk tolerance of the decision maker is quantified.
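A minimal sketch of this idea follows (Python; the loss distribution and the weighting functions are invented for illustration, and this is not Hubbard's own formalism): the risk is kept as explicit (probability, loss) pairs, and collapsing it to one number is deferred until a function standing in for the decision maker's risk tolerance is supplied.

    # Keep risk as an explicit list of (probability, loss) pairs: a "vector quantity".
    risk = [(0.05, 1_000_000), (0.20, 100_000), (0.75, 0)]   # hypothetical loss distribution

    def collapse(risk, weight):
        """Collapse the pairs into one number only via a caller-supplied weighting of
        losses, standing in for the decision maker's (quantified) risk tolerance."""
        return sum(p * weight(loss) for p, loss in risk)

    risk_neutral = collapse(risk, lambda loss: loss)        # plain expected loss, about 70,000
    risk_averse = collapse(risk, lambda loss: loss ** 1.2)  # convex weighting penalizes large losses more
    print(risk_neutral, risk_averse)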
Risk can be both
negative and positive, but people tend to focus on the negative side, because
some risks can endanger their own or someone else's life. Risks concern people
when they believe those risks may harm their future.
List of related books
This is a list of books
about risk issues.
- Acceptable risk – Baruch Fischhoff, Sarah Lichtenstein, Paul Slovic, Steven L. Derby, and Ralph Keeney – 1984
- Against the Gods: The Remarkable Story of Risk – Peter L. Bernstein – 1996
- At risk: Natural hazards, people's vulnerability and disasters – Piers Blaikie, Terry Cannon, Ian Davis, and Ben Wisner – 1994
- Building Safer Communities. Risk Governance, Spatial Planning and Responses to Natural Hazards – Urbano Fra Paleo – 2009
- Dangerous earth: An introduction to geologic hazards – Barbara W. Murck, Brian J. Skinner, Stephen C. Porter – 1998
- Disasters and democracy – Rutherford H. Platt – 1999
- Earth shock: Hurricanes, volcanoes, earthquakes, tornadoes and other forces of nature – W. Andrew Robinson – 1993
- Human System Response to Disaster: An Inventory of Sociological Findings – Thomas E. Drabek – 1986
- Judgment under uncertainty: heuristics and biases – Daniel Kahneman, Paul Slovic, and Amos Tversky – 1982
- Mapping vulnerability: disasters, development, and people – Greg Bankoff, Georg Frerks, and Dorothea Hilhorst – 2004
- Man and Society in Calamity: The Effects of War, Revolution, Famine, Pestilence upon Human Mind, Behavior, Social Organization and Cultural Life – Pitirim Sorokin – 1942
- Mitigation of hazardous comets and asteroids – Michael J.S. Belton, Thomas H. Morgan, Nalin H. Samarasinha, Donald K. Yeomans – 2005
- Natural disaster hotspots: a global risk analysis – Maxx Dilley – 2005
- Natural hazard mitigation: Recasting disaster policy and planning – David Godschalk, Timothy Beatley, Philip Berke, David Brower, and Edward J. Kaiser – 1999
- Natural hazards: Earth's processes as hazards, disasters, and catastrophes – Edward A. Keller, and Robert H. Blodgett – 2006
- Normal accidents. Living with high-risk technologies – Charles Perrow – 1984
- Paying the price: The status and role of insurance against natural disasters in the United States – Howard Kunreuther, and Richard J. Roth – 1998
- Planning for earthquakes: Risks, politics, and policy – Philip R. Berke, and Timothy Beatley – 1992
- Reduction and predictability of natural disasters – John B. Rundle, William Klein, Don L. Turcotte – 1996
- Regions of risk: A geographical introduction to disasters – Kenneth Hewitt – 1997
- Risk analysis: a quantitative guide – David Vose – 2008
- Risk and culture: An essay on the selection of technical and environmental dangers – Mary Douglas, and Aaron Wildavsky – 1982
- Socially Responsible Engineering: Justice in Risk Management (ISBN 978-0-471-78707-5) – Daniel A. Vallero, and P. Aarne Vesilind – 2006
- Swimming with Crocodiles: The Culture of Extreme Drinking – Marjana Martinic and Fiona Measham (eds.) – 2008
- The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA – Diane Vaughan – 1997
- The environment as hazard – Ian Burton, Robert Kates, and Gilbert F. White – 1978
- The social amplification of risk – Nick Pidgeon, Roger E. Kasperson, and Paul Slovic – 2003
- What is a disaster? New answers to old questions – Ronald W. Perry, and Enrico Quarantelli – 2005
- Floods: From Risk to Opportunity (IAHS Red Book Series) – Ali Chavoshian, and Kuniyoshi Takeuchi – 2013
- The Risk Factor: Why Every Organization Needs Big Bets, Bold Characters, and the Occasional Spectacular Failure – Deborah Perry Piscione – 2014
See also
Ambiguity aversion
Benefit shortfall
Civil defense
Countermeasure
Early case assessment
Event chain methodology
Fuel price risk management
Global Risk Forum GRF Davos
Hazard (risk)
Identity resolution
Information assurance
Inherent risk (accounting)
International Risk Governance Council
ISO/PAS 28000
Life-critical system
Loss aversion
Preventive maintenance
Probabilistic risk assessment
Reputational risk
Reliability engineering
Risk analysis
Risk compensation
Risk management
Risk-neutral measure
Risk register
Sampling risk
Vulnerability
References
Hansson, Sven Ove; Edward N. Zalta, editor
(Spring 2014). "Risk". The Stanford Encyclopedia of Philosophy.
Retrieved 9 May 2014.
Oxford English Dictionary
A Guide to the Project Management Body of
Knowledge (4th Edition) ANSI/PMI 99-001-2008
"An Introduction to Factor Analysis of
Information Risk (FAIR)", Risk Management Insight LLC, November 2006;.
Technical Standard Risk Taxonomy ISBN
1-931624-77-1 Document Number: C081 Published by The Open Group, January 2009.
"Risk is a combination of the
likelihood of an occurrence of a hazardous event or exposure(s) and the
severity of injury or ill health that can be caused by the event or
exposure(s)" (OHSAS 18001:2007).
ISO/IEC 27005:2008.
[1].
Rychetnik L, Hawe P, Waters E, Barratt A,
Frommer M (July 2004). "A glossary for evidence based public health".
J Epidemiol Community Health 58 (7): 538–45. doi:10.1136/jech.2003.011585. PMC
1732833. PMID 15194712.
Gurjar, Bhola Ram; Mohan, Manju (2002).
"Environmental Risk Analysis: Problems and Perspectives in Different
Countries". Risk: Health, Safety & Environment 13: 3. Retrieved 23
March 2013.
Cortada, James W. (2003-12-04). The Digital
Hand: How Computers Changed the Work of American Manufacturing, Transportation,
and Retail Industries. USA: Oxford University Press. p. 512. ISBN
0-19-516588-8.
Cortada, James W. (2005-11-03). The Digital
Hand: Volume II: How Computers Changed the Work of American Financial,
Telecommunications, Media, and Entertainment Industries. USA: Oxford University
Press. ISBN 978-0-19-516587-6.
Cortada, James W. (2007-11-06). The Digital
Hand, Vol 3: How Computers Changed the Work of American Public Sector
Industries. USA: Oxford University Press. p. 496. ISBN 978-0-19-516586-9.
44 U.S.C. § 3542(b)(1).
James M. Carson; Elyas Elyasiani; Iqbal
Mansur(December 2008), "Market Risk, Interest Rate Risk, and
Interdependencies in Insurer Stock Returns: A System-GARCH Model", The
Journal of Risk and Insurance, ISSN 0022-4367, 12/2008, Volume 75, Issue 4, pp.
873–891, doi: 10.1111/j.1539-6975.2008.00289.x
A Positive Approach To Risk Requires Person
Centred Thinking, Neill et al, Tizard Learning Disability Review
http://pierprofessional.metapress.com/content/vr700311x66j0125/
John O'Brien cited in Sanderson, H. Lewis,
J. A Practical Guide to Delivering Personalisation; Person Centred Practice in
Health and Social Care p211
Fischer, Michael Daniel;
Ferlie, Ewan (1 January 2013). "Resisting
hybridisation between modes of clinical risk management: Contradiction,
contest, and the production of intractable conflict". Accounting,
Organizations and Society 38 (1): 30–49. doi:10.1016/j.aos.2012.11.002.
Damodaran, Aswath (2003). Investment
Philosophies: Successful Investment Philosophies and the Greatest Investors Who
Made Them Work. Wiley. p. 15. ISBN 0-471-34503-2.
Sapienza P., Zingales L. and Maestripieri
D. 2009. Gender differences in financial risk aversion and career choices are
affected by testosterone. Proceedings of the National Academy of Sciences.
Apicella C. L. et al. Testosterone and
financial risk preferences. Evolution and Human Behavior. Vol 29, Issue 6,
384–390 (abstract).
Julian Talbot and Miles Jakeman Security
Risk Management Body of Knowledge, John Wiley & Sons, 2009.
Amos Tversky / Daniel Kahneman, 1981.
"The Framing of Decisions and the Psychology of Choice."[verification
needed]
Schatz, J., Craft, S., Koby,
M., & DeBaun, M. R. (2004). Asymmetries
in visual-spatial processing following childhood stroke. Neuropsychology, 18,
340–352.
Volberg, G., & Hubner, R. (2004). On
the role of response conflicts and stimulus position for hemispheric
differences in global/local processing: An ERP study. Neuropsychologia, 42,
1805–1813.
Drake, R. A. (2004). Selective potentiation
of proximal processes: Neurobiological mechanisms for spread of activation.
Medical Science Monitor, 10, 231–234.
McElroy, T., & Seta, J. J. (2004). On
the other hand am I rational? Hemisphere activation and the framing effect.
Brain and Cognition, 55, 572–580.
Landsburg, Steven (2003-03-03). "Is
your life worth $10 million?". Everyday Economics (Slate). Retrieved
2008-03-17.
Wired Magazine, Before the levees break,
page 3.
Catherine A. Hartley, Elizabeth A. Phelps,
Anxiety and Decision-Making, Biological Psychiatry, Volume 72, Issue 2, 15 July
2012, pp. 113–118, ISSN 0006-3223, 10.1016/j.biopsych.2011.12.027.
Jon Gertner. What Are We Afraid Of, Money
32.5 (2003): 80.
Jennifer S. Lerner, Dacher
Keltner. Beyond Valence: Toward
A Model of Emotion-Specific Influences on Judgment and Choice. Cognition &
Emotion 14.4 (2000): 473–493.
Jon K. Maner, Norman B. Schmidt, The Role
of Risk Avoidance in Anxiety, Behavior Therapy, Volume 37, Issue 2, June 2006,
pp. 181–189, ISSN 0005-7894, 10.1016/j.beth.2005.11.003.
Joseph I. Constans, Worry propensity and
the perception of risk, Behaviour Research and Therapy, Volume 39, Issue 6,
June 2001, pp. 721–729, ISSN 0005-7967, 10.1016/S0005-7967(00)00037-1.
Jon K. Maner, J. Anthony Richey, Kiara
Cromer, Mike Mallott, Carl W. Lejuez, Thomas E. Joiner, Norman B. Schmidt,
Dispositional anxiety and risk-avoidant decision-making, Personality and
Individual Differences, Volume 42, Issue 4, March 2007, pp. 665–675, ISSN
0191-8869, 10.1016/j.paid.2006.08.016.
Joshua A. Hemmerich, Arthur S. Elstein,
Margaret L. Schwarze, Elizabeth Ghini Moliski, William Dale, Risk as feelings
in the effect of patient outcomes on physicians' future treatment decisions: A
randomized trial and manipulation validation, Social Science & Medicine, Volume
75, Issue 2, July 2012, pp. 367–376, ISSN 0277-9536,
10.1016/j.socscimed.2012.03.020.
Slovic P (1987) Perception of risk. Science
236:280−285.
Gigerenzer G (2004) Dread risk, September
11, and fatal traffic accidents. Psych Sci 15:286−287.
Gaissmaier, W., & Gigerenzer, G.
(2012). 9/11, Act II: A fine-grained analysis of regional variations in traffic
fatalities in the aftermath of the terrorist attacks. Psychological Science,
23, 1449–1454.
Lichtenstein S, Slovic P, Fischhoff B,
Layman M, Combs B (1978) Judged frequency of lethal events. J Exp Psych HLM
4:551–578.
Öhman A, Mineka S (2001) Fears, phobias,
and preparedness: Toward an evolved module of fear and fear learning. Psychol
Rev 108:483–522.
Hill KR, Walker RS, Bozicevic M, Eder J,
Headland T et al. (2011) Co-residence patterns in hunter-gatherer societies
show unique human social structure. Science 331:1286–1289.
Galesic M, Garcia-Retamero, R (2012) The
risks we dread: A social circle account. PLoS ONE 7(4): e32837.
Bodemer, N., Ruggeri, A., & Galesic, M.
(2013). When dread risks are more dreadful than continuous risks: Comparing
cumulative population losses over time. PLoS One, 8, e66544.
Wang XT (1996) Evolutionary hypotheses of
risk-sensitive choice: Age differences and perspective change. Ethol Sociobiol
17:1–15.
Frank Hyneman Knight "Risk,
uncertainty and profit" pg. 19, Hart, Schaffner, and Marx Prize Essays,
no. 31. Boston and New York: Houghton Mifflin. 1921.
Douglas Hubbard "How to Measure
Anything: Finding the Value of Intangibles in Business" pg. 46, John Wiley
& Sons, 2007.
Douglas Hubbard "The Failure of Risk
Management: Why It's Broken and How to Fix It, John Wiley & Sons, 2009.
Federal Reserve Bank of Chicago, Health and
the Savings of Insured versus Uninsured, Working-Age Households in the U.S.,
November 2009
Bibliography
Referred literature
James Franklin, 2001: The Science of
Conjecture: Evidence and Probability Before Pascal, Baltimore: Johns Hopkins
University Press.
John Handmer and Paul James (2005).
"Trust Us and Be Scared: The Changing Nature of Risk". Global
Society. vol. 21 (no. 1): 119–30.
Niklas Luhmann, 1996: Modern Society
Shocked by its Risks (= University of Hong Kong, Department of Sociology
Occasional Papers 17), Hong Kong, available via HKU Scholars HUB
Books
Historian David A. Moss' book When All Else
Fails explains the U.S. government's historical role as risk manager of last
resort.
Peter L. Bernstein. Against the Gods ISBN
0-471-29563-9. Risk explained and its appreciation by man traced from earliest
times through all the major figures of their ages in mathematical circles.
Rescher, Nicholas (1983). A Philosophical
Introduction to the Theory of Risk Evaluation and Measurement. University Press
of America.
Porteous, Bruce T.; Pradip Tapadar
(December 2005). Economic Capital and Financial Risk Management for Financial
Services Firms and Conglomerates. Palgrave Macmillan. ISBN 1-4039-3608-0.
Tom Kendrick (2003). Identifying and
Managing Project Risk: Essential Tools for Failure-Proofing Your Project.
AMACOM/American Management Association. ISBN 978-0-8144-0761-5.
David Hillson (2007). Practical Project
Risk Management: The Atom Methodology. Management Concepts. ISBN
978-1-56726-202-5.
Kim Heldman (2005). Project Manager's
Spotlight on Risk Management. Jossey-Bass. ISBN 978-0-7821-4411-6.
Dirk Proske (2008). Catalogue of risks –
Natural, Technical, Social and Health Risks. Springer. ISBN 978-3-540-79554-4.
Gardner, Dan, Risk: The Science and
Politics of Fear, Random House, Inc., 2008. ISBN 0-7710-3299-4.
Hopkin, Paul "Fundamentals of Risk
Management 2nd Edition" Kogan-Page (2012) ISBN 978-0-7494-6539-1
Articles and papers
Clark, L., Manes, F., Antoun, N., Sahakian,
B. J., & Robbins, T. W. (2003). "The contributions of lesion
laterality and lesion volume to decision-making impairment following frontal
lobe damage." Neuropsychologia,
41, 1474–1483.
Cokely, E. T., Galesic, M., Schulz, E.,
Ghazal, S., & Garcia-Retamero, R. (2012). Measuring risk literacy: The Berlin Numeracy Test.
Judgment and Decision Making, 7, 25–47.
Drake, R. A. (1985). "Decision making
and risk taking: Neurological manipulation with a proposed consistency
mediation." Contemporary Social Psychology, 11, 149–152.
Drake, R. A. (1985). "Lateral
asymmetry of risky recommendations." Personality and Social Psychology
Bulletin, 11, 409–417.
Gregory, Kent J., Bibbo, Giovanni and
Pattison, John E. (2005), "A Standard Approach to Measurement
Uncertainties for Scientists and Engineers in Medicine", Australasian
Physical and Engineering Sciences in Medicine 28(2):131–139.
Hansson, Sven Ove. (2007).
"Risk", The Stanford Encyclopedia of Philosophy (Summer 2007
Edition), Edward N. Zalta (ed.), forthcoming [2].
Holton, Glyn A. (2004). "Defining
Risk", Financial Analysts Journal, 60 (6), 19–25. A paper exploring the
foundations of risk. (PDF file).
Knight, F. H. (1921) Risk, Uncertainty and
Profit, Chicago: Houghton Mifflin Company. (Cited at: [3], § I.I.26.).
Kruger, Daniel J., Wang, X.T., & Wilke,
Andreas (2007) "Towards the development of an evolutionarily valid
domain-specific risk-taking scale" Evolutionary Psychology (PDF file).
Metzner-Szigeth, A. (2009).
"Contradictory Approaches? – On Realism and Constructivism in the Social
Sciences Research on Risk, Technology and the Environment." Futures, Vol.
41, No. 2, March 2009, pp. 156–170 (fulltext journal: [4]) (free preprint:
[5]).
Miller, L. (1985). "Cognitive risk
taking after frontal or temporal lobectomy I. The synthesis of fragmented
visual information." Neuropsychologia, 23, 359–369.
Miller, L., & Milner, B. (1985).
"Cognitive risk taking after frontal or temporal lobectomy II. The
synthesis of phonemic and semantic information." Neuropsychologia, 23,
371–379.
Neill, M. Allen, J. Woodhead, N. Reid, S.
Irwin, L. Sanderson, H. 2008 "A Positive Approach to Risk Requires Person
Centred Thinking" London, CSIP Personalisation Network, Department of
Health. Available from:
http://networks.csip.org.uk/Personalisation/Topics/Browse/Risk/ [Accessed 21
July 2008].
Wildavsky, Aaron; Wildavsky, Adam (2008).
"Risk and Safety". In David R. Henderson. Concise Encyclopedia of
Economics (2nd ed.). Indianapolis: Library of Economics and Liberty. ISBN
978-0865976658. OCLC 237794267.