
Risk Savvy: How to Make Good Decisions

Financial Literacy Office

· Investor Bookshelf

Hello — welcome to Investor’s Bookshelf. Today I’m going to explain Risk Savvy: How to Make Good Decisions, and in about seventeen minutes I’ll walk you through the book’s core content: what risk really is, how to understand risk correctly, how to sense risk, and how to respond to it.

The author of this book is Gerd Gigerenzer, a well-known German psychologist and scholar of management and decision making. His most notable achievements lie in the study of how humans make judgments and decisions; his research on risk and intuition is essentially unrivaled, and most of his writings focus on this area. Risk Savvy: How to Make Good Decisions is one of Gigerenzer’s representative works. Gigerenzer is a master storyteller: in this book he uses a series of stories to show us how to recognize risk correctly, how to communicate about risk effectively, and how to respond to it appropriately.

Now let’s get to the main topic.

Volcanic eruptions, the subprime mortgage crisis, the spread of mad cow disease — crises keep happening in this world, but beyond panic, anxiety, and some hindsight “summaries and precautions” after the crisis has passed, we seem to have few better responses. For example, after the U.S. subprime crisis the remedy proposed was stronger regulation, and after 9/11 the solution was to create the Department of Homeland Security.

Is this tactic of chasing after risks really effective? The author calls these crude approaches "paternalism": like a father deciding for a child, experts tell everyone uniformly what to do. The results of this method are often disappointing, so the author argues that we must take control of our own lives and health and strengthen our personal ability to cope with risk and uncertainty.

So how can we do that? The author offers five steps: understand what risk is; understand why risks cannot be predicted; learn risk communication; learn the two most commonly used ways we cope with risk — defensive choices and rules of thumb (heuristics); and understand the respective application scenarios and limitations of each.

Part One

Let’s begin with the basics and take a look at what risk actually means.

The author divides risk into known risks and unknown risks; the risks we usually talk about are actually known risks, while there are many more unknown risks besides.

A known risk is like a slot machine in a casino: play enough times and the probabilities of winning and losing can be calculated. Or like a lottery: the odds of winning the ten-million jackpot are fixed in advance, and buying a ticket simply tests that probability. The possibility of losing money when you play slot machines or buy lottery tickets is what we call a known risk.

But those risks that do not have fixed probabilities — for example, whom should I marry? Which stock should I buy? Will a big earthquake happen or not? Should I trust my boss, colleague, or business partner? — these are not things you can compute by calculation; these are unknown risks, i.e. genuine uncertainty.

So, can unknown risks be predicted?

To be frank, the author does not say they are absolutely impossible to predict, but the probability of making a successful prediction is extremely low. Many top figures in various fields have made highly unreliable predictions; for example, the founder of Daimler once predicted that the total number of cars in the world could never exceed one million because he thought there would not be enough chauffeurs — a forecast we now see was wildly off.

An even older example: more than a century ago, when Bell invented the telephone, the largest telegraph company in the United States dismissed it as impractical and said such a complex technology could never be adopted on a large scale; a British company at the time went further and joked that they did not need telephones because they already had plenty of postmen to deliver letters.

Why are predictions so unreliable? Because predictions must rest on assumptions, and those assumptions are constrained by the predictor's contemporary environment. No matter how smart or learned a person is, it is very difficult to make judgments that transcend the limits of one's own time and place, which is why so many authoritative predictions later become the butt of jokes.

Therefore, to correctly understand risk we must abandon the pursuit of certainty. If you are hoping to guard against risk by prediction, you should break that habit — the success rate is simply far too low. Since risks are everywhere and almost impossible to predict, how then should we respond to risk? That brings us to a concept: risk communication.

This term is not very common, but the author says it is a skill everyone should master. Risk communication means describing the concept of risk from two professional dimensions.

First: when describing risk, use frequencies as much as possible rather than single-event probabilities.

For example, in weather forecasts we often hear, “There is a 30% chance of precipitation tomorrow.” But think carefully — do you really understand what that sentence means? What exactly does 30% refer to? You may not be able to answer correctly. The true meaning is: weather conditions like tomorrow’s — including temperature, pressure, wind, etc. — have occurred many times in the past, and according to statistics, on days with those conditions it rained 30 times out of 100. So the so-called probability of precipitation here is really a frequency — the frequency of rainy days; 30% means 30 out of 100 similar days had rain. Once we make this clear, our understanding and description of risk become much more accurate.
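As a rough sketch of this reframing (my own illustration, not from the book), a single-event probability can be mechanically restated as a natural frequency over an explicit reference class:

```python
# Restate a single-event probability as a natural frequency.
# The helper and its wording are illustrative assumptions, not the book's.
def as_frequency(probability: float, reference_class: str, event: str) -> str:
    count = round(probability * 100)  # how many out of 100 comparable cases
    return f"on {count} out of 100 {reference_class}, {event}"

# "There is a 30% chance of precipitation tomorrow" becomes:
print(as_frequency(0.30, "days with conditions like tomorrow's", "it rained"))
```

The value of the frequency form is that it forces the reference class ("days like tomorrow") to be stated explicitly, which is exactly where the ambiguity of a bare "30%" hides.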

Second: when describing risk, use absolute risk as much as possible and avoid relative risk.

Let’s illustrate with a story: once the UK Committee on Safety of Medicines issued a warning that a new generation of oral contraceptives doubled the risk of thrombosis in women — in other words, a 100% increase. That 100% figure sounded terrifying, as if taking the pill would mean a 100% chance of disease. The news caused an uproar; many women stopped taking contraceptives, and rates of unintended pregnancy and abortions rose sharply. But what did that 100% actually mean?

First, the comparison was with an older generation of contraceptives: among every 7,000 women taking the older pill, roughly one would develop a thrombosis; a 100% increase in risk does not mean “increased to 100%,” it means the risk doubled compared with the old pill. In other words, among 7,000 women taking the new pill, perhaps two would develop thrombosis — that’s all.

So the panic arose because people mistook a relative risk increase for an absolute risk. When judging risk, always ask for the absolute risk, not the relative risk, because relative-risk figures can be very large and misleading. Presenting absolute risks gives a clearer, more intuitive sense of how large the danger actually is, and helps us avoid being misled.
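The arithmetic behind the contraceptive story can be checked in a few lines. The 1-in-7,000 baseline is the figure quoted above; everything else follows from it:

```python
# Absolute vs. relative risk, using the rough numbers from the story above.
baseline = 1 / 7000           # absolute risk of thrombosis on the older pill
relative_increase = 1.0       # the headline "100% increase" (i.e. doubled)

new_risk = baseline * (1 + relative_increase)   # about 2 in 7,000
absolute_increase = new_risk - baseline         # about 0.014 percentage points

print(f"relative increase: {relative_increase:.0%}")   # sounds alarming
print(f"absolute increase: {absolute_increase:.4%}")   # barely visible
```

The same underlying change can be reported as "100%" or as roughly "0.014 percentage points"; only the second number tells you how large the danger actually is.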

Remember: prefer frequencies over abstract probabilities, and absolute risks over relative risks — that is what risk communication means: precise understanding and expression of risk. Incidentally, this is also a rule of thumb when reading statistics: many claims that say they “speak with data” are actually telling only half the story and can be highly misleading. When faced with statistical figures, do two things: first, look for the fuller picture — are the numbers absolute or relative? — and second, consider how the provider’s position or agenda might have influenced the data.

Part Two

Above we discussed how to understand risk and how to carry out risk communication; next we will look at how we should make decisions when facing risk, and what a “defensive choice” is.

A defensive choice refers to the tendency, when confronted with risk, to adopt conservative, defensive strategies; this tendency springs from our innate instincts to avoid mistakes and danger and from our herd instinct — on important decisions we often fear being different from others, fear becoming an outcast, and fear isolation.

It is precisely these innate human factors that give rise to defensive decision-making.

For example, in a certain situation option A might actually be the optimal choice, yet people often end up choosing option B: B may not best solve the problem, but it is the option that best protects the decision-maker.

Take a hospital example: a doctor treating a patient will often practice “defensive medicine,” because the doctor worries that if they do not order lots of tests, prescribe many drugs, or even perform surgery, the patient or the family will think the doctor is neglecting the case — that the doctor is not being proactive — and might even sue.

In fact, the harms of overtreatment can be substantial, though its serious consequences may not be obvious at first and often only become apparent over time. For the doctor, however, defensive treatment wards off the troubles mentioned above.

Some statistics cited in the book say that as many as 2.5 million surgeries in the United States each year are actually unnecessary, and that 93% of doctors practice defensive medicine to some extent.

Defensive decisions can be further divided into passive defense and active defense.

Using the doctor example again, overtreatment is active defense, while passive defense would be to avoid treating high-risk patients as much as possible, to avoid performing high-risk surgeries, or to avoid becoming an obstetrician.

So when facing risk, neither active nor passive defensive decisions are optimal.

What, then, is a more effective rule for coping with risk?

It is actually simple: rely on our intuition and experience.

There was a very popular film called Sully, starring Tom Hanks. The basic plot: a passenger plane strikes a flock of birds and both engines fail.

At that moment, either returning to the airport or attempting an emergency landing elsewhere carried great risk and could have had catastrophic consequences. The film's protagonist, Captain Sully, chose the risky option of ditching the plane on the river, and ultimately everyone survived.

That story is based on a true event and is considered one of aviation’s great feats; the author also mentions this case in the book.

In that urgent situation, how did Captain Sully make his decision?

He did not compute a precise choice from flight speed, wind, altitude, and distance data; instead, he applied a simple heuristic — drawing on decades of accumulated flying experience — and decisively chose to ditch on the water.

Afterwards, many questioned Sully, and computer simulations suggested that, given the data, it would have been possible to return to the departure airport, so ditching in the river seemed unnecessary.

Later, Sully demonstrated that those simulations had left out the human factor: the human brain cannot instantly process so many precise data points, so as the pilot in that moment he had no real alternative but to rely on his deep flying experience.

That reliance on experience was the key to his success.

Smaller examples prove the same point: excellent baseball players typically do not know higher mathematics and do not draw parabolas to calculate a ball’s trajectory, yet that does not stop them from making perfect, intuitive judgments.

They catch extremely difficult, high-arching fly balls not by calculation but by rich athletic experience.

Captain Sully and baseball players are both using heuristics; in the former the experience is mainly built from training, while in the latter it is often a subconscious reaction — that is, intuition.

In fact, intuition is a form of heuristic, and it has three features: first, intuition is faster than conscious deliberation and usually arises immediately; second, we often do not know why we have that feeling; third, the feeling is strong enough to prompt immediate action.

Why are heuristics and intuition good?

Because people’s common sense and knowledge are incomplete, and especially when people possess some knowledge without having organized it consciously, that knowledge can actually confuse their judgment.

Conversely, maintaining a moderate “benign ignorance” and actively using the experience you do have to explore options can significantly increase the probability of choosing correctly.

This “benign ignorance” is not pejorative; it refers to simplicity and simplification.

Einstein said, “Make things as simple as possible, but no simpler,” which gets at the same idea.

Speaking of which, let’s look at another example.

“Don’t put all your eggs in one basket” is a well-known investment maxim, and it is connected to the work of Nobel laureate Harry Markowitz. Markowitz used sophisticated formulas to describe a portfolio that maximizes return while minimizing risk: the famous mean–variance model, an achievement that helped earn him the Nobel Prize.

Many assume that Markowitz himself invested using equally complex methods, but in retirement he did something very simple: he divided his money into N equal parts and used each part to buy different stocks.

Note: it was simply an equal split — just a division.

In practice, this simple approach produced returns that outperformed many professionally managed portfolios and even outperformed the mean–variance portfolios proposed by specialists.
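A minimal sketch of the 1/N rule described above; the tickers and amount are made-up placeholders, and real investing would of course involve brokerage details the book doesn't cover:

```python
# Equal-weight ("1/N") allocation: split capital evenly across N assets.
# Asset names and the capital figure below are hypothetical.
def equal_weight(capital: float, assets: list[str]) -> dict[str, float]:
    per_asset = capital / len(assets)
    return {asset: per_asset for asset in assets}

# Hypothetical example: 70,000 split across 7 holdings, 10,000 each.
portfolio = equal_weight(70_000, ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF", "GGG"])
```

The whole "model" fits in one line of arithmetic, which is precisely the point of the Markowitz anecdote: the simple rule needs no parameter estimates, so it has nothing to misestimate.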

In stock-picking contests run by reputable research institutions and the media, contestants who invest by this simple, intuition-based equal-allocation approach often stand out.

The author argues that ordinary investors who follow a simple informed principle — investing only in what they understand and making choices based on intuition — can match the performance of financial experts, top mutual funds, and broad market indices.

This illustrates a simple truth: less can be more — more information is not always better, nor are more choices always better.

But striving for simplicity does not mean that every problem should be handled crudely; simplification is not the same as being simplistic.

For example, the mean–variance model is after all a Nobel-prize-winning contribution, and it is most useful when you need to analyze a large number of parameters based on historical information.

The author gives three criteria for deciding whether to use heuristics and intuition or detailed calculation: first, the higher the uncertainty — in other words, the larger the unknown risk — the more you should rely on intuition; second, the more alternative options there are, the greater the chance that risk calculations will fail, and the more you should use intuition; third, the less historical reference data there is, the more you should rely on intuition.

If conditions do not meet these three criteria, then you should use intuition less.
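The three criteria can be caricatured as a toy checklist. The numeric thresholds below are my own illustrative choices, not numbers from the book:

```python
# Toy checklist for the three criteria above (cutoffs are invented
# for illustration; the book gives no numeric thresholds).
def prefer_intuition(uncertainty_high: bool, n_options: int, n_history: int) -> bool:
    votes = 0
    votes += uncertainty_high   # 1: large unknown risk
    votes += n_options > 10     # 2: many alternatives, so calculations fail more
    votes += n_history < 30     # 3: little historical reference data
    return votes >= 2           # lean on intuition if most criteria hold

prefer_intuition(True, n_options=50, n_history=5)     # a "whom to marry" kind of problem
prefer_intuition(False, n_options=3, n_history=1000)  # a slot-machine kind of problem
```

The first call describes genuine uncertainty, where the checklist favors intuition; the second describes a well-measured, known risk, where calculation is the better tool.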

Returning to the doctor example: how can we get the most correct, best treatment?

The answer is simple: don’t ask the doctor, “What would you advise me to do?”

Instead ask, “If this were your own situation — or your mother’s, your brother’s, or your child’s — what would you do?”

Because a doctor’s own experience and working environment give them very sensitive intuition and excellent heuristic judgment, asking them to decide from a personal perspective is often more effective and direct than asking them to draw conclusions for others. It strips away complex, distracting factors, helps avoid the risk of overtreatment, and yields the best solution.

Summary

All right — that’s the content of the book; let’s summarize.

First, we must recognize that risk is a broad concept that includes known risks and unknown risks. The former can be quantified, while the latter should not be chased with futile attempts at prediction, because the world does not provide definite answers to every question.

Second, we must learn risk communication, starting from correctly understanding and accurately describing risk. The effective way to describe risk is “frequency” rather than abstract single-event probability, and we should pay attention to “absolute risk” rather than “relative risk.”

Then we examined the strategy people most commonly use when facing risk, defensive decision-making, and found that overall it has more drawbacks than advantages.

Finally, the author explores the most efficient way to decide under risk: rules of thumb, explaining the principle by which experience and intuition operate, namely the principle of simplicity. In most situations where conditions do not permit, or there is no time for, elaborate calculation, experience and intuition often become effective tools for combating risk.

Of course, be mindful of the limits of experience and intuition; they are not a cure-all, which is why we ended with the situations in which they are and are not appropriate.
