Fooled by Randomness – Nassim Nicholas Taleb

The most important learning from this book: Nobody accepts randomness in their own success, only in their failure … and the majority of outcomes are largely random. People overvalue their knowledge and underestimate the probability of being wrong or lucky. Taleb does not claim success is only a product of luck; he simply shows how luck and randomness have a far greater impact than anyone takes into account.

That which came with the help of luck could be taken away by luck.

Things that come with little help from luck are more resistant to randomness.

Working hard will generally result in drowning in randomness.

We need to work smart first, because randomness is impossible to completely eliminate. Anything can happen, and given enough time it often does: one can be out of a job or lose all their money because of an unforeseen event, i.e. the Black Swan.

Behavioral scientists believe people become leaders not from the skills they possess, but from the superficial impression they make on others through physical signals: i.e. Charisma.

When choosing a profession, we should consider the average performance of those who enter it.

Warren Buffett has an explanation for why he is not worried about competing against HFT or activist investors, “They are playing a different game.” Similarly, Tony Hsieh, CEO of Zappos, believes deciding what table you sit at is the most important business decision one can make.

The idea of taking into account both the observed and unobserved possible outcomes sounds like lunacy. For most people, probability is about what may happen in the future, not events in the observed past; an event that has already taken place has 100% probability, i.e. certainty.

Performance is to be judged by the costs of the alternative, not of the results.

A good decision does not guarantee a positive outcome; a desired outcome is never assured by the inputs alone. When one is achieved, judgment should weigh what else could have happened, for the unobserved outcomes include the worst-case scenario. These are called ‘alternative histories.’

Denigration example:

Gamblers, investors, and decision-makers feel that the sorts of things that happen to others would not necessarily happen to them.

A broad scientific education (specializing in the general) is impossible for someone with a desire to specialize.

Without specialty, thorough understanding is unachievable.

MBAs are trained to simplify matters a couple of steps beyond what is warranted.

What sounds intelligent in a conversation or a meeting, or particularly, in the media, is suspicious.

Any reading of the history of science shows that almost all the smart things later proven by science appeared like lunacies when they were first discovered.

We fail to learn that our emotional reactions to past experiences were short lived.

We benchmark present experiences off our past ones, irrationally thinking the past was more/less impactful than the present currently is. EX: thinking a purchase will provide long-lasting, possibly permanent happiness, or that a setback will cause prolonged distress, even when a previous setback only resulted in a hiccup.

Hindsight bias: Past events will always look less random than they were.

When you look at the past, it will always appear deterministic, since only one single observation took place.

People often think that it will surely be the next batch of news that will really make a difference to their understanding of things.

A mistake is not something to be determined after the fact, but in light of the information available up to that point.

Those who are very good at predicting the past will think of themselves as good at predicting the future.

Mathematically, progress means that some new information is better than past information, not that the average of new information will supplant past information, which means that it is optimal for someone, when in doubt, to systematically reject the new idea, information, or method.

A positive result does not even out a negative result.

It is estimated that the negative effect of an average loss is 2.5x that of an equivalent gain.
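The 2.5x figure can be turned into a quick arithmetic sketch. The loss weight and the payoffs below are illustrative assumptions, not data from the book:

```python
# Loss-aversion sketch: a fair coin flip paying +$100 or -$100 has zero
# expected monetary value, but feels negative if losses weigh 2.5x as
# much as equivalent gains (an assumed, illustrative coefficient).

LOSS_WEIGHT = 2.5

def felt_value(outcomes):
    """Probability-weighted 'felt' value, with losses scaled by LOSS_WEIGHT."""
    total = 0.0
    for probability, amount in outcomes:
        weight = LOSS_WEIGHT if amount < 0 else 1.0
        total += probability * weight * amount
    return total

fair_flip = [(0.5, 100), (0.5, -100)]
print(felt_value(fair_flip))  # -75.0: fair in dollars, negative in feeling
```

This is why a string of alternating wins and losses feels like a net loss even when the account balance is flat.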

Wealth does not matter so much to one’s well-being as the route one takes to get it. The road from $16M to $1M is not as pleasant as the one from $0 to $1M.

Turing test: a computer is said to be intelligent if it can, on average, fool a human into mistaking it for another human.

The frequency or probability of a loss – in and by itself – is irrelevant.

A loss needs to be judged in connection with the magnitude of the outcome. EX: If surgery has a 1% chance of failing and resulting in death, it should not be judged simply as a 1/100 frequency; the outcome is death, which carries far more weight than the raw frequency suggests.

Skewness must be considered as well … the small chance of an unexpected event.

The majority of economics and financial gurus on TV are entertainers, nothing more.

There are certain individuals who are repeatedly in the news citing a “major correction in the stock market” – only to declare their knowledge when an event happens, and ignore their previous remarks when the event fails to occur.

It is not how likely an event is to happen that matters, it is how much is made when it happens that should be the consideration. How frequently one profits is irrelevant; it is the magnitude of the outcome that counts.
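This frequency-versus-magnitude point reduces to an expected-value calculation. A minimal sketch with made-up probabilities and payoffs:

```python
# Frequency vs. magnitude with made-up numbers: a bet that wins 90% of
# the time can still lose money, and one that loses 90% of the time can
# still make money, once the size of the rare outcome is counted.

def expected_value(outcomes):
    """Sum of probability * payoff over all (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

frequent_winner = [(0.9, 1), (0.1, -50)]  # wins often, rare loss is huge
rare_winner = [(0.9, -1), (0.1, 50)]      # loses often, rare win is huge

print(expected_value(frequent_winner))  # negative despite winning 90% of bets
print(expected_value(rare_winner))      # positive despite losing 90% of bets
```

This is the arithmetic behind Taleb's own trading posture: bleed small, frequent losses while positioned for the rare, outsized event.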

We do not learn much from shallow, recent history. History teaches us that the things that have never happened before will likely occur sometime in the future.

We read too much into recent history, making statements like, “This has never happened before.” While true, little of what could happen has happened within the span of the agricultural, industrial, and technological revolutions. Taleb notes the housing market crash had never happened before, which is exactly the point: we do not learn from shallow, recent history. Taking the longer view, precisely because the housing market had never crashed before, there was a high probability it would crash sometime in the future.

Rare events are always unexpected, otherwise they would not occur.

As humans, we act according to our knowledge, which integrates past data with current events and future speculation.

The Black Swan: No amount of observation of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.

Pascal’s Wager: The optimal strategy for humans is to believe in the existence of God. For if God exists, then the believer would be rewarded. If he does not exist, the believer would have nothing to lose.
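The wager is a dominance argument over a payoff table. A sketch, where the infinite reward is modeled with `float("inf")` and the “nothing to lose” cases as zero (assumed stand-ins for the sake of illustration):

```python
# Pascal's wager as a payoff table (an illustrative sketch; the infinite
# reward is modeled with float("inf"), the neutral cases as 0).

REWARD = float("inf")  # payoff if God exists and you believe
NOTHING = 0.0          # assumed payoff in all other cases

payoffs = {
    ("believe", "exists"): REWARD,
    ("believe", "absent"): NOTHING,     # believer loses nothing
    ("disbelieve", "exists"): NOTHING,  # at best, forgoes the reward
    ("disbelieve", "absent"): NOTHING,
}

def expected_payoff(choice, p_exists):
    """Probability-weighted payoff of a choice given P(God exists)."""
    return (p_exists * payoffs[(choice, "exists")]
            + (1 - p_exists) * payoffs[(choice, "absent")])

# For any nonzero probability that God exists, belief dominates.
for p in (0.5, 0.01, 1e-9):
    assert expected_payoff("believe", p) >= expected_payoff("disbelieve", p)
```

The structure mirrors Taleb's own asymmetric-bet reasoning: when one outcome is unbounded, its probability barely matters.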

Past performance almost always creates the perception of guaranteed future performance.

I [Taleb] do not deny that if someone performed better than the crowd in the past, there is a presumption of his ability to do better in the future. But the presumption might be weak, very weak, to the point of being useless in decision making. Why? Because it all depends on two factors: the randomness content of his profession and the number of monkeys in operation.

Survivorship Bias: We see only winners and get a distorted view of the odds, the fact that luck is most frequently the reason for extreme success, the biological handicap of our inability to understand probability.

We fall for this because we are trained to take advantage of the information that is lying in front of our eyes, ignoring the information that we do not see.
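The bias can be illustrated with a quick simulation in the spirit of Taleb's coin-flipping example; the population size, horizon, and 50/50 odds are assumptions for the sketch:

```python
# Survivorship-bias sketch, in the spirit of Taleb's coin-flipping
# example. The population size, horizon, and 50/50 odds are assumptions.

import random

def perfect_records(managers=10_000, years=10, seed=42):
    """Count managers who 'beat the market' every year on pure coin flips."""
    rng = random.Random(seed)
    survivors = 0
    for _ in range(managers):
        if all(rng.random() < 0.5 for _ in range(years)):
            survivors += 1
    return survivors

# 10,000 * 0.5**10 is about 10: we should EXPECT a handful of flawless
# ten-year track records from luck alone, and those are the ones we see.
print(perfect_records())
```

The failures vanish from the sample, so the handful of lucky survivors looks like proof of skill.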

Emotions are overrated.

We do not use our rational brain outside of the classroom and outside of political decisions. Good, enlightened advice does not stick with us for a long time. Self-help books are largely ineffective. To learn, we must work with our human nature, not try to change it. Thinking positive is not a solution.

Optimism is said to be predictive of success. Predictive? It can also be predictive of failure. Optimistic people certainly take more risks as they are overconfident about the odds; those who win show up among the rich and famous, while the others fail and disappear from the analyses, sadly.

Adverse Selection Example: Judging an investment that comes to you requires more stringent standards than judging an investment you seek out. Why would anybody advertise an investment unless it happened to outperform the market? An investment that comes to you has a high probability of owing its success entirely to randomness.

Taleb does not know what makes one lucky or unlucky.

“I am unable to answer the question of who’s lucky or unlucky. People frequently misrepresent my opinion. I never said that every rich man is an idiot and every successful person unlucky, only that in absence of much additional information it is preferable to reserve one’s judgement. It is safer.”

Chaos Theory: Concerns itself primarily with functions in which a small input can lead to a disproportionate response.
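A standard textbook illustration of this (not from the book) is the logistic map at its chaotic parameter, where two nearly identical inputs produce wildly different outputs:

```python
# Chaos-theory illustration (a standard textbook example, not from the
# book): the logistic map x -> r*x*(1-x) at r=4, where a tiny change in
# the input produces a disproportionate change in the output.

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.200000)
b = logistic_orbit(0.200001)  # starting point shifted by one millionth

# The trajectories start indistinguishable, then diverge completely.
print(abs(a[0] - b[0]), abs(a[-1] - b[-1]))
```

After a few dozen iterations the millionth-of-a-unit difference has been amplified until the two trajectories bear no resemblance to each other.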

Anchoring: Estimating an unknown quantity by starting from a given reference point, so the estimate is dragged toward the anchor.

Wealth itself does not really make one happy, but positive changes in wealth may, especially if they come as steady increases.

Availability Heuristic: The practice of estimating the frequency of an event according to the ease with which instances of the event can be recalled.

Representativeness Heuristic Example: Gauging the probability that a person belongs to a particular social group by assessing how similar the person’s characteristics are to the ‘typical’ group member’s.

Simulation Heuristic: The ease of mentally undoing an event and playing out the alternative scenario.

Affect Heuristic: The emotions elicited by an event determine its perceived probability.

Activities in the mind fall into two systems of reasoning:

System 1: effortless, automatic, associative, rapid, parallel process, opaque, emotional, concrete, specific, social, and personalized

We do not think when making choices; instead we use heuristics.

System 2: effortful, controlled, deductive, slow, serial, self-aware, neutral, abstract, sets, asocial, and depersonalized

We make serious probabilistic mistakes in today’s world, whatever the true reason.

Wittgenstein’s Ruler: Unless you have confidence in the ruler’s reliability, if you use a ruler to measure a table you may also be using the table to measure the ruler.

Taleb: “One of the most irritating conversations I’ve had is with people who lecture me on how I should behave. Most of us know pretty well how we should behave. It is the execution that is the problem, not the absence of knowledge.” As Derek Sivers says, “If information was the answer, then we’d all be billionaires with perfect abs.”

To a skeptic, nothing can be accepted with certainty.

Conclusions should be formed through various probabilities and used as a guide to conduct. Every day should be viewed as a clean slate.

Attribution Bias: A cognitive bias that refers to the systematic errors made when people evaluate or try to find reasons for their own and others’ behaviors. People constantly make attributions regarding the cause of their own and others’ behaviors; however, attributions do not always accurately reflect reality.

There is substantial discrepancy between the objective record of people’s success in prediction tasks and the sincere belief of these people about the quality of their performance. The attribution bias has another effect: it gives people the illusion of being better at what they do, which explains the findings that 80 – 90% of people think they are above the average (and the median) in many things.

No matter how sophisticated our choices, how good we are at dominating the odds, randomness will have the last word.

Try not to blame others for your outcomes even when they were part of the reason. Do not exhibit self-pity. And do not complain. It is all random…

Note: some of these blocks are directly lifted from the book. I do not claim ownership for all that is written.