Antifragile: Things That Gain from Disorder


Authors: N.N. Taleb

Publication Year: 2012

Source: https://www.amazon.com/Antifragile-Things-That-Disorder-Incerto/dp/0812979680

Title: Antifragile: Things That Gain from Disorder

Publisher: Random House

Edition: Hardcover

ISBN: 9781400067824

Categories: Organizational Change, Adaptive Cycle


Abstract

In this follow-up to The Black Swan, Nassim Nicholas Taleb introduces ‘antifragile’ as a new term for things that get stronger when mishandled. Most objects are either fragile (glass, certain electronics) or at best robust (stone, an airplane): they eventually break, wear out, or otherwise decay. More things than you might expect, however, actually get stronger when exposed to time, mishandling, trauma, disorder, or other stressors. According to Taleb, this concept plays no part in our consciousness even though it is a large part of our lives: it is “part of our ancestral behavior, our biological apparatus, and a ubiquitous property of every system that has survived” (p. 48). Antifragility is the overreaction to and overcompensation for stressors. Overcompensation, or redundancy in a system, is the result of stressors: a safeguard for the next time those stressors occur (e.g. vaccination, muscle growth). The phrase that speaks best to the imagination is “what doesn’t kill you makes you stronger”.


Critical Reflection

Antifragility in complex systems

Biological systems (the body, flora, and fauna) are typically both fragile and antifragile, whereas inanimate matter (non-living objects) is almost always merely fragile. All complex systems are antifragile, be they economic, social, biological, or ecological. Within the human context we can detect various forms of antifragility: bones become denser when stressed, muscles grow stronger when strained, and even language tends to be learned faster under pressure than in ideal, comfortable conditions (“Russian is learned fastest in a Russian prison”). We get vaccinated because we want the body to know what is out there and to prepare itself for a real attack by the disease. The history of technology also shows antifragility. The majority of technological innovations came from hobbyists, people with know-how but without the scientific backbone to rationalize their choices; it was all a matter of chance, choice, common sense, and intuition. The randomness and volatility of these processes is what produced past, current and, according to Taleb, future innovative successes. It is not hard to imagine the role of antifragility in similar organic systems such as politics, economics, and business. Often, the harder one tries to control such a system, the more prone it becomes to negative black swans, failure, and fragility. This can be seen in the causes of the economic crisis of 2008 and in the rise and fall of political constructions and companies.

Antifragility of information

Information is antifragile: the more one tries to harm it, the more it feeds on the attempt. Criticism of a book draws attention and brings it more readers; banning a film will probably cause more people to see it. Likewise, those who actively try to improve their reputation often see it wither, while those with no particular need for reputation-boosting see theirs grow. This can be observed in the financial industry, where banks try to gain (or force) trust through all kinds of advertising, mostly causing the opposite. Information has its way of growing when one tries to control it.

Antifragility and evolution

Evolution shows the recursive character of antifragility: where the individual gains from stressors, so does the genetic code of the species. Antifragility is therefore stronger at the evolutionary level than at the individual level; for the species to grow, it has to sacrifice the individual. This sounds stark, but it is only an observation. For nature to persist over time it needs to adapt, since it is impossible to design a species that withstands the passage of time. As Taleb puts it: “Organisms need to die for nature to be antifragile - nature is opportunistic, ruthless, and selfish” (p. 85).

Fragilizing the antifragile system

Stressors, volatility, randomness, and variance are what make antifragile systems what they are. The human body, for example, degenerates when not exposed to these factors. Moreover, stressors that arrive on a regular schedule have less impact than stressors that arrive in random order. It is here that Taleb makes the connection with black swan events: in an economic environment where policies minimize stressors and randomness, a black swan hits hardest, whereas a system in which randomness and variance are accepted deals with it far more resiliently.
Medical intervention, for example, can diminish the chances of healing by purely biological means. The human body is built to heal itself over time, whatever the damage (provided it is not too severe). Using medicine to battle a cold is a way of reducing randomness, which turns the black swan (the time when the medicine does not work, for example) into a severe event: attacking the antifragile backfires. This fragilizing of the antifragile system, or fragilizing interventionism, is apparent in every complex system we can name, and Taleb gives an extensive table of examples from education, literature, politics, ecology, economics, technology, and so on. Moreover, in recent years opportunists have turned to ever more sophisticated and complicated models to predict black swans. Taleb stresses that less is more: the discourse has to move to antifragility.

Antifragile systems and the real world

According to Taleb, we can use the concept of antifragility to help us navigate an unpredictable world. One cannot say which kind of event (black swan) is more likely to happen than another, but one can state how (anti)fragile an object, structure, or system is. A system is antifragile to the extent that the harm it experiences from random events is outpaced by the benefit it gains from them (the so-called convex systems, as opposed to concave systems, which get more harm than benefit from such events; see the sketch at the end of this page). The degree of antifragility may be apparent at a glance or may take some analysis to establish. One can say, however, that a grandmother may react more fragilely to temperature changes, and that some banks may be more fragile than others in the event of a crisis. Taleb seems to practice what he preaches: the book jumps from subject to subject and is full of examples, explanatory stories, and semi-funny jokes. One could argue the book itself is a good example of randomness and antifragility. We have left out numerous examples, and it would be good for everyone to read some of them (which is no punishment).
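
To make the convexity point above concrete, the minimal sketch below (not from the book; the payoff functions are purely illustrative assumptions) shows that under symmetric, zero-mean shocks a convex response gains on average while a concave one loses on average.

```python
import random

# Illustrative payoff functions (assumptions for this sketch, not from the book):
# a convex response gains more from a favourable shock than it loses from an
# equally sized unfavourable one; a concave response is the mirror image.
def convex(x):
    return x + 0.5 * x ** 2   # curvature upward: benefits accelerate

def concave(x):
    return x - 0.5 * x ** 2   # curvature downward: harm accelerates

random.seed(0)
# Symmetric, zero-mean shocks standing in for random events hitting the system.
shocks = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

avg_convex = sum(convex(s) for s in shocks) / len(shocks)
avg_concave = sum(concave(s) for s in shocks) / len(shocks)

print(f"average convex payoff:  {avg_convex:+.3f}")   # roughly +1/6
print(f"average concave payoff: {avg_concave:+.3f}")  # roughly -1/6
```

The averages differ even though the shocks themselves cancel out on average, which is the sense in which a convex (antifragile) system gains from disorder while a concave (fragile) one loses from it.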