Complicated is not the same as Complex – and Why this is Important

Complicated is a very different concept from Complex, yet most of us do not distinguish between them. Worse, we try to manage complex systems with complicated solutions. And this turns out to be a serious problem.

A watch: a complicated system – predictable and reliable

A watch is complicated. It is composed of a large number of pieces, yet they are carefully engineered to fit and move together. The system is very reliable (it’s a watch!). Most engineered systems are complicated, yet reliable. The more seamlessly the components fit together, the better the reliability.



A complex system: a representation of the situation in Afghanistan

By contrast, a complex system involves many different components or contributors; they are all interconnected and interdependent, yet each follows its own interest, and together they make the system unpredictable. The now-famous slide describing the situation in Afghanistan to General McChrystal is a classic depiction of a complex system.

Complex systems are unpredictable. They are what happens in real life outside what can be carefully engineered. They are what creates the unforeseen, the adventure.

Because we constantly conflate these two concepts, we misunderstand much of what happens around us. The way to tackle and repair complicated systems is completely different from the way we can influence complex systems. The ways these systems fail belong to different realms. And when a complicated system encounters unpredictable complexity, our engineering capabilities are overwhelmed, our certainties become shaky, and catastrophes like Fukushima happen.


Why You Should Fail More Often

Let’s come back to the financial markets as a model of a complex system that can be easily measured. What strategies can provide gains with limited risk and a significant chance of success?

Investment is more about managing the emotions of failure than it is about being clever

Professionals will tell you that winning strategies (provided winning strategies exist, which we can doubt after our post on the superiority of randomness) always involve some dose of failure. The trick is to have fewer failures than wins, and/or smaller failures than wins. As long as you invest more in stocks that have a chance of recovering, over time this strategy will provide interesting returns with relative certainty – the larger returns being generated by the larger or more frequent number of tries.

What is interesting here is that real professionals who deal with the stock market every day will never speak of a ‘sure win’. They know they have to accept a certain amount of failure. They know there will be bad days as well as good days – hopefully only slightly less frequently. They know they will have to deal with the emotions of failure and not let themselves be driven by them.

In our real life, that complex system, the lesson holds. We can’t succeed without a certain dose of failure. And the more often we try, the higher our overall success will be. We just need to make sure that no single failure will kill us (that we cut our losses soon enough) and that our successes have a bigger upside than our failures’ downside.
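The arithmetic behind this can be sketched with a small Monte Carlo simulation. The payoff numbers below are illustrative assumptions, not real market figures: a strategy that fails 60% of the time still comes out ahead, provided each win is twice as large as each loss.

```python
import random

def simulate_strategy(n_trades, p_win=0.4, win=2.0, loss=1.0, seed=42):
    """Simulate a strategy that loses more often than it wins,
    but whose wins (+2.0) outweigh its losses (-1.0).
    The payoff profile is an illustrative assumption."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trades):
        if rng.random() < p_win:
            total += win   # a win: the upside is larger...
        else:
            total -= loss  # ...than the cut-loss on a losing trade
    return total

# Expected value per trade: 0.4 * 2.0 - 0.6 * 1.0 = +0.2,
# so over many trades the strategy tends to come out ahead
# despite failing 60% of the time.
total_gain = simulate_strategy(10_000)
```

The key is not avoiding failure but shaping the payoff profile: frequent small, capped losses against less frequent but larger wins.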

In complex systems, frequent failure is the key to success. So, when do you start to fail more often?


How Big Data Will Not Help Our Understanding of Complexity

Nowadays it is hardly possible to open a serious newspaper or a financial investment document without reading about the great prospects of “Big Data”: analyzing the troves of data organizations progressively acquire as we surf, shop, and spend money, in order to create value. Companies have been set up, raising money on the markets, to exploit data the way others exploit mines and oil wells.

[Curve: errors vs. data size] The number of spurious correlations increases with the size of the data. Be careful when you read about discoveries from “Big Data”!

Be careful, says Taleb in his book Antifragile: lots of data also means lots of spurious correlations. The argument is detailed in the book and in the Wired article “Beware the Big Errors of ‘Big Data’“, from which we reproduce the curve on the right (it also appears in the book).

To translate the point into conventional language: the bigger the dataset and the number of potential variables considered, the higher the probability that bullshit is produced when it comes to identifying possible trends. From there to considering the whole “Big Data” trend a vast hoax is a step we won’t take (though I don’t invest in “Big Data” unless it is a very focused application). Still, this suggests that a high proportion of the ‘discoveries’ people make by analyzing “Big Data” will be spurious. Be careful next time you read about one of these new ‘discoveries’.
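Taleb’s point can be demonstrated with pure noise. In this minimal sketch (the threshold, sample size, and seed are arbitrary choices), we generate independent random variables and count how many pairs look “significantly” correlated: the count grows with the number of variables, even though every such correlation is spurious by construction.

```python
import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def count_spurious(n_vars, n_obs=30, threshold=0.4, seed=1):
    """Count variable pairs whose |correlation| exceeds `threshold`
    in data that is pure independent noise -- so every 'finding'
    returned here is spurious by construction."""
    rng = random.Random(seed)
    data = [[rng.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]
    hits = 0
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            if abs(pearson(data[i], data[j])) > threshold:
                hits += 1
    return hits

# More variables (a 'bigger' dataset) -> more apparent correlations,
# all of them meaningless.
few = count_spurious(10)
many = count_spurious(60)
```

Note the combinatorics: with n variables there are n(n−1)/2 pairs to test, so the pool of candidate “discoveries” grows quadratically while the data stays pure noise.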

A further challenge: is it really possible to understand complexity through more advanced data analysis? Following Taleb, we can have serious doubts about that – in particular because data analysis tools will never consider anything other than conventional statistical approaches, and will never capture those discontinuities and benefits from volatility which make real life what it is: interesting!


Must Read: “AntiFragile” by Nassim Taleb

Nassim Taleb’s latest book, Antifragile, is an absolute must-read. It turns a lot of common wisdom on its head and paints an interesting picture of the mistakes of the society around us.

The book is thick and takes some effort to read through, but it’s worth taking the time. Taleb is a very unconventional thinker; his approach to complexity and predictability is absolutely brilliant.

What is Antifragile? According to Taleb, it is the property of thriving in situations that are highly uncertain and volatile.

Most of the things we produce are fragile. Industrial Age civilization, through its manufacturing standardization and search for efficiency, tends to be fragile. Indeed, according to Taleb, modern civilization would seem to be much more fragile than before. Look at how single freak events like 9/11 or other terrorist attacks fundamentally changed the life of travelers, and how ever more intolerant we are of unexpected events.

The book denounces fundamental mistakes in our scientific approaches – in particular our habit of observing averages when volatility might be an even more influential parameter, and the limits of conventional financial statistical analysis.

In summary, take the time to read Antifragile. It also gives an insight into how our world might evolve once we overcome some psychological hurdles inherited from the Industrial Age. Several posts inspired by the book will appear here in the next few weeks.


Analytic Decision-Making Is Worse than Random More Often Than You Think!

Scientific thought and literature sometimes contain disturbing concepts. A number of papers have shown, for example, that on financial markets, random stock picking is better than the informed choices made by any “professional” investor.

Decision making: should you not throw dice instead of over-analyzing?

Here’s another fun example of such a study: how Orlando the cat beat professionals in a stock-picking contest. The article goes on to explain that this is not an isolated result, and that the investors we admire for the high returns they generate might just have been lucky (over the short period during which we observe them). As the article mentions, Daniel Kahneman, Nobel laureate in economics, showed over a sample that “the correlation between those fund managers who were successful in one year and those who were successful in the next year was close to zero (0.01) over eight years“.
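Kahneman’s near-zero correlation is exactly what you would expect if yearly performance were pure luck. A minimal sketch (assuming i.i.d. random yearly returns; the mean and volatility figures are illustrative): simulate two years of returns for a population of “managers” and correlate them.

```python
import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def year_on_year_correlation(n_managers=1000, seed=7):
    """If yearly returns are pure luck (modeled here as independent
    draws with mean 5% and std 15% -- illustrative numbers), this
    year's winners tell you nothing about next year's."""
    rng = random.Random(seed)
    year1 = [rng.gauss(0.05, 0.15) for _ in range(n_managers)]
    year2 = [rng.gauss(0.05, 0.15) for _ in range(n_managers)]
    return pearson(year1, year2)

# With 1000 managers the correlation hovers near zero --
# the same pattern Kahneman reports for real fund-manager data.
r = year_on_year_correlation()
```

Under this luck-only model, a star manager is simply someone we happened to observe during a lucky streak.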

Financial markets are the archetype of a complex system. In such a system, it seems that random decision-making is better than (excessive) analysis.

When I presented this idea to a room full of engineers a few weeks ago, they probably took me for a fool. Yet as the complexity of our world increases, it is not certain that the best way to make the right decisions is to refine our analytic models further, as they will be less and less representative of reality.

Keep some dice in your pocket. In complex situations, throwing them might be the best decision-making method!