About Swans

about Taleb’s “Black Swan”, dealing with the reality of our times and leaving the laboratory

Salvatore Iaconesi
13 min read · Aug 25, 2018
a computational black swan

I have finally managed to finish reading Nassim Nicholas Taleb’s book “The Black Swan”.

It had been sitting in my reading list for quite a long time and, for one reason or another, I just couldn’t get to it. Maybe it was one of the “books in Umberto Eco’s library which he didn’t read” (if you read the book, you will get the joke).

But now I have read it, and I’m glad I did.

I do not intend to do a review here. Instead, I will use the book to bring up a series of issues which, for me, are of fundamental importance in understanding how to deal with today’s and tomorrow’s increasingly complex society. You are all invited to participate in this discussion.

First of all, by reading the book, I finally understood why I get sort of upset when people introduce me as an informatics/computing expert.

A couple of nights ago, at a birthday dinner, a close friend of mine — who was introducing me to people I didn’t know — said “he’s an informatics expert”. Immediately, one of those people replied “ah, how wonderful! then you will certainly be able to help me: my smartphone has been acting so strangely, lately”.

You get the point. :)

I am an artist, an entrepreneur, a researcher, a philosopher, a hacker, a robotics engineer and, yes, I also know a few things about information and computing.

Why, of all of these things, did my friend have to introduce me as an “informatics expert”?

Well, reading “The Black Swan” I understood that I shouldn’t be so mad at my friend.

And it also helped me understand that there is a definite need to reframe these terms (informatics, and computing) because, in today’s and tomorrow’s world, informatics and computing may be the only activities that can help us grasp usable understandings of the complex existential condition of human beings.

Informatics and computing may be the only humanistic disciplines left.

To cope with this fact, we absolutely need to reframe and restructure what “informatics” and “computing” mean, to reach beyond a merely technical level (“can you fix my phone?”), and to merge these disciplines with other ones.

Why do I say this after reading the book?

The book basically deals with trying to understand what happens when the world transforms into our present one.

In the past, phenomena were mostly “not scalable”: physical labour was time dependent, regular, simple, straightforward, and it was generally “easier” to understand causes. Something “big” sometimes happened (and, of course, there was nature to provide radical uncertainty) that could transform things immediately and completely (a war, a tsunami…), but such events were not so frequent and, when human-made, had the same “non scalable” characteristics (a war in the past is a very different phenomenon from a war now).

Now, the rising complexity of the world, of the social/economic phenomena it contains, and of the impacts that they can have on the world itself, leads to phenomena that are progressively becoming “scalable”: economy, politics, production, culture become more immaterial, we deal with ideas, communication, wealth. These are not simple extensions of the previous phenomena, they are radically new ones, which follow different logics.

Among these logics are the capabilities of these phenomena to drastically change their exposure to “scalable” outcomes. These phenomena are, in fact, “scalable” in nature: they have no perceivable limit or constraint.

It is hard, if not impossible, to imagine a person who is 4 meters tall.

On the other hand, imagining a person who has a wealth of 50 billion dollars is not so different than imagining a person who is 500, or 5,000, or 500,000 billion dollars rich.

The presence of these phenomena in our contemporary (and future) world brings forth our increased exposure to Black Swans: unpredictable events which completely disrupt the scenario, “the winner takes all”.

We have these events all around us: financial crises; market crashes; the long tail; platforms like Facebook, Google, Amazon.

We are fundamentally exposed and blind to Black Swans: we are fundamentally blind to what is called the Unknown-Unknown, that which I don’t know that I don’t know.

There is no story, no experience, no expertise which you can use to discover the Unknown-Unknown: we just have to use different tools, methods and strategies.

To deal with these scenarios, we have to accept the fact that there is a distinct separation between two different types of uncertainty and randomness.

Taleb indicates them as Mediocristan and Extremistan.

In Mediocristan, no single event can bring a contribution that is capable of disrupting the whole system: think of the earlier example of people’s height, in which not even a 4-meter-tall human being can disrupt the average height of all human beings.

In Extremistan, instead, the larger part of the contribution to a phenomenon comes from isolated, exceptional events: market crashes; the fact that the TOTAL of stock market movements across 5 years (that is: the sum of ALL of them, day by day) can come from around 10 days (the movements of those roughly 10 days add up to more than those of the entire rest of the 5 years); the ways in which ideas spread on the Internet.
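The contrast between the two regimes can be made concrete with a small simulation (illustrative numbers only, not the book’s): draw one sample from a narrow Gaussian, as heights roughly are, and one from a heavy-tailed Pareto distribution, as wealth roughly is, and compare how much of the total the largest observations account for.

```python
import random

random.seed(42)

# Mediocristan: heights drawn from a narrow Gaussian (mean 1.7 m, sd 0.1 m).
# No single observation can dominate the total.
heights = [random.gauss(1.7, 0.1) for _ in range(10_000)]
top_height_share = max(heights) / sum(heights)

# Extremistan: "wealth" drawn from a heavy-tailed Pareto distribution.
# A handful of observations can account for a large share of the total.
wealths = [random.paretovariate(1.1) for _ in range(10_000)]
wealths.sort(reverse=True)
top_10_wealth_share = sum(wealths[:10]) / sum(wealths)

print(f"Share of total height from the single tallest person: {top_height_share:.4%}")
print(f"Share of total 'wealth' held by the top 10 of 10,000: {top_10_wealth_share:.1%}")
```

Even the tallest person contributes a vanishing fraction of the summed heights, while in the Pareto sample a mere 10 individuals out of 10,000 hold a visibly large slice of the total: that asymmetry is the whole point of the Mediocristan/Extremistan distinction.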

The book is a war against statistics and Gaussian curves, which it deems substantially inadequate for describing reality:

“Categorisation always brings a reduction of real complexity. […] Any reduction of the world around us can have explosive consequences, because it eliminates sources of uncertainty and prevents us from understanding how reality is structured”

[Note: I have the Italian translation of the book, edited by Il Saggiatore in 2014]


“history does not crawl, it jumps, going from one dislocation to another, with a few vibrations in between”.

In the book, our accepted ways of dealing with uncertainty are presented as opening up to a number of fallacies:

  • Narrative fallacy, which is our limited capacity to observe sequences of facts without adding an explanation, a relation, or a story, which results in the systematic reduction of the dimension of problems (for example restricting the flow of time to one direction), and in trying to give explanations by “looking backwards” (if you see a puddle of water, what is the shape of the ice block that originated it? There are, of course, infinite possible shapes, and you just can’t know unless you had seen it to begin with, or even whether there was an ice block in the first place).
  • Silent proof fallacies, which take place when we don’t consider failed attempts in our estimates: want to know “how to become a millionaire in 10 steps”? You cannot just take 100 millionaires and ask them how they did it, because in this way you will miss all the ones that failed. We don’t look at the cemetery.
  • Ludic fallacy, the mistaken assumption that what happens in games and casinos can describe and model real-life uncertainty: in games there are fixed rules and bounded spaces, and you can only move within them. Game theory is inadequate to deal with Black Swans and, thus, with unforeseeable, world-changing events.

“Casinos are the only human enterprise in which probabilities are known, Gaussian and calculable.”

In real life, the sources of uncertainty must be discovered and are unknown.

  • Epistemic arrogance, according to which we overestimate what we know and underestimate uncertainty, compressing the range of possible states.

The book contains a broad critique of the education system, which is described as providing knowledge, methods, tools and sensibilities that are unsuitable for dealing with contemporary reality.

Taleb criticises the role of experts. Dynamic, live, “moving” things (and, thus, things which bring “the future” into the discussion) generally can have no experts: the purpose of experts is to adapt reality to theories, and they are only very seldom capable of doing the opposite. The fact that governments, companies, investors, organizations and society at large rely so much on experts is deemed potentially catastrophic: it exists mostly to be able to delegate responsibility (“the expert said it”), and it prevents more interesting logics, methods and considerations from being applied.

Several contributions are brought in:

  • Popper: to predict historical events it’s necessary to foresee technological innovation, which is not possible (otherwise you would already know how to do it).
  • Poincaré: when making predictions about the future, ever-growing precision in the knowledge of the process dynamics being modeled is needed, because error rates grow rapidly.
  • Von Hayek: the problem of scientism in prediction and forecasting, as the world may well refuse to follow scientific formulas.

Governments and enterprises are completely based on these kinds of fallacies and, thus, are exposed to disaster.

Again: the book is a wonderful, important, insightful source of inspiration to understand contemporary times, and to start acting accordingly.

Data is in the air

In all of this, the role of data is strongly highlighted in the book.

Possibly the only way to be able to deal with Extremistan is to have enormous quantities and qualities of data available, and to be able to observe them for long timeframes, from multiple points of view: if Thanksgiving is a Black Swan for the turkey, it is not a Black Swan at all for the butcher.
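The turkey example can be sketched as a toy time series (hypothetical numbers): every day of feeding increases the turkey’s empirical confidence that tomorrow will be like today, and that confidence peaks on the very day it is refuted.

```python
# Toy model of Taleb's turkey: 1,000 days of feeding, then Thanksgiving.
# The turkey's "evidence" that tomorrow will be fine grows every day,
# and is at its maximum on the very day the Black Swan arrives.
feedings = [1] * 1000 + [0]  # 1 = fed, 0 = Thanksgiving

# After 1,000 days, every single past observation supports "I will be fed".
confidence_before_shock = sum(feedings[:1000]) / 1000
outcome_on_day_1001 = feedings[1000]

print(f"Turkey's empirical confidence after 1,000 days: {confidence_before_shock:.0%}")
print(f"Day 1,001: {'fed' if outcome_on_day_1001 else 'Thanksgiving'}")
```

The butcher, of course, has access to a different dataset, observed over a longer timeframe and from another point of view: for him, day 1,001 is no surprise at all.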

And this is where I want to jump in.

When I say data, I say *Big* data. Enormous quantities of data, coming out of every human and non-human manifestation on the planet, all the time.

As we have described before, in this sense data and computation, today, really describe the human condition: when everything that we do and use, and all of our expressions, relations, interactions generate data in one way or another, and are subject to some form of computation, data+computation are not technical/technological issues anymore, but cultural, psychological, political ones.

Going forward, we could try to imagine what kind of data, and how to use it.

If we did that, we would arrive at an important node of the discussion: synthesis and statistics.

In a world of dashboards, analytics and statistics of every kind, we might start considering whether we need to take a step back, reflect and act in different ways.

The opportunity of big data, in this sense, is that it’s, well, big, and complex, and diverse. Using it to create synthesis and reductions may be good for some issues, but makes absolutely no sense for dealing with the sort of things we are talking about.

Synthesis and reduction, under the form of indices or classifications, are unsuitable to deal with the Unknown-Unknown.

We need myriads of micro-histories. We need to understand how to collect them, visualise them, observe them, think about them in all their enormous complexity and variety.

We need to understand how to observe life.

This brings forth a number of very serious problems. I wish the book had addressed them in some form, but it didn’t.

The Issues

For me these are among the most pressing issues of these times.

The availability and accessibility of data

Data, today, is an extractive industry.

All aspects of our lives generate data.

Yet this data is not really available to us.

As a matter of fact, in this scenario, no single person is able to know what and how much data they generate, who uses it, or for what purposes. And it is not possible to have a say in the uses we would wish all this data to be put to.

And then there is data from the environment, markets, commerce, education, health, administration etc. Operators keep this data closed and inaccessible, even though it is us who generated it.

Even the recently launched European regulation on data (the GDPR), possibly the most advanced in the world, is problematic in this sense: it merely recognises and accepts the fact that the data industry is an extractive industry, and tells you how your oil well should be built. Worse, it does so by enforcing procedural and administrative requirements which large organizations will have no problem implementing (it’s what they do), but which small and informal operators (who cannot afford complex processes and administrations) will not be able to withstand. We’re in a paradoxical situation in which an operator like Facebook will have no real problem becoming GDPR compliant (just add a bunch of interface elements here and there that it was going to add anyway after the scandals), while an artist wishing to use data to create an artwork, a group of citizens wishing to use data to organize themselves, a designer wishing to create a new service, and other smaller or informal groups will have a really hard time doing so.

Even if there are rising efforts and initiatives towards open data, the larger part of data (including open data itself) remains outside the accessibility and usability (and desirability and sensibility) of the larger part of society.

This happens for various reasons, including multiple forms of digital and education divides.

But it also happens for aesthetic, psychological and political reasons: there is no sensibility to data, nor to the necessity of using it to achieve a more antifragile society. There isn’t yet a popular aesthetics of data in this sense, a desirability of data and of the implications it brings about: we are a very fragile society in this domain, as we are easily made the objects of datafication, through which our liberties, expressions, interactions and relations are manipulated, but we cannot yet really manage to become the active subjects of data.

Things are changing in this sense, but mostly in ways that are technical and administrative. There are very few and sparse actions that focus on the aesthetics, psychology, anthropology, language and interaction of data in society, to bring new sensibilities that go beyond technical skills.

Which brings us to what I think is the major issue involved in this domain.

Where are the people?

While advocating the need to “abandon theory” (I am simplifying for the sake of brevity here; read the book to understand what this means), and the need to be able to deal with complex reality, the book itself comes from the closed space of the laboratory.

Let me explain.

The book describes in multiple ways the fact that a different sensibility is needed to be able to deal with a world of rising complexity, in which Big Black Swans exist.

If we want to reach a condition in which we are not so fragile, we need to achieve different research cultures and tools, different business practices, and different ways of government and administration; to rely less on experts and Gaussians, and more on different sciences such as network science, fractal uncertainty studies and complexity-based practices, all of them big-data driven, and all in a setting in which we are able to be empirical skeptics.

The book addresses everyone. Of course it speaks to philosophers, and mathematicians, and managers, and researchers and policy makers. But it also speaks to the rest of the people, providing hints on how these themes manifest themselves in their daily lives, whether in achieving their ambitions and desires, or in reaching a degree of economic solidity, or in choosing a job and a purpose in their lives, or in understanding, more generally, how to participate in constructing a world which is less fragile and more exposed to positive change.

Here, I have felt a limit of the book.

Because it continues to describe “science” as something that happens in the closedness and separation of laboratories.

And, of course, it is not.

While the book does a great job of providing conceptual and practical tools for confronting complex scenarios, it does nothing to show how this approach could leave the laboratory and enter society and the open, public space.


We are in perilous times. Times of populism and anti-science. In these times, we run the risk of forgetting that science is part of society. This does not mean the populist notion of the anti-expert, of the anti-scientist, or that “anyone can be the expert”. It means that science should not be an isolated phenomenon, and that it should be a shared process involving all of society.

When we say that science is not democratic (as has happened in recent times, in reaction to populist impacts on science, such as anti-vaccine rallies or conspiracy theories), we run a terrible risk: the risk of thinking that saying science is democratic means burning books, or believing that the layperson can assume the role of the medical researcher, or propose a cure for cancer.

Instead we must take the opportunity to reclaim a different notion, the fact that science IS democratic, meaning that it is positioned in society, among its members, and that even the scientific method is specifically democratic: science is valid until proved wrong through the scientific method, applied by anyone and replicable by anyone.

Add to this the fact that science is, and should be, a conversation happening in society, among all of its members, each in their own role and attitude; not only a technical issue, but also a political, psychological and social one, involving the capacity of a society to come together to form visions, aesthetics, values and objectives.

How can we move from predictions and forecasts performed by experts in closed laboratories separated from life, to participatory performances of data-supported, beautiful, empiric, collaborative skepticism which is able to engage all of society?

This would be an enormous question which would deserve reflection and participation: this is what I would really like to explore with Nassim Nicholas Taleb, and with all of you.

and, in the end…

In our little, tiny existence, we have always promoted this vision: through the concepts of the Third Infoscape and of Digital Acupuncture, which open up access to the myriads of micro-histories which we collectively generate as a society, in order to design interventions; through the concept of Near Future Design, in which data is collected in radically diverse ways to continuously understand and construct simulacra of possible changes, in transmedia, diegetic ways, and to engage society at large in this process; and through our research center, Human Ecosystems Relazioni, which sees data and computation as cultural phenomena, has an entire department dedicated to Data+AI+Arts, and is promoting a new concept for a neighbourhood school which uses AI and data to investigate the existential condition of human beings.

The only certain thing here is that we desperately need new models which are radically different from the ones we have now.

To be able to construct them we must run away from Human Centered Design, going in the direction of an Ecosystemic Design in which the human being is not at the center of the universe, but, rather, in a network where organizations, institutions, companies, trees, markets, coffee machines, turkeys and everything in the environment become subjects through their capacity to express through data and to have agency through computation.

This is not a technical issue. It is a major cultural, psychological, political one.