COVID-19 Afterworld: An Approximation of Ground Truth

A relook at scientific methodologies

Dr. Wan M Hasni, Chief Data Scientist (Techna Analytics Sdn. Bhd., Kuala Lumpur, Malaysia), https://ibugroup.org
2020-06-01

COVID-19 afterworld

We coin the term “COVID-19 afterworld” for lack of a reasonable word to describe the state of human society under the effects of the COVID-19 pandemic. We do not want to use the word “aftermath”, since we do not know when the aftermath of COVID-19 starts, nor when it will end, as the events are still unfolding. Neither could we use the word “post”, since COVID-19 might remain with us for some time before it goes away, if it ever does. The best we could do is use the word “afterworld”, because one thing that is becoming clearer is that the world we live in has changed permanently due to the COVID-19 pandemic. The word does, however, carry negative connotations, because the afterworld generally means the afterlife, or the world after death. Yet the parable still holds: the world after a major catastrophic event such as the COVID-19 pandemic has gone through a tectonic shift, which with high certainty alters everything we were used to doing before the event. To be a bit more precise, we define the “COVID-19 Afterworld” as the “state of the world we live in after COVID-19 became a global pandemic”. Afterworld, therefore, is about how we deal with this new state of the world that we live in from here on.

A relook at postmodernity in COVID-19 afterworld

We live in a postmodern world, defined by the accepted notion that human lives are directed and governed by scientific knowledge, which provides the foundation of our lives as human beings. As an example, we openly embrace the market economy, based on capitalistic idealism and driven by economic theories which are postmodern. Similarly, in many fields of social science, the tenet that we rely upon in the postmodern era is human rationality - that humans, at least in the aggregate, behave rationally. Based on this notion we build and promote institutions such as the democratic process, the rule of law, and governance structures for the betterment of human society. In healthcare, we accept the norm that medicine is practiced through drugs, vaccinations, and healthcare systems driven by scientific discoveries in biology, chemistry, and physics. In statistics, we accept the notion that most things are measurable in the statistical sense, which allows us to develop and use various statistical methods to deal with the data and numbers that we gather, and from which we perform various inferences and tests. The list can go on into almost any field of knowledge and scientific method - upon which we derive theories, test them, and provide solutions based on them.1

Is there anything wrong with this notion of postmodernity, where science drives everything that we do and is the fundamental belief that we rely upon?

For one, we have to admit that the postmodern world is mired in crises, one after another: from economic and financial crises to various tensions in society, such as the growing income gap between the very rich and the downright poor, questions of ethics and social justice, the sustainability of the environment, and various gaps that emerge as destabilizing factors in society and humanity. All of these, by themselves or in combination, do not necessarily call us to abandon our postmodern belief in science, as science, while the source of our understanding, is also the source of solutions to these newly emerging problems. Adjustments are needed, but total abandonment is not an option in the absence of any viable alternatives.2

Science, despite all of its possible shortcomings, still remains the most viable and justifiable option to solve human problems. If we take the COVID-19 pandemic as a case: while at times it may look hopeless for human societies to put a quick stop to the pandemic, in the face of possible total ruin the best option is to rely on science to guide us with appropriate actions (such as social distancing) and to search for a vaccine. Furthermore, with the advancement of digital technologies, we can perform analysis and apply the science of pandemics to provide guidance for the way forward.

However, despite all of our efforts to turn to science every time disaster strikes - whether a financial crisis (2008, as an example), a natural disaster (the Indian Ocean tsunami of 2004), or epidemics and healthcare crises (SARS in 2003, H1N1, MERS) - somehow we always ended up short. For example, we managed to avert a global financial meltdown in 2008/2009, but we ended up with a large debt overhang that will last for decades. We managed to stop the SARS epidemic, but when the COVID-19 pandemic happened, we were unprepared for it, and it took a long time before effective responses were undertaken (we went through a stage of denial). This begs us to ask ourselves: are postmodern human society and science ready for large-scale natural disasters, say an earthquake on the San Andreas Fault in California? Or remote events like a nuclear meltdown? For that matter, are we ready for major cascading failures when they happen? In other words, are we ready for Black Swan events on a global scale?

Science should allow us to explain, predict, and forecast the risks that are forthcoming. How science failed to foresee what was coming is rather disturbing. For example, most risk managers failed the test of risk forecasting in the 2008 financial crisis - and we are talking about financial institutions of global scale, which employ massive systems to “manage risks” in real time, and we do have governing and oversight institutions, such as the Federal Reserve and central banks, equipped with modern tools of risk management. Asset bubbles are everywhere, and yet they grow out of proportion without effective market mechanisms to keep them in check, until the bubbles become unsustainable and burst with devastating effects. We allow too many corporations to grow and monopolize the market until they become “too big to fail”. We also have democracies which, despite all the checks and balances, increase the divisiveness of society instead of creating greater cohesion. And finally, why did science fail to predict the coming of the COVID-19 pandemic, so much so that we failed to stop it while there was still a chance to do so, and it has now grown to a scale that is beyond our comprehension? Did all of these happen because science has failed us, or because we became reckless and abandoned science? Or is it that science itself has fundamental flaws, which crept in over time and require a deep relook into our settled beliefs in science?

Here comes the fundamental question which we need to answer: what went wrong? Science, as our foundation of postmodernity, must also be the basis of reasoning to find the answers if it is to sustain itself. For some, the answer would be to look to religion as the source of guidance. Unfortunately, religion, as it stands today, falls too far short of giving convincing answers that could be accepted on rational principles. Religion has become either too arcane or too dialectic to be the main source of guidance, which renders it ineffectual. But the same criticism also applies to science, as it too falls short on many fronts in explaining the phenomena that we see.

However, one thing that science has which religion does not is established methods to trace back and relook at various scientific findings and to ask hard questions to which we seek honest and sincere answers. Religion, for various reasons, eliminates this possibility due to the rigid and dogmatic stances that have taken hold over the centuries past. The rejection of religion, to a large degree, is what caused the scientific revolution to take place, replace religion, and define postmodern life as it is today. Science still remains the only real option available to humanity at large.

What is required of science is to perform lots of introspection and a serious relook at certain fundamentals which might have been overlooked or wrongly accepted. This has happened many times in the recent past and will continue to be the way forward. As an example, the Newtonian view of the cosmos was accepted as “law” for at least a few hundred years, before Einstein’s theory of relativity dispelled most of these “laws”. Does this mean that Newton was wrong and Einstein is right? Science does not judge it that way; rather, it judges that Einstein advanced our knowledge much further than Newton, and any subsequent discoveries further advance what Einstein did. Scientific progress has been one of the cornerstones of why science still rules the day. And postmodernity is about the continuous progress and advancement contributed by science.3

Scientific progress occurs through the continuous testing of hypotheses, by which test upon test either affirms our belief in some of the discoveries, which we gradually accept as the “ground truth”, or replaces them with new sets of beliefs. Despite the endless debates about the Humean or Kantian basis of scientific discoveries and philosophies, what is common is the need for continuous testing of our hypotheses. As Karl Popper concludes, it is the only option that we have for scientific progress, despite being a suboptimal solution.4 Crises are stress tests on science, and they provide wake-up calls for humanity to relook at our standing hypotheses again and again. We might get the answer right at times and wrong at times; despite its potential failures, science still provides guidance to us.

COVID-19 is a severe stress test on science on many fronts. The first is human fragility in the face of a virus outbreak - whether in the healthcare systems, virus management (i.e. drugs and vaccines), our knowledge of diseases and their outbreaks, or the readiness of governments and authorities to deal with outbreaks and pandemics. The next front is human behaviors - whether our behaviors are sustainable and can continue as they are, or whether they require massive fundamental changes (touted as the “new norms”). Another major front is the economic impact - whether structural changes are required to reduce the impact on the economic performance of society. Similarly, we have to ask whether all of these will significantly alter existing structures - urbanization and cities as a way of life, the environment, social and political institutions - and all that we might have taken for granted for many past decades. Whether the events will lead to a redefinition of the supply and value chains through which goods and services are created and delivered, or significantly alter the trade balances of countries, is yet to be seen and understood. In short, COVID-19 as a test requires us to relook at almost everything that we are used to, and to ask whether some of it will survive in the future.

We do not attempt to answer all of these questions, because this is a very big subject, and it may take years or even decades before all the dust settles; only then, with the benefit of hindsight, will some of the issues become either self-evident, or irrelevant, or confirmed by the scientific rigors of testing and confirmation. Since that is our stand, for the time being we propose that, as a start, we relook into some basics and fundamentals that, in our view, were present even before COVID-19 but had not been properly emphasized in the past.5 This is what we intend to discuss next.

Debates on scientific discoveries

First we outline Karl Popper’s arguments on the methods of scientific discovery, which are as follows: a) scientific theories should allow logical comparisons among themselves; b) there are investigations to determine the character of the theories; c) these theories should survive rigorous tests; and d) theories must be tested by way of empirical applications. The demarcation for any scientific theory is therefore whether it can be falsified through rigorous testing, since verifications can never be conclusive.6 In social science, as in other fields of science, emphasis must be given to dealing with the “problems of simplicity”: the task of science is to provide the most universal “laws” which give the most accurate description of the “ground truth”. However, the danger of “over-simplification” must be taken into consideration, since it might lead to tautologies and absurdities.7

Kuhn’s disagreement with Popper lies in the view that science is based on “accepted traditions” and that Popper’s position on “falsification” is rather too strong. Instead, he views science as “paradigms” rather than hard proofs of falsification - which renders any real discoveries “impossible” (Kuhn 1962). The disagreement between Popper and Kuhn, despite the two being far apart, is reconciled by Lakatos (1978) and explained by later scholars such as Fuller (2003), who in essence vindicates neither Popper nor Kuhn but rather emphasizes the possibilities of “how science can go wrong”. What could go wrong is not science itself, but rather our unquestioning attitude towards science. To move forward, as Lakatos puts it: “The clash between Popper and Kuhn is not about a mere technical point in epistemology. It concerns our central intellectual values, and has implications not only for theoretical physics but also for the underdeveloped social sciences and even moral and political philosophy”8. What matters is our attitude towards science, and in particular towards the new sciences such as the social sciences (economics, politics, ethics).

Our contention is not with the fundamental issues of the debate (Kuhn vs Popper); rather, we focus on one major aspect of science, namely the modeling approaches taken by scientists, in particular in the areas of social science and the humanities. Our concern is with the usage of probability and statistics, as well as their later variations in the forms of big data, machine learning, and artificial-intelligence-based solutions. More specifically, our subject goes deeper into our understanding and accepted notions of risk and risk modeling within the social science sphere. The arguments go back all the way to John Maynard Keynes9 and Frank Knight10. A much later argument is presented by Bruno de Finetti11, who argues that “probability” as normally understood “does not exist”. Similar lines of argument are expounded in detail in the series of writings by Taleb12.

Since our focus here is not on the debates and philosophical arguments, we summarize our views as follows: given the many shortcomings science has shown in providing strong explanatory answers to the crises and events of the recent past, there is an urgent need to relook at how science approaches these subjects, in particular in the areas of the human and social sciences, being newly developed fields of science.

We present our views, summarized as pointers, in the next section.

Scientific applications for social science

Applications of social science rely heavily on the assumptions that we make in developing various models - such as equilibrium models in economics, rationality assumptions in social choice theories, as well as many simplified and elegant models of human activities and interactions.

Here we put forth, in pointer form, various conditions under which many of these assumptions can go wrong, and hence the need to deal with such problems.

Pointers 1: Natural phenomena in social sciences

Occurrences in the social sciences are natural phenomena, and investigations of the social sciences reveal that they exhibit:

  1. Non-linearity and convexity: Many phenomena in social science are non-linear. In fact, most exhibit exponential relations with fat tails and extreme events, and do not fit under standard distributional assumptions of “Gaussian” probability functions (a small numerical sketch follows at the end of this section). Furthermore, many phenomena exhibit a high degree of convexity - which causes over- or underestimation of extreme observations.

  2. Non-symmetric: Asymmetries are everywhere; rarely do we see observations from an ergodic and stationary process. Despite these shortcomings, most testing exercises assume processes that are ergodic, stationary, and with well-behaved extremities. In other words, most testing allows a large tolerance for these shortcomings in order to gain simplicity and tractable solutions.

  3. Scales: Many things do not scale easily in the first degree; if they scale at all, it happens at higher degrees beyond the first. Furthermore, those which do scale (at higher degrees) do so either sublinearly or superlinearly. Linear scaling is rarely observed.

  4. Dynamics: The dynamics of scaling happen at varying rates, depending on the fundamental layers involved; the intertemporal nature of the dynamics is extremely hard to capture due to the varying rates and the speed of interactions between the various elements.

  5. Slow and fast: The fundamental layers involving human behaviors do not change fast - in fact, they change extremely slowly - whilst the dynamics of human relations change much faster. The structures and roles of institutions do not change fast, but the actions they take may change faster. These varying degrees of change are important for any analysis to be coherent when the inter-temporal situation is taken into consideration.

  6. Rates of convergence: There are many rates of convergence which are extremely superlinear and cause agglomerations and cascading effects (of success or failure). Furthermore, with the possible existence of multiple optima (which could be sub-optimal), convergence towards these sub-optima is highly probable. This happens because of the many self-feedback loops, tipping points, herd mentalities, and other irrational behaviors that can exist and persist within a reasonable period for this type of convergence to happen. On the other hand, convergence at sublinear rates suffers from a deficiency in the sense of the law of large numbers - it needs very long periods of observation before any convergence can be observed.

All of the above points out that due care must be taken when we deal with social-world phenomena, namely that we have to beware of premature conclusions and less-than-optimal solutions. And when we model the social world, we have to take extreme care with regard to wrong assumptions in the model (i.e. model errors) and applications of the wrong model (i.e. errors in modeling).
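As a small illustration of the first pointer, the sketch below (in Python with numpy; the distributions and parameters are our own assumptions, not fitted to any data) compares how often a Gaussian process and a fat-tailed (Pareto) process produce extreme observations:

```python
# A minimal sketch: Gaussian vs fat-tailed (Pareto) extremes.
# Distributions and parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

gaussian = rng.normal(loc=0.0, scale=1.0, size=n)
pareto = rng.pareto(1.5, size=n) + 1.0  # tail index 1.5: a fat tail

for k in (3, 5, 10):
    p_gauss = (np.abs(gaussian) > k).mean()   # empirical tail frequency
    p_pareto = (pareto > k).mean()
    print(f"k={k:>2}: Gaussian {p_gauss:.1e}  vs  Pareto {p_pareto:.1e}")
```

At k = 10 the Gaussian process produces essentially no exceedances in a million draws, while the fat-tailed process still exceeds that level roughly 3% of the time - precisely the underestimation of extremes that Gaussian assumptions induce.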

Pointers 2: Social worlds are networks

To deal with the problems of simplicity, theories need to be explained as models, which are simplifications of reality and deduced as testable hypotheses. Simplified models rely on the rigor of mathematical analysis and are explained as empirical applications. Since social worlds can be explained within network settings, we choose our approach to be network-driven methods of analysis.13 Here we provide pointers to the reasons why we choose network theory over others.

  1. Systems and networks: Most human relations exist as systems or can be modeled as systems. Government is a system, laws are systems, democracy and its institutions are systems; education, healthcare, and businesses are systems. Networks represent the “maps” of systems, and they are at the heart of the definition of systems. By applying network theory, any system can be structured and studied using network models, and from these, meanings and understanding can be deduced (a minimal network sketch follows at the end of this list).14

  2. Complicated world versus complex world: A complicated world is one with many interrelated components, where you cannot remove any component without altering the whole system; a complex world is one whose behaviors depend on its lower-level components, whereby removing them alters the behaviors of the whole system. The complicated world is reducible, but complex worlds are not. Many network phenomena, viewed from a systems perspective, can be understood as layers upon layers (networks upon networks) and layers within layers (networks within networks); hence they inherently exhibit complex behaviors while at the same time being complicated, and are therefore not easily reducible without looking at the whole.15

  3. Subject to network structures: Since social-world networks are natural phenomena, they are subject to the same observations mentioned above, namely that they are mostly non-linear, convex, and asymmetric, and scale in some peculiar manner. Their dynamics are also subject to the same constraints: an inter-temporal nature, elements that are slow- and fast-changing, and rates of convergence that are sublinear or superlinear.16

  4. Humans as agents: Since the subject matter of interest in social science is humans, the fundamental layer is human systems; therefore the focus should be on human networks and human network dynamics. Humans are modeled as the central agents in human networks. Humans are assumed to be rational and adaptive; hence the human network can be modeled as an agent-based complex adaptive system.17
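To make these pointers concrete, here is a minimal sketch (assuming the networkx library is available; the preferential-attachment model is a stand-in for a real human network, in the spirit of Barabasi (2016)) showing that even a simple generative network model produces the heavy-tailed, hub-dominated structure described in Pointers 1:

```python
# A minimal sketch: a preferential-attachment ("scale-free") network
# as a stand-in for a social system. Illustrative assumptions only.
import networkx as nx
import numpy as np

G = nx.barabasi_albert_graph(n=10_000, m=3, seed=7)
degrees = np.array([d for _, d in G.degree()])

print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print(f"mean degree: {degrees.mean():.1f}, max degree: {degrees.max()}")
# Hubs far beyond the mean are routine here, yet would be vanishingly
# rare if degrees followed a Gaussian - the fat tail again.
print("nodes above 10x the mean degree:",
      int((degrees > 10 * degrees.mean()).sum()))
```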

Pointers 3: Digital age and data science

Mathematically speaking, any network model assumes massive computational possibilities, and in the case of human networks the computing requirements are even higher. Fortunately, with the advancement of computing technologies, these are all within our capabilities. Here we explain why:

  1. Digital age: The advancement of the digital era allows us to collect human “digital footprints”, which was not possible decades ago but is a clear possibility today (and in the future).

  2. Technologies: The development of technologies, in particular computing capabilities, allows us to build tools to process data and perform various tests and discoveries, leading us to a better understanding of certain behaviors and aspects of humans on a global scale, in a much timelier manner.

  3. Data science: Data, which is now handily available, provides us with the basis for forming evidence, upon which many scientific tests and methods can be further developed and enhanced. For this reason, a new field of science has developed, called data science.

  4. Inter-domain: Given the complex and complicated nature of the subjects at hand (as explained in Pointers 1, 2, and 3), data science should not and cannot function as a standalone domain - it must be exercised jointly with other domains as much as possible, to give it validity as science (otherwise it becomes a pseudoscience).

Pointers 4: Summary

  1. Simplicity rule: We claim that despite all the complexities, complications, dynamics, non-linearities, convexities, higher-degree scaling rules, etc., through data science we can gain a reasonable level of insight - insight which is often simple and can be explained with simple heuristics.

  2. Rigorousness: Being simple allows us to test the hypothesis in question rigorously and openly (i.e. not a black-box or untestable approach), and the research must be reproducible, to allow others to repeat similar tests and find counterfactuals if they exist.

  3. Transparency and informativeness: Furthermore, any scientific finding must involve transparency in testing, by reporting both the presence of evidence and the absence of evidence (Type I and Type II errors in testing); and we can also take the converse view by looking at causalities instead of correlations, namely at evidence of presence and evidence of absence (what we may call Type IB and Type IIB errors).

  4. Robustness and precautionary principle: The major caveat we have to take cognizance of is to always remain on guard about our assumptions, and not to conclude our findings prematurely. Most importantly, we should be as unbiased as possible in our approach. Even when bias does creep in, we must be transparent about what it is and its potential to lead to wrong conclusions. The way to reduce this is by performing simulations and working with bounds rather than point estimates (a small bootstrap sketch follows this list). Whatever is summarized must be stated as probability statements rather than statements of certainty, which allow biases to be amplified instead of bounded.
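The sketch below illustrates the fourth pointer: instead of reporting a single point estimate, we simulate - here with a simple bootstrap over synthetic skewed data, an assumption since no real data is attached to this example - and report bounds as a probability statement.

```python
# A minimal sketch: bounds rather than point estimates, via bootstrap.
# The data are simulated (skewed, per Pointers 1), not real observations.
import numpy as np

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=1.0, size=500)

point_estimate = data.mean()

# Resample with replacement and collect the distribution of the mean.
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(5_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])

print(f"point estimate: {point_estimate:.3f}")
print(f"95% bootstrap bounds: [{lo:.3f}, {hi:.3f}]")  # a probability statement
```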

Even though, epistemically, not all of the issues mentioned above are settled with certainty, the approach taken provides a way forward, instead of being stuck in a dialectic impasse. It is better to have models that could be wrong than no models at all; at least we have some reference benchmarks if they turn out to be wrong.

Way forward

The COVID-19 pandemic, as we have emphasized, requires much new learning at all levels of society. The knowledge we have acquired from historical events is, as we now know, extremely limited, because an event of a similar nature has never occurred for us to leverage on. Modeling COVID-19 using traditional epidemic models (SIR and its varieties) has proven inadequate: so far, such models can predict the “growth of the epidemic” but fail to capture the dynamics once it has grown to the size and duration we now see. In other words, the models do not “scale”. Versions of dynamic SIR models, however, do manage to capture the possibility of a prolonged outbreak (i.e. new waves of the epidemic), but again fail to provide clear direction about ending it.
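For reference, a minimal sketch of the standard SIR model mentioned above is given below (in Python with scipy; the parameters are illustrative assumptions, not COVID-19 estimates). It shows what such models do well - tracing the growth and peak of a single wave - while saying nothing about the prolonged, multi-wave dynamics discussed here.

```python
# A minimal sketch of the textbook SIR model. Parameters are
# illustrative assumptions, not estimates for COVID-19.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    dS = -beta * S * I             # susceptibles become infected
    dI = beta * S * I - gamma * I  # infections grow, recoveries drain them
    dR = gamma * I                 # recovered (or removed)
    return [dS, dI, dR]

beta, gamma = 0.3, 0.1             # assumed rates; R0 = beta/gamma = 3
y0 = [0.999, 0.001, 0.0]           # population fractions at day 0
t = np.linspace(0, 180, 181)       # simulate 180 days

S, I, R = odeint(sir, y0, t, args=(beta, gamma)).T
print(f"peak infected fraction: {I.max():.3f} on day {t[I.argmax()]:.0f}")
print(f"final epidemic size: {R[-1]:.3f}")
```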

Socioeconomically speaking, we are beginning to see evidence from economists and social scientists delving deeper into the consequences of COVID-19 for people and the economic aftermath of the lockdowns, and providing ideas on how to perform the balancing act between the safety of people (from the pandemic) and economic needs (reopening economic activities). This is where the current debates lie. Assumptions about the future, and people’s expectations of it, are the central subject of these debates, ranging from government budgets after the lockdowns, to how businesses should adjust to the new norms, to how the public should or could react under the various possible scenarios that might take place.

As we have argued in our earlier writings, a disease pandemic is a domain whose expertise lies with health professionals, and it is best for now that the subject is left to them. In the end, whether they have performed their jobs well or poorly will be judged by the people, and that judgment will come much later, after all the dust has settled - which may take years - when all the data and information finally become available to the public. One thing we are certain of: in this day and age of the digital world, nothing can be hidden forever. For our part, we can only say that the lack of data sharing hampers others who could contribute to the learning while the pandemic is taking place; from here on, history will judge how the decisions not to be fully transparent, and to be minimally forthcoming in seeking larger and wider participation from knowledge-driven data scientists, will eventually fare.

In the same regard, we share the views of Bill Gates (and other technology leaders), Mohamed El-Erian, Paul Krugman, and Nouriel Roubini (economists), and Nassim Taleb (probabilist) that in the digital age we must use data and analytics to the maximum, in what we call the new “learning environment”. What is different now in pandemic and economic management is that we live in the digital age, with an abundance of data; hence, despite our inability to use past data as our guide, we have a massive amount of current and real-time data, which will be our best tool (and probably the only one) for dealing with current and future situations - what has been touted as “now-casting” in place of “forecasting”. Now-casting means taking whatever data are available and performing scenario simulations (lots of them) - and with computing power and data availability on a global scale, this is a real possibility.
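As a toy illustration of now-casting as scenario simulation, the sketch below projects a case count forward under many sampled growth scenarios and reports bounds rather than a single forecast. All inputs here (the starting count, the growth-rate range, the horizon) are hypothetical assumptions.

```python
# A toy sketch of now-casting: simulate many scenarios from current
# data and report bounds. All inputs are hypothetical assumptions.
import numpy as np

rng = np.random.default_rng(1)
current_cases = 1_000              # hypothetical current observed count
n_scenarios, horizon = 10_000, 14  # scenarios and days ahead

# Each scenario draws a daily growth factor from a deliberately wide range.
daily_growth = rng.uniform(0.95, 1.10, size=n_scenarios)
projected = current_cases * daily_growth ** horizon

lo, med, hi = np.percentile(projected, [5, 50, 95])
print(f"cases in {horizon} days: median {med:.0f}, "
      f"90% bounds [{lo:.0f}, {hi:.0f}]")
```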

Equipped with the arguments put forth here, we are now ready to focus on the COVID-19 afterworld as our main subject of discussion. This will be covered in the next few articles we publish.

Barabasi, Albert Laszlo. 2016. Network Science. Cambridge, United Kingdom: Cambridge University Press.

Finetti, Bruno de. 2017. Theory of Probability: A Critical Introductory Treatment. New York, USA: Wiley.

Fuller, Steve. 2003. Kuhn vs Popper: The Struggle for the Soul of Science. London, United Kingdom: Icon Books.

Keynes, John Maynard. 1921. A Treatise on Probability. London, United Kingdom: Macmillan & Co.

Knight, Frank H. 1921. Risk, Uncertainty and Profit. New York, USA: Houghton Mifflin & Co.

Kuhn, Thomas. 1962. The Structure of Scientific Revolutions. Chicago, Illinois, USA: University of Chicago Press.

Lakatos, Imre. 1978. Falsification and the Methodology of Scientific Research Programmes. Cambridge, United Kingdom: Cambridge University Press.

Miller, John H. 2015. A Crude Look at the Whole: The Science of Complex Systems in Business, Life, and Society. New York, New York, USA: Basic Books.

Miller, John H., and Scott E. Page. 2007. Complex Adaptive Systems: An Introduction to Computational Models of Social Life. Princeton, New Jersey, USA: Princeton University Press.

Popper, Karl. 1934. The Logic of Scientific Discovery. London, United Kingdom: Routledge Classics.


  1. Reference: Kuhn (1962)

  2. This argument is echoed by Kuhn (1962) and other philosophers such as Popper (1934)

  3. Kuhn (1962)

  4. Popper (1934)

  5. Here we follow the argument of Nassim Taleb’s series of works (https://www.fooledbyrandomness.com/)

  6. In summary, scientific theories must be stated in the form of testable statements, which are then put under the rigors of tests of falsification by way of empirical testing via the usage of probability theories and statistical methods. Reference: Popper (1934)

  7. Popper (1934)

  8. Lakatos (1978)

  9. Keynes (1921)

  10. Knight (1921)

  11. Finetti (2017)

  12. https://www.fooledbyrandomness.com/

  13. Choosing the network approach does not necessarily constrain the problem of induction; on the contrary, it helps us to hold many simultaneous views of the scientific questions to be tested. Network theory has the strong appeal of mathematical logic and rigor, dating back to the eighteenth-century mathematician Leonhard Euler.

  14. Barabasi (2016)

  15. Miller and Page (2007) and Miller (2015)

  16. Barabasi (2016)

  17. Miller and Page (2007) and Miller (2015)
