Real Non-Patterns and the Automation of Science

Alexander Wilson

In a now-famous Wired article from 2008, “The End of Theory,” Chris Anderson predicted that the algorithmic brute-force analysis of increasingly large data sets would soon challenge the relevance of theory. By training machine learning models on raw physical data, we can, as Hong Qin recently put it, replace theory with a “black box that can produce accurate predictions without using a traditional theory or law.” Another example is Udrescu and Tegmark’s “AI Feynman”, explicitly meant to automate the human physicist’s general toolset of pattern-recognition strategies. Indeed, if the goal of science is just to make better and better predictions, theories seem redundant: everything reduces to Bayes’s theorem. Why should science waste time making itself digestible to human minds in the form of causal accounts, equations, visualizations and so forth, when it could just mechanically go from the data to the prediction?
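
To make the reduction concrete: if prediction is all that matters, the whole inferential apparatus can be summarized by a single identity, updating a hypothesis $h$ on data $d$. The rendering below is a schematic gloss of the paragraph’s claim, not a formula drawn from Anderson, Qin, or Udrescu and Tegmark:

$$P(h \mid d) = \frac{P(d \mid h)\,P(h)}{P(d)}$$

On the “end of theory” view, the black box simply iterates such updates over ever larger data sets, with no requirement that $h$ ever take the shape of a humanly readable law.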

The demise of causal explanation was foreshadowed in the 20th century’s progressive erosion of traditional conceptions of determinism. The kind of causality we have evolved to be familiar with, the classical “billiard ball” causality where a historical account can be given of one true sequence of events, is represented neither in general relativity nor at the quantum level of description. Newton’s equations were already reversible, but general relativity furthermore seemed to imply that causality was underdetermined by spacetime. In quantum mechanics, the violation of the Bell inequality made it ever more unlikely that anything like determinism actually holds at the fundamental level. As Karen Barad points out, the quantum eraser experiments show that the past is never written in stone; it can be erased and rewritten. Judea Pearl’s so-called do-calculus reveals a logical corollary of all this: causality is just the question of what happens when someone does something, intervenes, collapses the probabilities onto a known certainty, immediately directing the graph and introducing an acyclicity into the world. It seems that there can be no fact of the matter about any historical succession of events, which we have, perhaps since the dawn of writing, assumed to be fundamental. While the great picture of a cosmological timeline from the big bang to heat death still lingers, it has become clear since the 1980s that the rules and constraints of quantum computation are the same as those that act on all quantum systems, including entire universes. Blurring the old boundary between physics and metaphysics, this has prompted some physicists to ditch the causal mode of explanation (initial conditions + laws of motion) in favor of studying the underlying realizable or constructible possibilities offered by the computational view of physics.
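
Pearl’s corollary can be stated in one line: the interventional distribution $P(y \mid \mathrm{do}(x))$ is in general not the observational conditional $P(y \mid x)$, and under his backdoor criterion it is recovered by adjusting over a suitable set of covariates $z$. The identity below is a standard piece of the do-calculus literature, offered here only as a sketch of what “doing” adds to “seeing”:

$$P(y \mid \mathrm{do}(x)) = \sum_{z} P(y \mid x, z)\,P(z)$$

Intervening on $x$ severs the arrows pointing into it, which is precisely the sense in which the act of intervention directs the graph.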

The last century also saw a shift in mathematics, where the axiomatic method was turned inside out. The old atomistic, set-theoretic view (bags of dots and their one-to-one correspondences) has progressively been replaced with a view adopting much more flexible and promiscuous (rigorously less discriminate) conceptions of identity and equivalence, where objects derive their identities, and logic its semantics, not down in the atoms but, as it were, “upwards” in higher levels of relationality. As noted in Piaget’s assessment, the mid-century structuralisms were still motivated by the concepts of group theory, and thus still reducible to set theory. But the revival of structural thinking today is increasingly informed by category-theoretic intuitions that “look down” on the traditional atomistic conceptions of reality as mere “special cases”, and jettison the idea of there being ultimate particulars in the world, ultimate substances or “relata”. In this vein, James Ladyman and Don Ross exploit Daniel Dennett’s concept of the Real Pattern to ontologize the notion of structure, such that the real is whatever allows itself to be losslessly compressed into a more economical structure, generalizing the principles of algorithmic information theory.
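
The compression criterion can be illustrated in a few lines of code. The sketch below uses Python’s zlib as a stand-in for an ideal compressor; Ladyman and Ross’s formal apparatus appeals to algorithmic information theory rather than to any particular algorithm, so this is only a loose operational analogue:

```python
import os
import zlib

# A "real pattern" in Dennett's sense: the data admit a description
# far shorter than the data themselves.
patterned = b"ab" * 5000          # 10,000 bytes of pure repetition
random_ish = os.urandom(10_000)   # 10,000 bytes with (almost) no exploitable pattern

print(len(zlib.compress(patterned)))   # a few dozen bytes: compressible, hence a pattern
print(len(zlib.compress(random_ish)))  # roughly 10,000 bytes: effectively incompressible
```

On the ontologized reading, only the first sequence tracks something real; the second is, in the terms of this essay, a non-pattern.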

This state of affairs warrants asking to what extent the new category-theoretic structuralisms and type-theoretic cosmologies will have responded to the concerns expressed by “post-structuralism” (in particular, Deleuze and Derrida). If the collapse of traditional conceptions of determinism and causation means that structuralism can no longer be accused of having ignored the structure’s genesis, what has become of the status of notions like incompletion, chaos, difference, and non-pattern, which once motivated its critique? If all that exists is pattern, if the only real is that which we can successfully reduce, diagram, index, name, quantify, qualify, and track to various degrees of practical applicability and generalization, then what must we make of all those regions of the world where our powers of pattern recognition, whether organic or automated, fail? What about those parts of the world we can’t compress, those points of catastrophe, those trans-world encounters, or what Giuseppe Longo has likened to “frictions” with the real? It is now a very old question: is matter that which is given, or that which resists being taken? Is the world just made up of whatever percepts we can distil into the “universality” of concept, or of that which exhibits recalcitrance to abstraction? Is the real not what makes science hard in the first place?

https://www.youtube.com/watch?v=_rbntD02Iho