News + Thoughts


Meetings in Radical Aesthetics: (Dis)Embodiment

The idea is to bring people together on a regular basis to discuss radical, speculative or philosophical ideas about aesthetics. The Meetings in Radical Aesthetics are part conference, part research group, yet informal and open to the public; participants are asked to come somewhat prepared and arrive poised for discussion and debate. For each meeting, two artists or researchers are paired up and present a talk on a predetermined theme. This meeting will bring together distinguished artists and researchers Lynda Gaudreau and Armando Menicacci around the theme of “(dis)embodiment”.

Politics in the Anthropocene


Alexandre St-Onge and I are very pleased to present the 2nd Meeting in Radical Aesthetics, with distinguished guests Steven Shaviro and Erik Bordeleau.

The idea is to bring people together on a regular basis to discuss radical, speculative or philosophical ideas about aesthetics. The Meetings in Radical Aesthetics are part conference, part research group, yet informal and open to the public; participants are asked to come somewhat prepared and arrive poised for discussion and debate. For each meeting, two artists or researchers are paired up and present a talk on a predetermined theme. This second meeting will bring together distinguished writers and thinkers Steven Shaviro and Erik Bordeleau around the theme of “Politics in the Anthropocene”.

Steven Shaviro

Steven Shaviro teaches film studies at Wayne State University, and writes about process philosophy and speculative realism, about music videos, and about science fiction. His latest book is No Speed Limit: Three Essays on Accelerationism.

Erik Bordeleau

Erik Bordeleau is a researcher with the SenseLab and the author of Comment sauver le commun du communisme? (Le Quartanier, 2014) and Foucault anonymat (Le Quartanier, 2012). He is currently working on the mode of presence of ghosts, spirits and other surexistential forces in Taiwanese cinema. He has published several articles on cinema and contemporary thought, and collaborates with journals and magazines such as Le merle, 24 images, Inflexions, ETC., Hors-champ, ESSE, Scapegoat and Espai en blanc. He is part of Épopée, a collective cinema action group, which directed Rupture (2014) and Insurgence (2012), a cinematic essay about Québec’s most recent student strike.

Why Process is Extrinsic to the Digital Domain

Alexander Wilson, 2014

The term “process” usually connotes continuity. The canonical Heraclitean river, different each time one enters it, presents the essential character of process: it flows. Digital processes, however, are characterized by cuts, breaks, and jumps. The digital is given as a series of discontinuities. We look at the river flowing: it seems continuous, a unified flux. It is highly entropic, meaning it exceeds our capacity to resolve the minuscule details we assume compose it (water molecules). Every time you walk into it, though, it feels decisively different. It may be warmer today than yesterday; the current might be stronger, or it may have waned.

Process implies change. But change in the digital realm can only happen discretely: one moment we have one state, the next we have another. One moment the water is warm, the next it is cold. There is nothing in between, no process to speak of. In this sense the digital seems neatly to elide change. Nothing has been altered. The state space of the digital domain is finite; so it is as though the various moments in the digital process are given all at once, and for all eternity. Indeed, one might say they are “outside of time”, for these states, in themselves, are not affected by the process; they are exactly the same each time they are taken up. Process hence seems to move the digital from the outside.

The digital is inseparable from processes of “discretization”, as Bernard Stiegler observes. It is simply the process that cuts continuities into discontinuities, making unified, homogeneous mixtures discrete, nameable, mobile, functional. Stiegler usually looks at this as a (pre)historical or anthropological process (i.e., grammatization). But following Simondon, it is also possible to interpret it as an ontogenetic process. Simondon envisions the evolution of technology somewhat like a thermodynamic process: a crystal propagating through a supersaturated solution, where the unified mixture of molecules is progressively organized and structured, put to the service of the reproduction of the given symmetries ad infinitum, mechanically and algorithmically, until the favorable circumstances, or preindividual potentials, are exhausted.

The digital is intimately related to such an ontogenetic process of selection. It is as though the algorithm prefigures the digital. The if-then decisional procedure of the algorithm gives rise to the series of cuts typical of the digital process, as the ebbs and flows of the unidentified, unresolved chaos “outside” are sampled once a threshold of potential is met. Snapshot. The digital emerges in the moment of sampling; it is only in the moment a determinate system touches an indeterminate, un-incorporated preindividual “outside” that a “bit” comes into existence. Without this moment, the digital is not connected to any process: it is, like the crystal in the mind of the crystallographer, an eternal and infinite expanse of symmetries. Hence, again, the digital seems moved by processes outside of itself. The digital process, therefore, always implies the “analog”. But this crude term is hopelessly inadequate for signifying the monstrous radical contingency that hides in each sampled interval, in between the quantized cracks of our pixels and voxels.

But what is this outside? The outside, in a cybernetic sense, is simply that which interacts with a given system through its inputs and outputs. A system is said to be operationally closed, implying boundaries and a definite topological connectivity. It will typically be composed of various (continuous) flows feeding back upon themselves, according to a certain topological arrangement, implying certain thresholds, minima and maxima, varying ebbs of potential. The feedback loops themselves imply recursive processes. For digital cuts to emerge, loops in the process are necessary. The digital bit is born of a specific decision implied by the structural coupling of some chaotic outside with some defined inside: physicists will call this measurement or observation.

In order for a proper “digital process” to be conceived as a series of cuts, these cuts and breaks have to be recorded or inscribed in some context. The system must somehow change to receive the event. If it does not change, then nothing has happened. As people say when someone makes an unbelievable claim: “pics or it didn’t happen”. Indeed, nothing happens that is not somewhere inscribed in some context. This is Landauer’s principle: no information without implementation. For something to happen, it must happen to something. The very structure of information pivots on the fact that events leave a trace. To use Bateson’s adage, events make a “difference that makes a difference”. If the event does not get inscribed anywhere, it does not make a difference to anything or for anything, and hence did not happen.

It is important to underline that even mathematics is plagued by a fundamental randomness. Chaitin’s famous “Omega” exemplifies this: it distributes all possible decidable and undecidable computations in an algorithmically random manner. And it would seem that in cosmological physics too, with the advent of the holographic principle, the “observable universe”, much like the projection of a hologram, emerges from a discrete two-dimensional set of randomly distributed bits at the Planck scale, corresponding to the surface of the initial inflationary break from symmetry which gave rise to “everything”. Here “everything” should remain in scare quotes, because scientists will disagree as to whether this non-pattern truly implies totality. Like the “thumbprint of god” (a term once used to describe Mandelbrot’s fractal), the digital in its extremes seems to encounter an absolute contingency “outside of itself”. The “observable universe” is bounded by pure ungrounded randomness; and all process is a coupling of this randomness with some procedure for resolving it. The absolute outside implies the symmetry of pure randomness. And randomness is dialectically opposed to the asymmetry of the observer and the event observed. Similarly, the digital always already implies the procedural abstraction, the algorithm, that leads to its sampling and resolving.

Whether it be the decisive moment of a discerning subjective judgment or the quantized capture of an electronic converter, the discrete’s process is extrinsic to it. Contingency is not digital. Change is not digital. Process is absolute difference; it shuffles things around when you’re not looking, not selecting, not judging, so that the next time you peek something has changed. Difference, in itself, does not (yet) “make” a difference. It “makes” a difference in the moment of the break, the rerouting of the topology of the circuit, when it inscribes the event into some context: an intensity is measured, an aesthetic is judged, a value is attributed, a signal is sampled.

It seems to me the movement from the “prediscrete world” (that of our archaic ancestors, for example, who did not have a word for each object around them) to the grammatized world can be modeled on the process leading through a simple analog-to-digital converter, where a continuous (unified) flow is sampled and transformed into a series of discontinuities. Discretization opens up new possibilities; these are related to the “mobility” afforded to the discrete. The atomists believed that the void, the space between atoms, explained movement. In much the same way, the breaks between the discrete bits of the digital allow them to become mobile within their space of possibility in a way a continuous flux simply cannot. Digital information can be copied precisely, whereas analog information can only be copied at a loss. This capacity stems from the following fact: as “individualized” objects, bits are disconnected from their surroundings; they are mobile. I say “individualized” to reiterate that, following Simondon, this is a state which no longer harbors any intrinsic potential, for it comes at the end of a process which has exhausted its “preindividuality”. Its potential to become, to change, for Simondon, is predicated on this preindividuality. Thus there is a sense in which, once an object is perfectly objectified, that is, once it can be specified (copied) with absolute precision (as with a collection of digital bits), it does not contain within itself the capacities of process, change, becoming. Hence, any process of such objects must originate outside them.
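The passage through an analog-to-digital converter can be sketched minimally as follows (Python; the sine “flow”, the sampling rate and the four quantization levels are arbitrary stand-ins chosen for illustration):

```python
import math

def sample_and_quantize(signal, rate, duration, levels):
    """Sample a continuous function at discrete instants and
    quantize each sample to a finite set of levels."""
    n = int(rate * duration)
    samples = []
    for i in range(n):
        t = i / rate                                # discrete sampling instant: the "cut"
        x = signal(t)                               # touch the continuous "outside"
        q = round(x * (levels - 1)) / (levels - 1)  # snap to the nearest level
        samples.append(q)
    return samples

# A continuous "flow": a slow sine wave, with values in [0, 1].
flow = lambda t: 0.5 + 0.5 * math.sin(2 * math.pi * t)

bits = sample_and_quantize(flow, rate=8, duration=1.0, levels=4)

# Digital copies are exact: the finite state space makes the copy
# indiscernible from the original, with no accumulated loss.
copy = list(bits)
assert copy == bits
```

The assertion at the end marks the contrast drawn above: a copy of a finite series of states is exact, whereas every analog copy reintroduces noise.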

This again implies that process is something distinct from the digital. When we have digital bits, or when we have purely defined objects, we still don’t have an account of change or time. This is a big problem with object-oriented ontology, for example. It is also related to what Quentin Meillassoux describes as hyperchaos: interestingly, he arrives at the conclusion that contingency (ungrounded, unmotivated, indeterminate change) is necessary, because of the non-totalizability of the cosmos (the whole), as per the paradoxes of set theory. If the cosmos cannot “contain” itself, or if there can be no absolute whole of everything, he reasons, then we must posit the necessity of a kind of ungrounded change that escapes the determination of any holistic set of possibilities or laws.

In Deleuze there is further support for this idea. Pure difference, difference in itself, is a positive concept for Deleuze; it is not subordinated to representation, identity, negation, or any of the other features required for the digital domain’s ability to be specified with absolute precision (or precisely copied). At issue is more than just a relation between digital and analog or discrete and continuous. The pivotal relation is rather between “difference” and “difference that makes a difference”. In Deleuze, difference is never given; rather, it is that which gives the given. Difference has an ontological necessity to give us discrepancies and identifiable distinctions. What is given are discrepancies, distinctions, relations of identity and contiguity: what we generally call “information”. Deleuze insists that as things individuate, difference never gets “used up”. Only diversity, he says, is truly reduced in the process. Difference stays intact, he insists, surviving all possible transformations. Process, change, time stem from what we might call an “ontological incontinence”, a fundamental unrest of difference in itself, that is, before it gives the given or “makes a difference”. This is also related to his concept of “quasi-causality”, which conditions events through “surface effects”, yet which is “unproductive” in a way analogous to how difference does not get used up in causal process. Notably, he claims, after Valéry, that the deepest level of being is the skin, implying that such surface effects, which are “causally unproductive”, are what give us causal relations in the first place.

Which brings me to my point about Chaitin’s Omega. Omega is really a formal mathematical object, constructed to express the “probability” that a given program of N bits is decidable or undecidable. This is in the spirit of Turing and the decision problem, formulated as the question of whether a device churning through a computation would ever come to halt on a definitive “yes or no” result. We know since Gödel that mathematical systems are “incomplete”, because they allow expressions that are consistent yet cannot be proved. This responded to Hilbert’s first and second questions: mathematics is incomplete if consistent. Hilbert’s third question remained: is mathematics decidable? Turing’s machine was a thought experiment for responding to this. (It is very telling that his thought experiment actually paved the way for modern computing: the “general purpose computer” is what is mathematically called “Turing universal”.) As it turns out, using a technique similar to Gödel’s, but inscribed into the procedure of the machine, he showed that the decision problem is unsolvable; that is, there is no algorithm or shortcut procedure for knowing in advance whether a certain computation will halt, or whether it will keep on computing forever, alternating eternally between true and false, following the liar’s paradox. Chaitin sees this as fundamental, and extends it with Omega: what is the probability that a given program will halt? It turns out that this number, for programs of any arbitrary length, is a seemingly infinite and random stream of digits: even if you knew the number to the Nth place, you would still have no way of deducing what the next digit would be. It has maximal “algorithmic information content”: each digit in the expansion is a singular, irreducible event, “independent” in the mathematical sense, uncorrelated with the other digits in the sequence. The probability of a given computation being decidable is thus random in this sense.
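The halting problem can be made concrete with a toy model. The sketch below (Python; the four-instruction machine, its encoding and the step budget are all invented for illustration, and unlike Chaitin’s actual construction it ignores prefix-free coding and can only give bounded, computable answers) enumerates every short program for a tiny register machine and measures what fraction demonstrably halt:

```python
from itertools import product

def run(program, max_steps=100):
    """Run a toy register machine. Instructions: 0 = increment,
    1 = decrement (floored at zero), 2 = jump to start if the register
    is zero, 3 = halt. Returns True if the program halts within
    max_steps, False otherwise (a bounded stand-in for the
    undecidable general question)."""
    reg, pc, steps = 0, 0, 0
    while pc < len(program):
        steps += 1
        if steps > max_steps:
            return False          # presumed non-halting within the budget
        op = program[pc]
        if op == 0:
            reg += 1; pc += 1
        elif op == 1:
            reg = max(0, reg - 1); pc += 1
        elif op == 2:
            pc = 0 if reg == 0 else pc + 1   # possible infinite loop
        else:
            return True           # explicit halt
    return True                   # fell off the end: halts

# Fraction of programs of each length that halt within the budget.
halting_fraction = {}
for n in (1, 2, 3):
    programs = list(product(range(4), repeat=n))
    halting_fraction[n] = sum(run(list(p)) for p in programs) / len(programs)
```

The single-instruction program `[2]` loops forever (the register is zero, so it keeps jumping to itself), so `halting_fraction[1]` is 3/4. Chaitin’s point is that for a universal machine no step budget, however large, settles every case, which is why Omega’s digits are irreducibly random.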

To return to our “thumbprint of god” analogy, it is as though not only physics but also mathematics is undergirded by a fundamental randomness. I see this as equivalent to Deleuze’s idea of quasi-causality: an unproductive conditioning of everything, from an absolute ungrounded difference outside of the system. Everywhere we dig, we are confronted with this randomness. Chaitin notes that this denies Leibniz’s principle of sufficient reason and ultimately limits the purview of Occam’s razor. Yet in another sense I think it gives substance to Leibniz’s cosmological argument. But we should change his question from “Why is there something rather than nothing?” to “Why is there non-randomness rather than randomness?”, or “Why does there seem to be stability, locality, law-like repeatability, rather than absolute contingency all the time?”. I think it has something to do with observation itself. Randomness is that which requires no explanation. No one ever looks at a randomly distributed array of objects and asks: why are these objects randomly distributed? What requires explanation is that which diverges from randomness, because statistically, a non-random distribution is less probable than a random one. We are very good at identifying patterns. Humans, and indeed living organisms generally, are pattern recognition systems. In fact we are so good at it that we see patterns (coincidence) even when things are random (see Kahneman and Tversky). As we sample the “chaotic outside”, as we cut off a bit from its preindividuality and endow it with mobility and nameability, it becomes a determinant in time, an anchor, an irreversible fact, a point of equivalence between potential and actual, and is encoded in our very structure as pattern-recognizing agents.
In quantum physics, this is called the “collapse” of the wave function, or better, “decoherence” (see Zurek’s work), in which the observer, the act of measurement, or the event of interaction is central to the transition from randomness and non-locality to predictability and locality. Observation imposes a bias on reality: a “predictability sieve”, as Zurek calls it. Once a particle is “entangled” with another, their futures are interdependent and correlated. On the quantum level, a thing is here AND there at the same time. On the classical level of reality (where we carry out our lives), a thing is here OR there. Decoherence is the transition from the “and and and…” to the “or or or…”. Similarly, irreversibility and process emerge in the moment we measure. The event is the Aion: it cuts the reversibility of Chronos and gives us the irreversibility of the before and after. As in Hölderlin’s caesura, the beginning and end no longer rhyme. The “and and and” is difference in itself. The “or or or” is difference that makes a difference for something or someone.

To me all this suggests that process is necessarily “outside” the category of the digital.

Our stream at London Conference in Critical Thought 2015

Noötechnics presents: Noology and Technics: Algorithmic governmentality, automation and knowledge in the age of the digital economy

In developing her notion of algorithmic governmentality, Antoinette Rouvroy (2013) has recently questioned how regimes of operativity replace regimes of truth in an increasingly seamless and immanent reality. She characterizes ‘Big Data ideology’ by fluidity and reliability, in opposition to doubt and hesitation, as it develops a data-driven and quasi-exhaustive objectivity. Rouvroy’s diagnosis leads to the question: how is knowledge organised today, in the age of neoliberalism and the digital economy?

This stream aims to revisit the questions of Ideologiekritik as reframed by Deleuze and Guattari in 1980, who discarded the term ideology in favour of noology. At the time, noology was chosen to enlarge the understanding of ideology on the basis of an impersonal conception of thinking, and to do away with a certain Marxist legacy. For them, noology is not merely the study of the ‘inverted world’ and the mobilized idealism from which the masses need to be emancipated, but rather the study of the images of thought and how they are wired into society. In short, it is not brains that should be cured of ideology (such as religion or capitalism) but the modes of production. To be more precise, it is the machines and devices composing the noosphere that should be reimagined. That this noosphere is also a mecanosphere has never been more explicit: in today’s digital economy, relations of production are modulated by algorithms that predict behavioral responses to the market, pre-empting the ???? in the automated control of drives and desires.

Given this context, critical theory, after Benjamin, Adorno and Feenberg, is a matter neither of opposing nor of describing the contemporary world as post-thought (or post-poetry). What forms does ideology critique take in the age of the digital economy? What modes of production and consumption are forgotten or politically unthought? Noology implies ‘taking thought seriously’, and if today the images of thought are explicitly encoded in the algorithms of social control and modulation, our task is both to diagnose these nootechnical assemblages in their functional composition and to invent new ones.

We welcome contributions on such topics as the Big Data ideology, critical theories of the digital, the restructured post-2008 economy, algorithmic governmentality, the sharing economy, artificial general intelligence, digital labour, cognitive capital and accelerationism.

Please send abstracts for 20-minute papers to with ‘Noology and Technics’ in the subject line. Submissions should be no more than 250 words and should be received by the deadline of Monday 16th March 2015.

Meetings in Radical Aesthetics: #1 Conspiracy

Alexandre St-Onge and I are pleased to announce the inauguration of a new series of conferences and workshops: the Meetings in Radical Aesthetics.

The idea is to bring people together on a regular basis to discuss radical, speculative or philosophical ideas about aesthetics. The Meetings in Radical Aesthetics are part conference, part research group, informal and open to the public; participants are asked to come somewhat prepared and arrive poised for discussion and debate. For each meeting, two artists or researchers are paired up and present a talk on a predetermined theme. This first meeting will bring together writer and stage director Jacob Wren with intermedia artist and York University professor Marc Couroux around the notion of Conspiracy.

Friday, February 20th 2015 / Vendredi, 20 février 2015
7pm / 19h00

Studio Élan d’Amérique
5445 de Gaspé
Suite 324
H2T 3B2

Suggested Readings (prior to the event)

Marc Couroux

Marc Couroux is an inframedial artist, pianistic heresiarch, schizophonic magician, teacher (York University, Visual Arts) and author of speculative theory-fictions. His xenopraxis burroughs into uncharted perceptual aporias, transliminal zones in which objects become processes, surfaces yield to sediment, and extended duration pressures conventions beyond intended function. His work has been exhibited and performed internationally (Amsterdam, Berlin, Chicago, Glasgow, London) and published by Manchester University Press. With Asounder, a sonic tactic collective, he coordinated the (un)sound occupation workshop (collapsing sound and politics) in Toronto in 2013. He is a founding member of The Occulture (with eldritch Priest and David Cecchetto), a Toronto collective investigating the esoteric imbrications of sound, affect and hyperstition through (among other constellating ventures) Tuning Speculation: Experimental Aesthetics and the Sonic Imaginary, an ongoing workshop with yearly iterations, and the Sounding the Counterfactual stream at the 2014 London Conference in Critical Thought (a blog at documents their evolving thought-forms). Recent talks occurred at the Signal Path workshop (New York, Center for Transformative Media, Parsons), Kingston University (London), Goldsmiths, University of London and the Aesthetics After Finitude conference in Sydney, Australia. His hyperstitional doppelgänger was famously conjured in Priest’s Boring Formless Nonsense (Bloomsbury, 2013). He tweets as @xenopraxis.

Jacob Wren

Jacob Wren makes literature, performances and exhibitions. His books include: Unrehearsed Beauty, Families Are Formed Through Copulation, Revenge Fantasies of the Politically Dispossessed and Polyamorous Love Song, a finalist for the 2013 Fence Modern Prize in Prose and one of The Globe and Mail’s 100 best books of 2014. As co-artistic director of Montreal-based interdisciplinary group PME-ART he has co-created the performances: En français comme en anglais, it’s easy to criticize, Individualism Was A Mistake, The DJ Who Gave Too Much Information and Every Song I’ve Ever Written. International collaborations include: a stage adaptation of the Wolfgang Koeppen novel Der Tod in Rom (Sophiensaele, Berlin), An Anthology of Optimism (co-created with Pieter De Buysser / Campo, Ghent), Big Brother Where Art Thou? (a project entirely on Facebook co-created with Lene Berg / OFFTA) and No Double Life For The Wicked (co-created with Tori Kudo / The Museum of Art, Kochi, Japan.) He travels internationally with alarming frequency and frequently writes about contemporary art.

Knowledge Exchange Lecture at Goldsmiths London



We will look at aesthetics, media, technology, and the analog-digital divide from a materialist and “information theoretic” perspective, with the goal of illuminating the role of experience in the “anthropocene”, the age of ecological crisis and of the looming “post-human”. How might aesthesis be understood if subtracted from human judgment? What does experience consist of out there “in the wild,” beyond the human? What are its minimal criteria? The recent and under-examined materialist revival of “panpsychism” in the context of neuroscience and quantum physics (integrated information theory of consciousness) makes several claims about these minimal criteria of experience. We will compare these to the standard streams of discourse on aesthetics and subjectivity, articulating our investigation around examples drawn from the arts and media.

Alexander Wilson’s research investigates aesthetics beyond the human, in the mechanisms of emergence in nature, the science of complexity, computation, systems theory, contemporary cosmology, evolutionary dynamics and philosophies of process. His recently submitted dissertation (UQAM), Art in the Chaosmos, engages the philosophies of Whitehead, Deleuze, Guattari, Simondon, and Stiegler. He has published on various connected topics, including speculative pragmatics and speculative aesthetics. His interdisciplinary art practice deals with related concepts in performances and installations. As co-founder of Parabolik Guerilla Theatre, he also directed several multimedia works for the stage. He has formerly held positions at Concordia University in Montreal.

General Organology Conference

As a founding member of the Noötechnics collective, I am very pleased about our upcoming conference, General Organology.

General Organology

The Co-individuation of Minds, Bodies, Social Organisations and Technè

20-21-22 November 2014, University of Kent, Canterbury, UK.

Download Conference Programme

Register Online (for both guests and speakers)

Marking the 20th anniversary of the publication of Bernard Stiegler’s landmark book, La Technique et le temps 1, which first outlined the project of a general organology, this conference aims to survey the range of twentieth-century and contemporary philosophical accounts, scientific theories and technical innovations that intersect with an organological dimension. Within this overarching theme, the goal of the conference is to weave together different perspectives and disciplines, from the neurosciences to ecology, from the digital humanities to psychology, in order to identify and address contemporary issues that twenty-first-century philosophies have to consider. The objective is to enrich the philosophical understanding of the interrelations between natural, technological, psychological and social individuations in order to better read our present time and make appropriate plans for the future. With this in mind, we underline the philosophical priority of the question of knowledge, without confining it within merely cognitive bounds.

Over the last decade, we have witnessed spectacular progress in two fields of knowledge, namely digital technology and the neurosciences. These two fields of theoretical and practical knowledge are revolutionising all domains of human life, from economy to health care, from art to politics. Contemporary philosophies are urged to respond to these transformations. Not only are the effects of these phenomena fully transdisciplinary; in as much as digital technologies and brain sciences aspire to transform the human dimension of knowledge, the question of how to transcend neurocentrism and technological determinism remains. Both digital technology and neuroscience are reconfiguring a spectrum of issues with which philosophy has always been concerned, but which it now risks failing to address in their renewed form. These include the notions of desire, memory, imagination, the collective, and the role of writing, grammatisation and language itself.

Keynote Speakers:

Confirmed Speakers:

  • Jiewon Baeck
  • Riccardo Baldissone
  • Mariana Casanova
  • Patrick Crogan
  • Martin Crowley
  • Yuk Hui
  • Ian James
  • Ilan Kaddouch
  • Ganaele Langlois
  • Pieter Lemmens
  • Michael Lewis
  • Gerald Moore
  • John Mowitt
  • Carlos Natálio
  • Ali Rahebi
  • Ben Roberts
  • Estrella Rojas
  • Dominic Smith
  • Ben Turner

General Organology is organized by Noötechnics collective in collaboration with the Centre for Critical Thought, University of Kent (UK).

Recent interview with [HERE] Magazine

Interview with Ben Burnett of [HERE] MAGAZINE
[HERE]: When and how did you start making art?

Alexander Wilson: I’ve been producing media works for about 12 years, performance and theatre for about 7 or 8 years, and I’ve been making music in various forms since I was a teenager. Only in the past few years have I been finding ways to integrate all of my aesthetic influences and interests. I think one of the things that got me into art was its ritualistic aspect, the way it creates a space and time of criticality and reflection. I always loved the subtlety of the questions being posed in art, as though art were specifically geared to problematizing those more abstract realities we deal with, which cannot be treated with quantities and scientific reductions. Art, for me, is essentially a concern with sensation, affect, and in a certain way contingency, accident, eventfulness.

[HERE]: What is the message behind …of the trace? Can you give a rundown of the idea behind the exhibit?

Alexander Wilson: I’d rather not think of the work as having a “message”. I try to distance it from my intentions. Desubjectivise. I try to have the work emerge from the process. Its initial ideas come from observations of today’s hypermedia society, where everyone is walking around with TV broadcasting stations in their pockets. More than ever, our day-to-day behaviours are leaving traces behind in a kind of super-organismic meta-memory: our thoughts and actions are constantly being uploaded and immediately indexed for future retrieval. The “PRISM” affair is a timely example. But we are all participating in this indexing of what used to be spontaneous and provisional gestures into a global hypomnematon, where they become crystallized in history. There has emerged a certain “banality of the trace”, in the same sense in which Hannah Arendt wrote of the “banality of evil”. We have become accustomed to being recorded, indexed, google-searched. Cyber-bullying is another example. With …of the trace, however, this idea is a starting point. I then try to create a kind of evocative microcosm for it. A laser system scans the surrounding architectural features of the space, tracing lines that conform to the movements and bodies of the people in the room. (Visitors’ bodies and gestures are captured with a video camera. Custom software extracts their outlines and morphs them with the outlines of the architecture. These lines are then traced back onto the architecture with a laser scanner. At some moments the algorithm accumulates and replays the visitors’ motions; at others the reactivity happens in real time. The piece is hence very site-specific, since every space will give a very different result.)
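The morphing step described here, blending a visitor’s outline with an architectural outline, could be sketched as follows (Python; the function name, the toy point lists and the linear-interpolation approach are assumptions for illustration, not the installation’s actual software):

```python
def morph(outline_a, outline_b, t):
    """Linearly interpolate between two outlines, given as lists of
    (x, y) points assumed to be resampled to the same length:
    t=0 returns the first outline, t=1 the second."""
    assert len(outline_a) == len(outline_b)
    return [((1 - t) * ax + t * bx, (1 - t) * ay + t * by)
            for (ax, ay), (bx, by) in zip(outline_a, outline_b)]

# Toy data: a visitor's outline (small square) and an architectural
# feature (larger rectangle), each as four corner points.
visitor = [(0, 0), (1, 0), (1, 1), (0, 1)]
wall    = [(0, 0), (4, 0), (4, 2), (0, 2)]

halfway = morph(visitor, wall, 0.5)   # a blend of the two shapes
```

In the installation itself the resulting point list would then be sent to the laser scanner as a drawing path; sweeping t over time would animate the visitor’s silhouette dissolving into the architecture.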

[HERE]: Your exhibits are so indebted to the work of sound and specifically sound in a place, where do you draw your influence for these public displays of sound?

Alexander Wilson: Though …of the trace may have some elements in common with the digital minimalism of audiovisual artists like Carsten Nicolai and Ryoji Ikeda, I am much less formalist in my approach. My involvement in performance with Parabolik Guerilla and in electronic music improv with, for example, K.A.N.T.N.A.G.A.N.O., feeds an element of provisionality into my audiovisual works. I want the installation to keep some of the rawness and spontaneity I find in the experimental music and performance art scenes, which are less contrived than many currents in new media. The images produced here are more organic than geometric, and the overall feel is more analogue-like. The synesthetic use of audio signals to drive a laser-steering mechanism will remind some of Robin Fox’s or Edwin van der Heide’s shows, and of some of my own performances (The Fine Tuning). But in this piece, the audio signals driving the laser are made to conform with the architecture. So in a way, the image drives the sound, which in turn drives the image. The noise visitors will hear in the installation is the actual signal used to trace the lines.

Asymmetry and Subjectivity

The following is an excerpt from: Pharmacology of the Feedback Loop, Alexander Wilson, 2012

Even if the question of the symmetry of time is still debated in the philosophy of science, it is undeniable that if we agree with the probabilistic monism of contemporary science and posit that the universe is time-symmetric, or temporally reversible, then it becomes difficult to explain temporally directional phenomena, like the thermodynamic arrow of time or the outward propagation of the wave, without recourse to some notion of subjectivity. We are to understand that if the universe we see appears to move irreversibly in the direction of higher entropy, it is because we are oriented within it: we are negentropic organisms, and thus pointed in the direction of the universe’s expansion and cooling. Our being-toward-death is perhaps nothing more than a reflection of this negentropic resistance to entropy. The way contemporary physics sees it, this orientation in time corresponds to our asymmetrical selection of the universe due to our thermodynamically oriented nature, whereas “objectively”, time is perfectly symmetrical and all processes are reversible.
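The tension this paragraph trades on can be stated compactly. The following schematic summary of the standard physical account is added for clarity and is not part of the original essay:

```latex
% Microscopic dynamics are time-reversal symmetric:
% if $q(t)$ solves the equations of motion, so does $q(-t)$.
% Thermodynamics, however, singles out a direction:
\[
  S(t_2) \;\geq\; S(t_1) \quad \text{for } t_2 > t_1
  \qquad \text{(second law, isolated system)}
\]
% On the standard account, the asymmetry enters through boundary
% conditions (a low-entropy past), not through the laws themselves:
% this is the ``orientation'' the text ascribes to negentropic observers.
```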

Now let us recall the argument against the absolutization of totality, grounded in the theory of transfinite sets, that Meillassoux (after Badiou) develops to defend his speculations on the necessity of contingency. His argument is that it is impossible to totalize the all, for according to Cantor’s theorem, the set of a set’s subsets is always strictly larger than the set itself. A set of all sets would have to contain each of its own subsets as an element, and would thus contain more than it contains, which is evidently paradoxical. We are left instead to imagine an infinite series of inter-embedded sets of increasing cardinality, the set of all sets perpetually surpassed by its contents, outnumbered by its parts. Moreover, the impossibility of totalizing the set of all sets poses an obvious problem for any cosmology that defines the universe as time-symmetric. For the time symmetry of the universe rests on the possibility of a temporal objectivity, in the sense of a point of view that sees all, or in other words, a set of all sets. This suggests that we should question the plausibility of the current posture in cosmology, which insists that the asymmetry we observe is illusory in relation to a symmetrical reality, since in order to be symmetric, it must also contain itself as an objective totality.
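The transfinite argument can be made precise with Cantor’s theorem, the formal engine behind the Badiou/Meillassoux point. This compact restatement is added for clarity and is not part of the original essay:

```latex
% Cantor's theorem: for any set $S$, $|S| < |\mathcal{P}(S)|$.
% Proof sketch (diagonalization): suppose $f : S \to \mathcal{P}(S)$
% were onto, and let
\[
  D = \{\, x \in S : x \notin f(x) \,\}.
\]
% Then $D = f(d)$ for some $d \in S$, whence
\[
  d \in D \iff d \notin f(d) = D,
\]
% a contradiction: no surjection $S \to \mathcal{P}(S)$ exists.
% A ``set of all sets'' $U$ would contain every one of its subsets as an
% element, forcing $|\mathcal{P}(U)| \leq |U| < |\mathcal{P}(U)|$ ---
% the impossibility of totalization invoked above.
```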

It is only from the subjective point of view that there is asymmetry, and only from an absolutely objective non-point of view that there is pure symmetry. But it is my contention that, following the paradoxes of transfinite sets, both of these poles reveal themselves as untenable. Meillassoux is correct in that there can be no pure objectivity, for the set of all sets cannot be totalized. But what Meillassoux misses is that this argument holds for all sets, not just the set of all sets. It also means that there can be no pure subjectivity, no pure closedness in the first place, because by definition subjectivity, as a subset of the whole, also cannot contain itself. It is not closed: it escapes itself through its parts, bootstraps itself out of itself. Subjectivity as asymmetry presupposes the very correlationist exclusion and veiling he opposes. Paradoxically, it is the same self-veiling nature of (self-)observation that Meillassoux opposes in the individual (and reduces to linguistic correlationism or idealist solipsism) that allows in the first place for his argument against the totalization of the set of all sets, for if the whole can never be totalized it is because “it must first cut itself up into at least one state which sees, and at least one other state which is seen. […] In this condition it will always partially elude itself.”

The universe can thus not be objective, for objectivity is outside of time (which is subjective) and can thus not account for the contingent event. But there is a sense in which Meillassoux’s argument for a hyperchaotic universe leads him to defend a position as untenable as that of the solipsist.[1] Hyperchaos can thus be applied nowhere and to nothing: it does not allow for the actualization of the event, for no set can contain itself enough to be affected by an event; no set can lose itself to the event, for it has nothing to lose in the first place, or more precisely, it is always already lost to a single event of self-observation. As Deleuze would say, nothing in this paradigm becomes the event’s quasi-cause, for there is nothing for the event to happen to. Indeed, hyperchaos is the inversion of solipsism: to the Berkeleyan idealist solipsism of anti-symmetry and absolute closedness, where nothing really happens, hyperchaos substitutes the absolute symmetry of the non-totalizable openness of the non-all, where there is nothing to happen to.

But the pharmakon seems to point us toward a third alternative, one that allows us to understand the world as a hybrid of closure and openness, between hypersolipsism and hyperchaos.[2] Objectivity and subjectivity only obscure the depths of the strange attractors that tessellate the pre-individual field. We should avoid reducing everything to one or the other, to the marked or the unmarked, for it is what happens in the interstice that counts: there lies all the richness, the complexity, the consistency of the world.

So, to the open and hyperchaotic universe of Meillassoux, and to Luhmann’s universe of inter-embedded closures, I believe we must oppose the universe that Deleuze, after Joyce, called a chaosmos: a universe that is intrinsically hybrid, stratified between chaos and order, between openness and closure. In this view it becomes equally nonsensical to consider time as a process that passes from the past to the future. Indeed, according to Deleuze, time can only pass from the virtual to the actual, and this by traversing the intensity of the plane of consistency. All the poles of the dichotomies of self-other, subject-object, interior-exterior, remedy-poison dissolve into the consistency of the process of actualization, just as the inside and outside unfold out of the pharmakon. And the mistake is always to try to reduce everything to one of the many attractors in the chaosmos. The universe is hybrid, and the pharmakon is its consistency.

The Big Bang might be construed as the fixed-point attractor of objectivity. Like the first distinction, or the arche-trace, it is a substantive mark, and can only supplement actualization. It holds all of becoming in a single point outside of time (the cosmological singularity). But each moment of self-reference produces its own intensive singularity, which bifurcates from the spatial plane and produces its independent temporal axis. There is therefore, strictly speaking, no Big Bang, for all the little bangs that compose it resist their totalization. The universe would be better understood as a web of strange attractors, chaotic attractors whose first characteristic is to be withdrawn, plunged into the infinite regress toward a multiplicity of initial conditions.

Let us take one last example before we conclude. According to the discoveries of complexity theory, such as in the work of Stuart Kauffman, self-organising systems seem particularly attracted to the edge of chaos.[3] A self-organizing system is not simply selective and retentive, not simply negentropic: in order to be efficient, it must situate itself on the edge of chaos. A system that is too ordered remains fixed and crystallized, incapable of evolving, of varying, of creating new forms that can survive in a changing environment. By contrast, a system that is too chaotic cannot hold enough order to contain itself, nor enough structure to inscribe a memory, a logic, a program. But between these two poles there exists a third regime: where order is liberated from the fixed-point attractor and rises into differentiation, but also where chaos allows itself just enough body not to evaporate and lose itself. Christopher Langton, in the 1990s, even found that the universal computer can emerge in the thin layer that separates the regime of order from the regime of chaos. For it is there, between order and chaos, that virtual computing can spontaneously occur, as described by Alan Turing in 1936: a virtual machine that reads both the description of the machine it simulates and the data it computes from the same series of variations inscribed on the edge of chaos.
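This edge-of-chaos computation can be illustrated, though not reproduced, with the simplest cellular automaton known to support universal computation: elementary Rule 110, proved Turing-complete by Matthew Cook. The sketch below is an assumption-laden toy, not Langton’s original experiments with the λ parameter; it merely shows the kind of persistent, interacting structures that such rules generate between frozen order and noise.

```python
# Elementary cellular automaton, Rule 110: a one-dimensional system that is
# neither frozen nor fully random, and is Turing-complete (Cook, 2004).
# Illustrative toy only; not Langton's lambda-parameter experiments.

RULE = 110  # the rule number's bits encode the next state for each 3-cell pattern

def step(cells, rule=RULE):
    """Apply one synchronous update with periodic boundary conditions."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15):
    """Start from a single live cell and return the history of rows."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells)
        history.append(cells)
    return history

for row in run():
    print("".join("#" if c else "." for c in row))
```

Rules that are slightly more ordered (e.g. Rule 0) die out immediately, and slightly more chaotic ones (e.g. Rule 30) dissolve into noise; Rule 110 sits in the thin intermediate layer the text describes.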

So if we grant the edge of chaos this potential capacity to virtualise autonomous universes, to simulate them as Turing’s machine does, then the self-organizing universe escapes itself perpetually in its race toward higher entropy. The edge of chaos is thick: if the virtual is reborn on this edge, if little universes are simulated somewhere between the birth and the heat death of the universe, it also means that perhaps, somewhere on the path to higher entropy, the universe reaches a zone where the system’s behaviour bifurcates into heterogeneous dimensions. This recalls physicist Lee Smolin’s theory of the fecund universe. He speculates that each black hole in the universe produces its own Big Bang behind its event horizon, with its own forces and constants. Each universe only produces offspring (little bangs) to the extent that it produces black holes. The black hole acts upon the system of the universe somewhat as the self-referential statement acts upon the mathematical system. Smolin’s black hole is the pharmakon of the universe, its exteriorization, its escape, its flight from self, just as Gödel’s self-referential statement is the pharmakon of mathematics. The pharmacological is composed of this fractal-like web of inter-embedded holes and strange attractors that weave themselves around each process of individuation and constitute its preindividual field.

The pharmakon in this sense relates to the concept of the ( )hole complex in Negarestani’s Cyclonopedia, where each whole escapes itself through its own self-referential (black) holes, which form where cyclones and spirals conspire and become complicit, sharing inward folds and spires, and perpetually defer totalization. The point here is that the pharmakon should no longer be reduced to a mere undecidability between outside and inside, between poison and remedy, for it is rather the consistency of individuation itself. Absolute closure and absolute openness are nowhere to be found. There are only pre-individual individuals, only partly discrete identities, only partly continuous continuums, for the fabric of the universe is intrinsic hybridity, and the body of becoming is composed not of organs but of holes, escape tunnels and lines of flight.

(read full paper here)

[1] I agree here with Peter Hallward’s argument against Meillassoux :

Peter Hallward, “Anything Is Possible: A Reading of Quentin Meillassoux’s After Finitude”, in Bryant, L. R., Srnicek, N., & Harman, G. (Eds.). (2011). The Speculative Turn: Continental Materialism and Realism. Melbourne: re.press, pp. 130–141.


[2] Against the predominance of the closed system in second-order cybernetics, Mark Hansen has proposed an interesting alternative, with which the pharmakon aligns perfectly, that he calls the system-environment hybrid, drawing on, among others, Simondon’s theory of individuation and Guattari’s notion of the machine.

Mark B. N. Hansen, “System-Environment Hybrids”, in Clarke, B., & Hansen, M. B. N. (Eds.). (2009). Emergence and Embodiment: New Essays on Second-Order Systems Theory. Duke University Press.


[3] Kauffman, S. (1996). At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (1st ed.). Oxford University Press, USA.


Parabolik Guerilla’s Organon: set to première in winter 2014

It looks like the new stage production I’m working on, Organon, will debut in winter 2014. This gives us a year to complete it. I’m often amazed at how long it takes, in the Montreal context at least, to get stage productions off the ground. We have been working on this in pre-production mode for almost three years already!
Organon is an upcoming stage work I’m co-directing with Mélanie Verville and Parabolik Guerilla Theatre. The performance engages with the concepts of emergence and evolution, and is more specifically concerned with the processes that occur in the interstices between the various levels of organization that make up the complexity we observe in nature, from sub-atomic interactions to psycho-social organizations. The transfer of memory and the acquisition of knowledge (cognition) are herein regarded as symmetrical with the abstract information transfers that occur from one level of organization to another, in the energy-dissipation-driven self-organization of the cosmos.
Performers: Catherine Cédilot, Mélanie Verville
Live Music: Félix Antoine Morin, Alexander Wilson
Set & costumes: Geneviève Boivin
Direction: Mélanie Verville, Alexander Wilson.
