Conscious(ness) Realist

Publication Reviews and Commentaries
by Larissa Albantakis

Dynamical Emergence Theory


Consciousness as phenomenal experience is structured. Research into the neural correlates of consciousness and its contents may reveal the neural mechanisms that support conscious experiences. However, knowing which type of neural population activity corresponds to which contents of consciousness does not yet provide an explanation for why the experience feels the way it does. Dynamical Emergence Theory (DET) aims to address this issue by proposing a link between the structure of a system’s collective dynamics and that of the experiences it is capable of producing.

To that end, DET combines aspects of Integrated Information Theory (IIT) (Oizumi et al., 2014) and Geometric Theory (GT) (Fekete, 2010), an earlier proposal advocated by the same authors. DET starts with two guiding principles, “Inherence” (observer-independence) and “Structure” (a formal isomorphism between phenomenal structure and the structure of the system’s emergent macro states and their transitions). These necessary conditions are also deemed sufficient for phenomenal experience.

Why discuss this paper?

There is a strong tendency in the field of consciousness science to focus solely on the objective aspects of consciousness such as reportability, neural activity, and related functions, to the point that some even consider the subjective aspects of consciousness outside the realm of science (Cohen and Dennett, 2011; Doerig et al., 2019). However, the subjective character of consciousness can and should be studied in objective terms.

[IIT has set out to do so by evaluating the essential properties of experience and postulating a fundamental identity between the phenomenal structure of an experience and the cause-effect structure of its physical substrate (see Haun and Tononi, 2019 for a first attempt to map the experience of spatial extendedness to the cause-effect structure of a spatially organized, grid-like neural network, and Tsuchiya et al., 2019 for a more general discussion.)]

As Moyal et al. put it: “The veracity of any proposed mapping between a system’s dynamics and phenomenal content is testable, even if one is only willing to admit a strictly operational definition of awareness.” Specifically, DET postulates a formal isomorphism between the structure of phenomenal consciousness and that of its underlying neural population dynamics, which must be assessed in an observer-independent manner and are thus intrinsic to the system in question. Sound familiar? DET shares essential aspects of IIT, but also diverges from IIT on important issues. Discussing the similarities and differences may shed some light on both approaches.


What is DET? DET starts with two basic principles, “Inherence” and “Structure”. As the name “Dynamical Emergence Theory” suggests, though, DET rests on the additional, implicit assumption that phenomenal structure must correspond to a certain type of emergent dynamics. Specifically, it is assumed that changes in qualia correspond to transitions between separable macro states that emerge from the system’s collective dynamics. This assumption is not inferred from phenomenology itself, but instead largely rests on empirical observation (Moyal and Edelman, 2019) and a commitment to a multiply-realizable computational (as opposed to implementational) substrate of consciousness. Nevertheless, these properties together are taken as necessary and sufficient: Any physical system with collective dynamics that are captured by intrinsically discernable macro-state transitions possesses some degree of phenomenal experience.

Formally, DET proposes to evaluate phenomenal experience based on three measures: Representational Capacity (RC), the Amount of Experience (AE), and the Nature of Experience (NE). These measures are supposed to summarize various aspects of the structural properties of the computational substrate (CS), which corresponds to the possible macro-state transitions. The measures are not taken to be unique, but rather intended as one set of computationally tractable, practical measures with predictive and explanatory power and are only sketched in the present paper.

[Roughly, the RC represents the overall level of consciousness or arousal of a system, which is thought to depend on the topological complexity of the system’s dynamical structure taking all possible trajectories into account over a certain finite amount of time. The AE aims to capture the richness of the experience, which is constrained but not fully determined by the RC (compare for example a rich multi-sensory experience against silent meditation with closed eyes). Formally, it is proposed that the AE might correspond to the topological complexity of an individual state space trajectory, evaluated, for example, by convergent cross mapping through time-delay embedding (Sugihara et al., 2012). While the RC and AE are scalar measures, the NE should provide a structure that captures the similarity between experiences (across time or also across different systems) based on the structure of a system’s CS-level macro-states and transitions.]
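Time-delay embedding, the first step of the convergent cross mapping approach cited above (Sugihara et al., 2012), is simple enough to sketch in a few lines. The function below is my own illustrative reconstruction, not code from the paper: it lifts a scalar time series into a higher-dimensional state space whose trajectory structure could then be assessed topologically, roughly in the spirit of the proposed AE measure.

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Time-delay embedding of a scalar series x into R^dim.

    Returns an array of shape (len(x) - (dim - 1) * tau, dim), where
    row t is the delay vector (x[t + (dim-1)*tau], ..., x[t + tau], x[t]).
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim/tau combination")
    return np.column_stack(
        [x[(dim - 1 - k) * tau : (dim - 1 - k) * tau + n] for k in range(dim)]
    )

# Example: embedding a sine wave reconstructs a closed loop in state space
t = np.linspace(0, 8 * np.pi, 400)
emb = delay_embed(np.sin(t), dim=3, tau=10)
print(emb.shape)  # (380, 3)
```

The choice of `dim` and `tau` is, of course, exactly where the “time interval of interest” problem discussed below enters in practice.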

Experience in time — Structure right now?

Can an approach that evaluates a system’s dynamics over a certain period of time capture the structure of experience? After reading through the first two sections of this paper with enthusiasm, I realized that what I understand as the structure of experience does not seem to be addressed at all within the DET (as described). Right now, I sit in front of my computer, there are books and papers and coffee cups around me, all positioned in their place, with their respective colors; I am looking at the text that I’m writing, while I hear the clicking of the keyboard etc. etc. My experience right now is structured. However, according to DET my whole structured experience right now corresponds to one macro state. The NE measure is then supposed to capture similarities between experiences by similarities between macro states.

Granted, any physical property associated with experience should have a similarity structure that is isomorphic to the similarity structure between experiences, and that is an important insight that goes beyond mere neural correlates. However, there is a gap between matching the similarity structure across different experiences and accounting for the phenomenal content of a particular experience. How do I get from the macro state as a node in the NE to why that macro state feels the way it does?
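To make the distinction concrete, here is a toy illustration of what a similarity-structure comparison in the spirit of the NE measure might look like (the function names and the scaling example are mine, not the paper’s). Two systems whose macro states differ as physical vectors can nonetheless share an identical similarity structure, which is precisely why matching similarity structures leaves the content question open:

```python
import numpy as np

def similarity_structure(states):
    """Euclidean distance matrix between macro-state vectors (rows)."""
    d = states[:, None, :] - states[None, :, :]
    return np.sqrt((d ** 2).sum(axis=-1))

def structure_match(a, b):
    """Correlation between two systems' similarity structures
    (the upper triangles of their pairwise distance matrices)."""
    iu = np.triu_indices(len(a), k=1)
    return np.corrcoef(similarity_structure(a)[iu],
                       similarity_structure(b)[iu])[0, 1]

# A second "system" whose macro states are a uniformly scaled copy of
# the first has a perfectly matching similarity structure, even though
# its states are physically different:
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 4))  # 6 macro states, 4-dimensional
print(structure_match(A, 3.0 * A))  # ~1.0
```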

Going back to Fekete and Edelman (2011), the authors discuss this issue at some length and conclude that transient activity patterns must “somehow carry within them the causal constraints that force their orderly instantiation.” This certainly matches the idea behind IIT’s state-dependent causal analysis. Fekete and Edelman instead conclude that an experience must correspond to a temporally extended dynamical transient. To match human experience, such a trajectory segment “should extend anywhere from hundreds of milliseconds … to seconds …”. In the current paper, however, the same interval is thought to correspond to a discrete macro state.

In any case, the content of the experience in DET (and GT) seems to depend on the dynamical structure of the system far beyond the 500 ms time interval. At the same time, the authors emphasize actuality, criticising IIT’s dependence on all possible counterfactuals, including states that the system might never visit. But how could the content of my current experience depend on my actual past and future life trajectory (beyond being the system I am right now)?

[In this respect, I couldn’t help noticing that the algorithm outlined in (Allefeld et al., 2007) and cited here as a way to map micro states to macro states assumes that it is possible to reach any system state from any other state.]

Finally, the issue with dynamical trajectories is that they have to be defined over a certain time interval. The three measures proposed to evaluate the complexity and structure of the trajectory space certainly depend on that interval to some degree, yet the paper only refers to “some time interval of interest” throughout. Experience, however, is what it is, and the interval that determines its content cannot ultimately be arbitrary; leaving it arbitrary would also violate DET’s Inherence principle. Consciousness is definite in its content and its duration. This is IIT’s exclusion axiom. And while nobody seems to like it (including Moyal et al.), exclusion does a lot of work when it comes to accounting for experience in a non-arbitrary, consistent framework, and DET is lacking at least some of its aspects (see more on this below).

The boundaries of experience

DET distinguishes between the implementation level, where the system is defined as a set of elements with variable states evolving over (continuous) time according to a set of differential equations, and a multiply realizable, emergent computational level. This macro level is supposed to be self-organized and observer-independent, stable over time, discrete, and connected to the micro-level description through a mapping that preserves the topological structure (Allefeld et al., 2007).
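The spectral idea behind the Allefeld et al. (2007) micro-to-macro mapping can be illustrated with a toy Markov chain: eigenvalues of the transition matrix close to 1 mark slowly mixing, metastable sets of micro states, which play the role of quasi-discrete macro states. This is a deliberately simplified sketch of that idea, not the paper’s full algorithm; the threshold and function name are my own. Note that the example chain is irreducible, matching the reachability assumption flagged in the side note above.

```python
import numpy as np

def n_macrostates(T, threshold=0.9):
    """Count eigenvalues of the transition matrix T (rows sum to 1)
    whose magnitude exceeds `threshold`.  Eigenvalues near 1 indicate
    slowly mixing, metastable sets of micro states -- candidate
    quasi-discrete macro states in the sense of Allefeld et al. (2007).
    """
    ev = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    return int(np.sum(ev > threshold))

# Four micro states forming two weakly coupled pairs: the spectrum is
# (1.0, 0.98, 0.80, 0.78), so two metastable macro states emerge.
T = np.array([[0.89, 0.10, 0.01, 0.00],
              [0.10, 0.89, 0.00, 0.01],
              [0.01, 0.00, 0.89, 0.10],
              [0.00, 0.01, 0.10, 0.89]])
print(n_macrostates(T))  # 2
```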

By moving phenomenology to the macro level, the hope is to circumvent the boundary problem of consciousness (addressed in IIT by the “Exclusion” axiom and postulate) and to allow for multiple realizability of phenomenal experiences (in Fekete and Edelman, 2011, macro states were not explicitly required). Instead of causal exclusion, all inherent macro levels are considered simultaneously valid and, if I understood correctly, are thought to contribute jointly to the experience of the underlying physical system. According to DET, the experience and its corresponding CS thus span multiple levels of organization.

The first question here is why the micro level, or implementational substrate (IS), should not also contribute. Possibly because it is thought to be continuous, whereas the macro level, or computational substrate (CS), is required to be discrete, or rather “quasi-discrete”. Otherwise, there really is no reason to exclude just this one level from the hierarchy.

[The issue whether physics at the bottom is discrete or continuous is an ideological question that, as of yet, has no decisive answer within fundamental physics (to the best of my knowledge—let me know if you happen to know more).]

But how many levels would DET predict and how separable are those levels? We are, after all, not conscious of the interactions between molecules in our brain or over temporal scales of more than a minute. Would there really be no quasi-discrete macro-states corresponding to these levels?

More critical, though, is the question of what delimits the IS in the first place. The DET does not provide an answer here, which ultimately makes it incomplete: it does not solve or even address the problem of individuation. Why should I start with the IS of the brain and not that of the universe? Are ISs allowed to overlap? Leaving this issue unresolved is also inconsistent with the Inherence requirement: DET postulates observer-independence but does not provide a principled, observer-independent manner of identifying the borders of the IS, which after all still determine the boundaries of experience.

Guiding principle vs. axioms and postulates

While the authors assert the need for an axiomatic basis or a set of minimal assumptions for a theory of phenomenal consciousness, there is a crucial difference between IIT’s axiomatic approach and starting from an (arguably arbitrary) set of guiding principles. IIT’s axioms aim to capture the essential properties of every experience, which are converted into requirements—postulates—for a physical substrate. It is because a physical system that fulfills all the postulates should be able to account for all essential properties of experience that IIT’s postulates should be taken as both necessary and sufficient. DET claims sufficiency, but can it account for the essential properties of experience?

DET’s Inherence largely corresponds to IIT’s “Intrinsicality” axiom and postulate: experience is observer-independent. Thus, whatever physical properties correspond to experience, they also must be observer-independent and intrinsic to the system itself. The rest of DET is then squeezed into the Structure requirement, which contains elements of IIT’s Composition and Information axioms and postulates (see note above). IIT’s remaining postulates are explicitly deemed “in some respects ill-defined, subsumed in the first two [Inherence and Structure], or unnecessary.” Yet, as I hope to have shown above, DET remains incomplete and cannot account for the definiteness of experience across elements or time (exclusion) and arguably also does not capture the structure of my experience right now (composition). Since DET does not delimit the IS, it also cannot account for the unity of experience (integration).

Notable points of agreement between DET and IIT

a) Simulated states are not observer-independent and thus computers aren’t conscious. While DET sets out to be a “computational” theory of consciousness and explicitly aims for multiple realizability of phenomenal experience, physical implementation still matters: “The Inherence requirement, importantly rules out digital computation in its familiar form as a candidate medium for phenomenal experience”, since “representational states in digital computers are defined by convention—by means of an externally imposed mapping between the values of physical variables and the symbols they stand for—and therefore not intrinsic to their physical substrate.” The same reasoning applies in IIT; see my blog post on physical vs. functional states.

b) Phenomenal experience is causally effective. DET identifies phenomenology with physical structure in the form of state transitions between causally effective macro states. As such, phenomenology is causally effective, not just an epiphenomenon. The same applies to IIT, where the identity is postulated between the cause-effect structure of a system in its current state and the experience.

c) Multiple realizability. While the approaches differ, causal emergence in IIT (Hoel et al., 2016; Marshall et al., 2018) allows for multiple realizability of phenomenal experiences in a similar fashion as DET.

Then there is “the specious present” by William James and the problem with “fungible” elements, but I will leave those for another time.


The DET provides an empirically based account of consciousness that goes beyond evaluating neural correlates of consciousness, as the similarity structure of experience is explicitly taken into account. The authors start from many of the same premises and concerns that motivate IIT, and I urge everyone to look into (Fekete and Edelman, 2011). While I don’t agree with some of the conclusions drawn in this paper, the issues that are raised are important and too often neglected. It would be very nice to see the proposal worked out in more detail on some tractable toy examples. Moyal et al. emphasize practicality and aim for measures applicable to neurophysiological recordings. However, the value of toy examples to evaluate whether a proposal is consistent in a fundamental manner, irrespective of empirical limitations, should not be underrated.

Moyal R, Fekete T, Edelman S (2020) Dynamical Emergence Theory. Minds and Machines, 30, 1-25.