Rationalism Is Not Rational

#model

Rationalism à la Eliezer Yudkowsky has two connected fundamental flaws – an ontological and an epistemological one. Taken together, they make Rationalism an irrational program.

Rationalism is ontologically naïve

Rationalism is built upon a naïve Model of reality. For the Rationalist, Rationality means organising your beliefs so that they are in accordance with Bayes’ Theorem. In other words, it sees Bayes’ Theorem as operating on beliefs.
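For reference, this is the belief-updating reading of the theorem that the Rationalist has in mind: given a prior degree of belief in a hypothesis $H$, the posterior after observing evidence $E$ is

```latex
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```

On this picture, being rational just is keeping one’s degrees of belief in line with this update rule.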

In reality, beliefs exist only in our imagination: They are parts of an efficient, but simplified and potentially misleading model we have of ourselves – what Sellars called the “manifest image of man in the world” (Sellars 1963).

They don’t exist as components of the Systems that we are.

In actual Living Systems, Bayes’ Theorem operates as a Constraint on overall system behaviour – be it of a cell, an organism, an ecosystem or a civilisation.

Therefore, Rationality is an evolved strategy of a whole system, not an attribute determined by the relationship of certain parts of its self-model. Its criterion is evolutionary success, not internal coherence or perceived agreement with collected evidence.

Rationalism is epistemologically naïve

Bayes’ Theorem acts as a constraint on system behaviour and not as an algorithm because it cannot be efficiently implemented in real environments.

Systems instead engage in Free Energy Minimisation; in other words, for them to exist means to minimise surprise. Conformity with Bayes’ Theorem is the outcome of this strategy, not the strategy itself (see, e.g., Parr, Pezzulo & Friston 2022).
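In the standard active-inference formulation (as in Parr, Pezzulo & Friston 2022), the quantity minimised is the variational free energy $F$ of an approximate posterior $q$ over hidden states $s$, given observations $o$:

```latex
F[q] = \mathbb{E}_{q(s)}\left[\ln q(s) - \ln p(s, o)\right]
     = D_{\mathrm{KL}}\left[\,q(s) \,\|\, p(s \mid o)\,\right] - \ln p(o)
```

Since the KL divergence is non-negative and zero exactly when $q$ equals the Bayesian posterior $p(s \mid o)$, a system that minimises $F$ behaves as if it applied Bayes’ Theorem – Bayes-conformity emerges as the outcome, without any explicit calculation of posteriors.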

More practically, living systems exist in complex Environments under uncertainty, where Explicit Models and linearly stacked, individual probability calculations are inefficient, fragile and not scalable; in the long run they will thus be selected against – though not before potentially causing tremendous damage.

It’s probably no coincidence that Rationalism is so popular with engineers and the wider nerd culture, with its solution-focused tendency to mistake complexity for complicatedness. Effective Altruism’s utilitarian “shut up and calculate” approach (Yudkowsky) has the same problem with complexity.
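A minimal sketch of the scalability problem (illustrative numbers only): an explicit model that tracks a full joint distribution over n binary variables needs 2^n probabilities, so exhaustive Bayesian updating becomes infeasible long before an environment is even moderately complex.

```python
# Illustrative sketch: explicit probabilistic models blow up combinatorially.
# A full joint distribution over n binary variables has 2**n entries, and
# exact updating in the worst case touches all of them. (Structured models
# such as Bayesian networks do better, but exact inference in them is still
# NP-hard in general.)

def joint_table_size(n_variables: int) -> int:
    """Entries in a full joint distribution over n binary variables."""
    return 2 ** n_variables

if __name__ == "__main__":
    for n in (10, 30, 100):
        print(f"{n:>3} variables -> {joint_table_size(n):.3e} entries")
```

At 100 binary variables – a toy environment by real-world standards – the table already has more entries than there are atoms in the observable universe.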

In this sense, Rationalism is not rational – it is a naïve methodology based on a fantasy. Sandifer (2016) makes a similar point when calling Yudkowsky’s program “literary Bayesianism”.

Rational strategies

What will potentially be successful, and thus rational, are Implicit Models and iterative strategies based on larger-scale patterns – behaviour and Sensemaking strategies that are not consciously designed, but the result of longer-term natural and Cultural Evolution.

If such evolved behaviour has become maladaptive (e.g. through co-option by Ideology, particularly Consumerism), the solution cannot be to switch permanently to a strategy based on explicit models.

What we can (and should) do is use explicit models and the critique they enable to influence the parameters of cultural evolution so that more adaptive behaviour can emerge.

This is the purpose of Critical Theory and Conceptual Engineering.

