**Links to**: [[Turing]], [[Function]], [[Undecidability]], [[Chaitin]], [[Intractability]], [[Identity]], [[Difference]], [[10 Bias, or Falling into Place]], [[Entropy]], [[12 Negintelligibility]], [[Computational irreducibility]], [[Structure]], [[Noise]], [[Shannon]], [[Weaver]], [[Cybernetics]], [[Statistics]], [[John von Neumann]], [[Bit]], [[Binary]], [[Combinatorics]], [[Simondon]], [[Trace]], [[Legibility]], [[Gregory Bateson]], [[Foundation]].

>‘[C]omputation’, index[es] a specific humiliation of the human under the rubric of _Turing trauma_.
>
>Cavia, “Turing Trauma”, 2024.

>I believe that at the end of the [20th] century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.
>
>Turing, 1950, p. 442.

### Working definition (not yet a [[Postulate]]):

Computation is the observed condition where the _contingent_—that is: observer-relative competing processes of differentiation (e.g., organism versus dissipation)—reveals the capacity to develop and follow rules, and thus create equivalence or _sameness_ in order for (conditions allowing) continuing to compute.

_A function is computable if and only if it can be computed by a Turing machine._ This classic understanding of computation is straightforward: any **function** that can be computed needs to be able to enter a [[Turing machine]] and **stop**, rendering a result. All processes of differentiation, until there appear, on our scene, different ways to access them, can be understood as computations, if we accept open-endedness. A leaf effectively computes differences by parsing sunlight, nutrients, etc., running a function that has clear inputs and outputs (although these are not so clear if we consider the leaf within ecological feedback cycles, but the boundaries can be drawn when needed/desired, if only schematically).
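The halting requirement in the classical definition above can be made concrete with a minimal sketch. The machine, transition table and helper names below are my own, purely illustrative (not from the source): a function counts as computable here only insofar as the machine reaches its halting state.

```python
# A minimal Turing machine sketch: the machine below computes the unary
# successor function (n marks in, n + 1 marks out), and HALTs — which is
# exactly what makes the function it embodies computable.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10_000):
    """Run until the machine reaches the 'halt' state; return the tape contents."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            # The machine stopped: it rendered a result for this input.
            cells = [tape[i] for i in sorted(tape)]
            return "".join(cells).strip(blank)
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("no halt within step budget")

# Hypothetical two-rule table: scan right over 1s, write a final 1, halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run_turing_machine("111", rules))  # '1111' — unary 3 -> 4
```

Swapping in a different `rules` table gives machines that may or may not halt, which is where undecidability enters.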
The fact that certain functions (e.g., sensitivity to light) can be presented as substrate-independent means we can talk about computation enabling _sameness_ ([[Equivalence]]). The things humans have developed in order to understand and develop rules thus far are often termed _computers_, which in turn inform how we might view computation in other domains. Like with most other concepts in this project, we take an ample view of the concept. Another framing of it could be: _computation is the observation of change through formal means_. Formalities, of course, depend on the domain under analysis and its rules and scales (sameness is therefore defined by our biases ([[10 Bias, or Falling into Place]]) and interests ([[06 Principle of Sufficient Interest]])). In other words, computation can be understood as the “diagnosis of contingency” (Cavia 2024, p. 10).

All observed things can be understood as computing because they imply categorial organization through perception and other complex (techno/logical) means. Regardless of stopping here and there, computations are always open-ended because they are inevitably connected to other computations (not to speak of uncomputable things, and/or results which deal with the chaotic or irreducible: [[Computational irreducibility]]). Computation denotes a dynamic process of differentiation, and can be observed at different scales. Therefore, for our purposes, computation is an interesting concept because it evades the life/non-life distinction: all changing things can be understood as information-processes, hence our interest in computation in this research. Computation seems to be the concept that emerges when we talk about how observer-dependent, contingent processes maintain themselves through pattern-reliance, pattern-formation and pattern-following.
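The substrate-independence claim above can be sketched with a toy example (entirely my own; the threshold value and function names are hypothetical): two different "substrates" that implement observationally the same light-sensitivity function, which is what licenses talk of equivalence or sameness.

```python
# Sketch of "substrate independence": two implementations, one function.
# The 100-lux threshold and the open/closed states are invented for illustration.

def threshold_lookup(lux):
    # "Substrate" 1: a lookup over a precomputed table.
    bands = {True: "open", False: "closed"}
    return bands[lux >= 100]

def threshold_arithmetic(lux):
    # "Substrate" 2: a direct arithmetic comparison.
    return "open" if lux >= 100 else "closed"

# Observationally the same function over this input range — sameness by
# input-output equivalence, regardless of how it is realized:
assert all(threshold_lookup(x) == threshold_arithmetic(x) for x in range(300))
```

The equivalence here is behavioural, fixed by the observer's chosen input range and granularity, which is where the biases and interests mentioned above re-enter.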
Patterns are an inevitable starting point: computing systems _evolve_ patterns to maintain their own existence against other competing patterns. Rules in formal computing are often understood as explicit *instructions* or *criteria* that define how to proceed or what counts as correct, and against what test (infinity/computability, encryption, randomness, etc.). Rules are often understood as meta-patterns (see, e.g., Sellars 1954) that come to _constrain_ patterns: they are *prescriptive*. Patterns, by contrast, are usually understood as _givens_,^[However un-given and constructed they may be.] as “natural” regularities or structures. They are therefore understood as descriptive, representing observable structures that should be, scientifically at least, the same to all observers wishing to put them to the test (“reality is the thing which does not change regardless of my desire to change it”). However, since observing-computing phenomena (human beings, trucks, seas, galaxies) are desiring things^[They have _tendencies_.] which are given to the affordances around them: all patterns are rules and therefore prescriptive. This relates to what we call [[Autosemeiosis]].

What is the advantage of this amorphous pancomputationalism? I have not settled my thoughts on the matter—they are _undecided_—but for now I view this as offering a pragmatic, technical window into something that otherwise remains truly undecidable. Below are some unfinished sub-entries.

&emsp;

### Mortal computation

Because we sleep and die, we do not know what computation _really_ is. See Ororbia and Friston for the concept of [[Mortal computation]].

&emsp;

### Language computers

(The bit below is extracted from [[04 Concepts as pre-dictions]] and remains, for now, a mere reminder to continue writing about computation, LLMs, abstraction and negation).

L.
Horn’s (1989) _A Natural History of Negation_ opens: “All human systems of communication contain a representation of negation.”^[Horn’s quote continues: “No animal communication system includes negative utterances, and consequently, none possesses a means for assigning truth value, for lying, for irony, or for coping with false or contradictory statements.” (1989, p. 1). But binary negation of the kind “Stop! / No! / Enough!”, which humans employ very often, can certainly be compared to functionally similar growls, barks and bites in other animals; easy examples are domestic dogs and cats.] We include formal languages and, by extension, computation here: as a specific kind of formalized communication system.

To specify the generalized abstraction that we refer to as a computer here (abstracting _beyond_ a capacity to effectuate logical/arithmetic operations): it is anything which arrives at a definite state after the possibility of formal bifurcation. Presented as such, computers, as **difference engines**, are classical negators: based on this simplistic binary logic, one could hardly imagine something more negative than a computer. As explored in [[Negation]], we can think about all niche-formation, including computers, as strategies of uncertainty-reduction (Vasil et al., 2020). This would include, fundamentally, a computer’s capacity to _settle_ on a state when a bifurcation presents itself.

In the interest of seeing languages (both natural and formal) as “artificial contexts” (Lupyan & Clark 2015) through which we probe reality, we can think of (formal) languages (and their effectuations in computation) as the most dynamic prediction-modulations currently available.

(This bit below is from much earlier work, at Leiden University.) Relatedly, here is a reinterpretation of Asimov’s _three laws of robotics_, but for syntactic-semantic computations:

&emsp;

![[manual override 3 laws of communication.png]]
<small>S. de Jager, Manual Override, 2014.
Three laws of language and communication.</small>

&emsp;

### Rules and relations

>We now and then take pen in hand
>And make some marks on empty paper.
>Just what they say, all understand.
>It is a game with rules that matter.

Hermann Hesse, “Alphabet,” trans. R.S. Ellis (Manin 1977, p. 3, cited in Rapaport, “How to Pass a Turing Test”, 2000).

%% [[Computation notes]]