A couple of days ago, I found myself in a tram in Berlin on the way back home from a Saturday-night ramen excursion. Among my companions was a friend of mine with whom I often engage in banter around various topics that range from social issues, identity, and various `-isms`, to math, physics, and philosophy. On this particular evening, our conversation veered into a particularly abstract realm: computation.

I was posed with a challenge:

*What is computation? Can you define it?*

I thought long and hard about this. My immediate answer was something like: *computation is the process of elimination of relations from abstractions*. I was thinking of a simple example computation: finding the sum of two integers. This operation takes a relational abstraction (the abstraction `2+3` relates the integers 2 and 3 in a well-defined way) and converts it to a single symbol (for `2+3` this is `5`).
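That sum example can be sketched in a few lines of Python (the function name and the error handling are my own, not part of any formal definition):

```python
# Convert an expression in the "left-hand language" (two integers joined
# by "+") into its equivalent in the "right-hand language" (one integer).

def compute_sum(expr: str) -> str:
    left, delim, right = expr.partition("+")
    if delim != "+":
        raise ValueError("not a valid expression in the left-hand language")
    return str(int(left) + int(right))

print(compute_sum("2+3"))  # → 5
```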

In response, I was introduced to the idea that computation is actually a *linguistic* concept. This baffled me at first, but the more I thought about it, the more it made sense, especially after thinking about the definition of “language” itself.

The collection of regular languages over an alphabet Σ is defined recursively as follows:

- The empty language Ø, and the empty string language {ε}, are regular languages.
- For each *a* ∈ Σ (*a* belongs to Σ), the singleton language {*a*} is a regular language.
- If *A* and *B* are regular languages, then *A* ∪ *B* (union), *A* • *B* (concatenation), and *A*\* (Kleene star) are regular languages.
- No other languages over Σ are regular.
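Just for fun, here's a toy Python model of that recursive definition, with languages as finite sets of strings. All the names are mine, and the Kleene star is truncated at a fixed number of repetitions, since the real thing is infinite:

```python
# Toy model: regular languages as (finite approximations of) string sets.

def union(A, B):
    return A | B

def concat(A, B):
    return {x + y for x in A for y in B}

def star(A, max_reps=3):
    # The true Kleene star is infinite; we stop after max_reps repetitions.
    result = {""}   # always contains ε
    current = {""}
    for _ in range(max_reps):
        current = concat(current, A)
        result |= current
    return result

# All strings of length ≤ 2 over the alphabet {a, b}:
print(sorted(star(union({"a"}, {"b"}), max_reps=2)))
# → ['', 'a', 'aa', 'ab', 'b', 'ba', 'bb']
```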

This is not extremely critical to the final definition of computation I came up with, but it provided me with context and inspiration to think further.

In the simple arithmetic relation `a+b=c`, the `=` symbol represents the equivalence of two abstractions: `a+b` and `c`. The left-hand and right-hand abstractions of this equation actually follow their own languages!

The `alphabet` of the language on the left-hand side is the set of all integers, along with the “+” symbol. A valid abstraction (or *expression*) in this language must contain two “words”: the sum operation has two inputs. More rigorously, to express the sum of two integers as a single integer, the alphabet of the left-hand language requires a set of symbols that can be used to express the complete set of integers, and a delimiter symbol that is not in this first set.

Similarly, the alphabet of the language on the right-hand side is also the set of all integers. A valid abstraction/expression in this language must contain a single “word”. No delimiter symbol is required.
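Assuming the two languages as just described, a pair of recognizers makes the distinction concrete (the regular expressions here are my own sketch, using decimal digits with an optional sign to spell out integers):

```python
import re

# Left-hand language: exactly two integer "words" joined by the "+" delimiter.
LEFT = re.compile(r"^-?\d+\+-?\d+$")
# Right-hand language: a single integer "word", no delimiter.
RIGHT = re.compile(r"^-?\d+$")

print(bool(LEFT.match("2+3")), bool(RIGHT.match("2+3")))   # → True False
print(bool(LEFT.match("5")), bool(RIGHT.match("5")))       # → False True
```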

Here’s my ~~penultimate~~ final definition of computation:

*Computation is the process of converting a given abstraction in an arbitrary language to an equivalent abstraction in another.*

~~I say penultimate because I have one final thought: I don’t think that it’s necessary to mention languages in the definition. Languages are modes of expression, and I’m not sure computation should be restricted to *expressed* abstractions. I might change my mind about this after further thought.~~ I did indeed think about this more, and I realized that abstractions *are* ways to express abstract things – i.e. there’s no meaning to the phrase “expressed abstractions”. The word “abstraction” is a bit ambiguous, it seems. In any case, the important part of this definition seems to be *equivalence*. In order for a computation to be possible, there must exist at least one way to express the abstract thing in both the source and target languages. This makes me appreciate the concept of Turing-completeness more – it standardizes a scope for expression itself and establishes the requirements for an arbitrary language to achieve that scope.

This is very very interesting to me. It seems to cover all the examples I could think of to test it. Here are three of them:

- Binary arithmetic as performed by ALUs uses (in the simplest case) two input registers and an output register. The input registers and the circuitry connecting them to the output register together are a physical abstraction equivalent to `a+b`, and the output register is a physical abstraction equivalent to `c`. I don’t know about you, but I’m seeing a conversion of abstractions here, where the two input registers are the “alphabet” of the left-hand language, the output register is the alphabet of the right-hand language, and the circuit connecting them is an abstraction for the rules of equivalence between them.

- A pseudo-random number generator is a function that operates on a seed and outputs a number. Seems to fit the definition of computation pretty well, similarly to the first example. If I may embark on a tangent, randomness itself is a pretty fascinating non-trivial concept. Implementations of random number generators require a seed or external source of entropy. Very cautiously, I suggest that generation of entropy is not a computational process. Indeed, computation might not be possible without entropy. I will leave you to think about this further on your own time, because I haven’t thought about this enough myself.

- Bear with me as I enter some sketchy territory here: the process of human thought is a computation. The universe is a physical expression of the language of nature (there are branches of physics including quantum field theory and string theory that study the core abstractions involved). Humans express their thoughts in human languages. Thoughts would not exist without the universe, and so the *process of thought* is a conversion of *abstractions expressed in the physical universe* to *abstractions expressed in human language*. Corollary: the quest to find a “theory of everything” is a quest to find the language of nature.
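To make the first example a bit more tangible, here's a toy Python model of the adder circuit – a ripple-carry adder over bit lists standing in for registers. This is a sketch of the idea, not how any real ALU is wired, and the register width is my own choice:

```python
# One full-adder "gate": sum bit and carry-out from two bits and a carry-in.
def full_adder(a, b, carry):
    s = a ^ b ^ carry
    carry_out = (a & b) | (carry & (a ^ b))
    return s, carry_out

# The "circuit" connecting the two input registers to the output register.
def add_registers(reg_a, reg_b):
    # Registers are little-endian bit lists (index 0 = least significant bit).
    carry, out = 0, []
    for a, b in zip(reg_a, reg_b):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the overflow bit
    return out

# 2 (010) + 3 (011) = 5 (101), written little-endian:
print(add_registers([0, 1, 0], [1, 1, 0]))  # → [1, 0, 1, 0]
```

The input bit lists play the role of the left-hand expression `a+b`, and the returned list is the equivalent expression `c` in the right-hand language.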
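And for the second example, a minimal linear congruential generator (using the classic Numerical Recipes constants) shows how purely computational a PRNG is once the seed is fixed – the entropy, if any, has to come from outside:

```python
# Linear congruential generator: state -> (a * state + c) mod m.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    values = []
    state = seed
    for _ in range(n):
        state = (a * state + c) % m
        values.append(state)
    return values

# The same seed always computes the same sequence -- no entropy in sight.
assert lcg(42, 5) == lcg(42, 5)
```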

I emerge from this thought process with a better understanding of concepts around abstraction, expression, information, and language. I know that `Theory of Computation` is a standard part of Computer Science programs and I look forward to perhaps diving into this topic further some day. Many thanks to Miko Mynttinen – the friend in question who got me thinking about this.

I like to end posts like this with quotes, so here’s a fitting one from Carl Sagan:

> If you wish to make an apple pie from scratch, you must first invent the universe. 🥧 = 🌌