A Computational Experiment in the Laws of Form of Spencer-Brown and the Sociology of Luhmann

José Javier Blanco Rivero

Introduction

This notebook has been set up with the purpose of carrying out a computational exploration of the formal calculus of George Spencer-Brown and the systems theory of Niklas Luhmann, following the ideas and concepts discussed in our article Systems Theory, the Calculus of Form and Logic Programming. However, it is also intended to stand as an independent and reproducible research piece.

George Spencer-Brown, in his book Laws of Form (1969) (LoF), proposed a formal calculus closely related to Boolean algebra, many of whose corollaries have had interesting consequences (or interpretations) for the philosophy of science, systems theory, the philosophy of mathematics, the theory of signs or semiotics, logic, and many other fields where his ideas have been adapted.

Niklas Luhmann, on the other hand, was a German sociologist who challenged traditional notions in sociological theory, such as the concept of the subject, by drawing on ideas and conceptions taken from fields as diverse as organization theory, phenomenology, mathematics, biology, cybernetics, systems theory, and cognitive science, among others. He managed to engineer a theory of society built on the pillars of systems theory, evolution theory and communication theory. The result is a modern universal theory of society that is able to deliver insightful analyses of contemporary society and its evolution.

Spencer-Brown's ideas, such as distinction, form, re-entry, and the marked and unmarked states, turned out to be a proper fit for Luhmann's late work, since these concepts provided him with much of the conceptual apparatus he employed to describe the functional systems of society (politics, law, art, the economy, mass media, science, among others).

Let us briefly sketch what LoF is about.

Basically, LoF develops a primary arithmetic, a primary algebra and a second-order algebra, starting from a small set of tokens and a pair of transformation rules. The first token is the cross and the second is the empty space. The cross leaves a mark in space, producing, as a consequence, a distinction. This space will be called the marked state. A form is the whole composed of the marked state together with the space left untouched, the unmarked state. From here the English mathematician derives two laws, namely the law of calling and the law of crossing, upon which the whole edifice of LoF is built.

LoF was inspired by a simple idea: if algebra can resort to complex numbers in order to solve otherwise unsolvable problems, why should an algebra of logic not avail itself of a similar resource? As a consequence, in his second-order equations Spencer-Brown developed the idea of time as an imaginary space, allowing the transition from the marked state of the form to the unmarked state.

It is not for nothing that the eleventh chapter is the most exciting part of LoF, for there Spencer-Brown deals with: a) the notion of self-reference, under the guise of the re-entry of the form into the form; b) imaginary space, imaginary values and time; and c) oscillation, memory and modulator functions, ideas that have had a warm welcome in constructivist circles.

The motivation behind this notebook is to explore the possibility of writing some kind of computer software that would assist sociological systems research. In order to reach our goal, it is crucial to examine whether the ideas of LoF can be implemented in a programming language and to think of a proper design for our software. It will be suggested that some kind of expert system might be the best choice.

For this purpose we will explore logic and functional programming in Clojure. The reason for this choice is that both logic programming and LISP are closely related to formal systems, logic and Artificial Intelligence research (although nowadays Python seems to rule, LISP dominated AI research for more than 30 years), and in this sense both are closely related to LoF (which is a formal system) and to social systems theory (which is a theory of society based on a self-referential formalism).

Certainly, the experiments we are about to carry out are reproducible in any other programming language and with the help of any other rule engine or relational library. In fact, the library we are about to use, core.logic, is an implementation of miniKanren, which was originally written in Scheme and currently has implementations in most major contemporary programming languages.

If you are new to Clojure, please check out practicalli, lambdaisland, braveclojure and lispcast among others. So let us begin.

First, let us set up our programming environment. Below, we declare our dependencies and then initialize our namespace in order to start coding.

{:deps {org.clojure/clojure {:mvn/version "1.10.1"}
        ;; compliment is used for autocompletion
        ;; add your libs here (and restart the runtime to pick up changes)
        org.clojure/core.logic {:mvn/version "1.0.0"}
        com.rpl/specter {:mvn/version "1.1.3"}
        compliment/compliment {:mvn/version "0.3.9"}}}
(ns logicprogramming.spencer-browns-LoF
  (:refer-clojure :exclude [==])
  (:require [clojure.core.logic :refer :all]
            [clojure.core.logic.fd :as fd]
            [clojure.core.logic.pldb :refer [db with-db db-rel db-fact]]
            [com.rpl.specter :as s]))

Before stepping into the formal calculus, let us do a brief warm-up in logic programming. The main idea of logic and relational programming is that a knowledge base is somehow declared, then some rules or facts are established, and finally the user performs queries against that database. The logic engine will infer the answer employing the provided rules and return it. It is also possible to define relations in situ and ask the logic engine for an answer.

Let us start with something simple:

(run* [q]
  (== q q))

So, what just happened? We told the logic engine to run a query and to return all possibilities (run*); in other words, we set a goal for the logic engine to try to achieve. Next, we declared a logic variable (lvar) q that will be used to store the result of the query. For q to give us back some information, it must be unified either with a bound lvar, some data structure or a function. In this case, q is being unified with itself, and since q is fresh (it has not been bound to any value), the logic engine returns the token _0, meaning that the lvar is fresh. Such tokens are called reified variables.
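By contrast, unifying q with a concrete value succeeds with that value as the answer. A minimal check:

;; q can only be the symbol form, so the query returns (form)
(run* [q]
  (== q 'form))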

Let us do something a little more difficult. Given a list of numbers, let us ask for its head and its tail.

(run* [q]
   (fresh [a d]
      (== [1 2 3 4 5] (llist a d))
      (== q d)))
(run* [q]
  (fresh [a d]
    (== [1 2 3 4 5] (llist a d))
    (== q a)))

There are many things occurring here. First, fresh is an operator that allows us to introduce new lvars, which we placed into the built-in function llist. This function builds a logical list whose first argument is its head and whose second argument is its tail. Then we unify this llist term, with the fresh variables in their corresponding places, with a list of five numbers. Finally, we unify q with either a or d, depending on whether we want to get the head or the tail of the list.
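The same decomposition can be expressed with core.logic's conso relation, which relates a head, a tail and the whole list. A minimal sketch:

;; the head is 1 and the tail is unknown, so the engine solves for q
(run* [q]
  (conso 1 q [1 2 3 4 5]))
;; => ((2 3 4 5))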

Why don't you try it yourself? Click on Remix in the top menu on the right and you will get your own copy of this notebook to experiment with. Once that is done, complete the code snippet below by adding either a number, a symbol (e.g. 'Luhmann), a string (e.g. "Luhmann") or a keyword (e.g. :Luhmann):

(run 1 [q]
  (== q 1 ;; ERASE THE NUMBER AND INSERT YOUR INPUT HERE
       ))

Now click on the play button on the top right of the code cell (if the runtime shuts down, you will need to click on the dropdown menu next to the play button on top and then click on 'Run all').

As you can see, you simply get back the value you introduced: you are unifying a value with the lvar q.

Now set up a vector (namely, a data structure delimited by a pair of brackets [ ]) containing values of any type, feed it to the following function, and ask for its head or, alternatively, its tail:

(run* [q]
  (fresh [a d]
    (== [] ;; INSERT YOUR VECTOR HERE
        (llist a d))
    (== q a ;; ERASE THE a AND INSERT a OR d, DEPENDING ON WHETHER YOU WANT THE HEAD OR THE TAIL
        )))

Please, feel free to check out the clojure.core.logic wiki and explore its API. Here is a cell for you to try things out:

(run* [q]
  )

Bear in mind that the logic programming mindset requires you to think of your domain problem in terms of relations: a program consists in defining one or more relations between lvars.
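Relations can also be run "in reverse". For instance, core.logic's membero relates an element to a list containing it, and leaving the element fresh makes the engine enumerate every member:

;; membero succeeds once for each member, binding q to each element in turn
(run* [q]
  (membero q '(politics law art economy)))
;; => (politics law art economy)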

Now we are ready to explore the initials of calculus (it is recommended to have a copy of LoF at hand).

The Initials of Calculus and the Primary Arithmetic

Before going any further, a brief note on notation is necessary. From here on, brackets [ ] will be used to stand for crosses, and empty lists '( ) to represent the void or emptiness.

With this in mind, let us consider the law of calling and the law of crossing. The first states that <<the value of a call made again is the value of the call>>, while the second states that <<the value of a crossing made again is not the value of the crossing>>. Calling can be expanded into the steps of condensation and confirmation, and crossing into the steps of compensation and cancellation. These steps mean nothing other than performing an operation in one direction and then reversing it. Together, they describe the contraction and expansion of reference.

The following is a possible implementation of this idea.

(defne reference_expansion
  [x l]
  ([[a a . resto] [a . resto]])
  ([[a [a]] '()]))

We have just defined a relation that takes two parameters and compares them against two patterns. The first pattern describes a list whose first two elements are equal; in this case the relation returns the list with the repeated element dropped. This is intended to represent the movements of confirmation and condensation.

In the second pattern there is a form represented as a nested structure. If this pattern is found, an empty list representing the unmarked state will be returned. This is intended to represent the steps of cancellation and compensation.

What is exciting about pattern matching is that it fits nicely within our domain problem: it allows us to describe and relate forms and, further, to calculate with forms.

Let us see how this function works:

Condense

(run* [q]
  (reference_expansion '(Luhmann Luhmann Spencer-Brown Kaufmann) q))
(run* [q]
  (reference_expansion (flatten '[information [information form]]) q))

It is possible to pass nested structures to the function; however, it is necessary to call flatten first in order to unnest the inner vector and obtain a single flat sequence. Otherwise an exception will be thrown.
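To see what the relation actually receives, here is the flattened sequence by itself:

(flatten '[information [information form]])
;; => (information information form)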

Let us resume:

Confirm

(run* [q]
  (reference_expansion q '(marked unmarked)))

Cancel

(run* [q]
  (reference_expansion ['a ['a]] q))

Compensate

(run* [q]
  (reference_expansion q []))

Something interesting just happened here: we got a reified variable back. Why? Before trying to answer this question, let us try something slightly different.

Alternatively, we could have defined relations describing the domain of number (condensation and confirmation) and the domain of order (cancellation and compensation). This might be more faithful to the intention of the author.

(defne number
   [x y]
   ([[ [] [] ] []]))
(defne order
   [x y]
  ([[ [] ] '()]))

Again, we are using pattern matching in order to represent LoF expressions. Let us try them:

(run* [q]
  (number q []))

That was confirmation. Now let us try condensation:

(run* [q]
  (number [[] []] q))

Now let us see how compensation and cancellation work:

(run* [q]
  (order q '()))
(run* [q]
  (order [[]] q))

Pattern matching greatly simplifies the interpretation of LoF, for one only needs to translate LoF notation into Clojure, and specifically into core.logic data structures. The theorems of variance and invariance can be represented by the following relations:

(defne invariance
   [x y]
   ([[ a [a]] '()]))
(defne variance
   [x y]
   ([[[p r] [q r]] [[[p] [q]] r]]))
(run* [q]
  (invariance [1 [1]] q))
(run* [q]
  (invariance q '()))

Again we find ourselves with reified variables. It seems that in the absence of an appropriate context (an environment or closure), the relation can deduce the form but cannot remember the data it used to hold.

What consequences does this finding have for LoF? We shall address this issue later on; before that, feel free to try any of the functions above or, if you feel confident enough, one of your own:

(run* [q]
  )

The Primary Algebra

When introducing the primary algebra, Spencer-Brown explains that in this context the sign = obeys two rules, namely the rule of substitution and the rule of replacement. He then proceeds to define the two initials of the algebra, namely position and transposition.

Since these are practically equivalent to invariance and variance respectively, they won't be defined again. So let us define the relations of reflexion, generation and integration.

(defne reflexion
  [x y]
  ([[[a]] [a]]))
(run* [q]
  (reflexion [['information]] q))
(run* [q]
  (reflexion q '[information]))
(defne generation
  [x y]
    ([[[a b] b] [[a] b]]))
(defne integration
  [x y]
  ([[[] a] []]))
(run* [q]
  (integration [[] 'money] q))
(run* [q]
  (integration q []))

Once again, our logic engine is able to return the structure that should fit the data, but it cannot deduce or remember the data itself, for it has not been stored anywhere. There are many ways to interpret this, but if LoF is read literally, it is reasonable to think that Spencer-Brown was assuming too much when thinking of reversing an equation from emptiness back into something. We can call this information loss or entropy.

On the other hand, if the function is provided with an appropriate closure, it will return the expected value:

(run* [q]
   (fresh [a]
      (== a 1)
      (== q [[] a])
      (integration q [])))

Next, the rest of the functions representing the consequences of the algebra will be implemented.

(defne iteration
  [x y]
  ([[a a] a]))
(defne extension
  [x y]
  ([[[[a] [b]] [[a] b]] a]))
(defne echelon
  [x y]
  ([[[[a] b] c] [[a c] [[b] c]]]))
(defne crosstranspose
  [j i]
  ([[[[b] [r]] [[a] [r]] [[x] r] [[y] r]]  [[[r] a b] [r x y]]]))
(defne mod_transposition
  [x y]
  ([[[a] [b r] [c r]] [[[a] [b] [c]] [[a] [r]]]]))
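Before trying things yourself, it may help to see a couple of these consequences at work; the following queries are minimal checks of echelon and iteration:

;; echelon: [[[a] b] c] is transformed into [[a c] [[b] c]]
(run* [q]
  (echelon '[[[a] b] c] q))
;; => ([[a c] [[b] c]])

;; iteration: a repeated form condenses into a single occurrence
(run* [q]
  (iteration '[x x] q))
;; => (x)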

Feel free to try something out:

(run* [q]
  )
(defne something
  [x y]
  ([a b]))

Second Order Equations

According to Spencer-Brown, a re-entry consists in seeing an infinite expression of the echelon kind as though it were <<re-entering its own inner space at any even depth>>. In his view, re-entries introduce indeterminacy into the calculus of indications, forcing it to imagine new solutions.

Let us explore the idea of re-entry by playing with some nested data structures.

(def embedded-map {:a
                   {:a {:a "a"
                        :b "b"}
                    :b {:a {:a "a"
                            :b "b"}
                        :b {:a "a"
                            :b "b"}}}
                   :b {:a
                       {:a
                        {:a "a"
                         :b "b"}
                        :b {:a "a"
                           :b "b"}}
                       :b {:a {:a "a"}
                           :b {:b "b"}}}})

Given this map, we are going to try to retrieve some of its elements. For this purpose we will employ the built-in function get-in and some functions from the specter library.

(get embedded-map :a)
(get-in embedded-map [:a :a :a])
(get-in embedded-map [:b :a :b :a])
(get-in embedded-map [:b :b :b :b])

Now let us try some specter functions:

(s/select s/MAP-VALS embedded-map)
(s/select-one (s/keypath :a :b :a) embedded-map)
(s/select* (s/keypath :a :b :a) embedded-map)

If you want to dig deeper into specter, check out its documentation.

Now let us try with another data structure, such as a vector:

(def embedded-vector [true false
                      [true false]
                      [true false
                       [true false
                        [true false
                         [true false
                          [true false]]]]]
                      [true false]
                      [true false
                       [true false]]])
(map-indexed (fn [idx itm] (when (even? idx) itm)) embedded-vector)
(s/select [s/INDEXED-VALS] embedded-vector)

Try something for yourself:

;;INSERT YOUR CODE HERE

These exercises not only give us an insight into what a re-entry means, but also show how tricky it can be to deal with nested data structures.
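A generic way to operate on arbitrarily nested structures, complementing the index-based access above, is clojure.walk. A minimal sketch:

(require '[clojure.walk :as w])

;; postwalk visits every node bottom-up; here every boolean is flipped,
;; no matter how deeply it is nested
(w/postwalk #(if (boolean? %) (not %) %) embedded-vector)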

Let us resume.

If we keep recurring to pattern matching, the function of re-entry can be defined as follows:

(defne re-entry
   [x y]
   ([[a [b]] [[[[a] b] a] b]]))

Nonetheless, this relation does not allow us to deal with nested structures: the data passed as input must match the pattern exactly in order to get a result back.

Spencer-Brown is really pointing at a recursive function that, when applied to a distinction, returns it re-entered. In principle, a map function seems an obvious choice; nevertheless, dealing with a deeply and potentially infinitely nested structure makes this a non-trivial problem. The most common way to think of nested structures is as trees, so a tree data structure might be of much help. Following Spencer-Brown, a re-entered form can be represented like an echelon:

(def rform  '[[[[[[[[a] b] a] b] a] b] a] b])

Let us create a zipper in order to navigate this structure. First, let us require the library:

(require '[clojure.zip :as z])

And now let us turn rform into a zipper:

(def rexpr (z/vector-zip rform))

In order to navigate this data structure we must explicitly tell the program where to go:

(-> rexpr
     z/down
     z/down
     z/node)

We asked Clojure to go down two levels and then return the node at that position. Since this can get repetitive, and since in echelon-like structures we always need to go deeper, it makes sense to write a macro to save us some work:

(defmacro re-enter
  [distinction form times]
  `(-> (z/vector-zip ~form)
       ~@(take (- times 1) (repeat z/down))
       (z/insert-child ~distinction)
       z/root))

It asks for three parameters: a distinction to be entered, the form or expression to be re-entered, and the number of times the function down will be executed. This means that the exact depth of the data structure must be known, or else an exception will be thrown. The following function will solve this problem for us:

(defn get-depth
    [form]
     (->> (tree-seq vector? seq form)
          (remove coll?)
          (count)))
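A quick check on rform confirms the depth we will need below:

(get-depth rform)
;; => 8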

Nonetheless, there is a caveat with macros. Since they are expanded at compile time, their arguments are not evaluated, and certain common functional programming techniques are not available. In this case, it is not possible to pass a call like (get-depth rform) as the times argument of re-enter, so we must compute the depth of the form beforehand and pass the raw number as an argument. For instance, knowing in advance that our structure has a depth of eight:

(re-enter '[[a] b] rform 8)

It can be read as <<re-enter the distinction '[[a] b] into the form rform at depth 8>>.
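The macro limitation can also be side-stepped with an ordinary function. The following re-enter-fn is a sketch of our own (not part of any library), assuming, as holds for echelon-like forms, that the leftmost path always leads to the deepest nesting:

(defn re-enter-fn
  [distinction form]
  (loop [loc (z/vector-zip form)]
    (let [child (z/down loc)]
      (if (and child (z/branch? child))
        (recur child)                    ;; keep descending the leftmost branch
        (-> loc
            (z/insert-child distinction) ;; insert at the deepest vector
            z/root)))))

;; no depth argument is needed now
(re-enter-fn '[[a] b] rform)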

Spencer-Brown also refers to oscillator, memory and modulator functions. Regarding the first, he explains that it consists in re-entering a distinction at every odd space. With respect to the second, he says that it consists in re-entering a form at every even space. And the last refers to expressions with several re-entries.

Let us create an infinite sequence of boolean values in order to emulate these functions.

(def bool-seq (iterate (fn [x] (cond
                                 (true? x) false
                                 (false? x) true)) true))
(take 15 (take-nth 3 bool-seq)) ;;odd means oscillation
(take 15 (take-nth 2 bool-seq)) ;;even means memory

Unfortunately, there is not much of interest we can deduce from these functions.

Do you have any idea?

;; INSERT YOUR CODE HERE

Towards software for sociological systems-theoretic research

As suggested above, some sort of query engine might be an appropriate tool for assisting sociological systems research. If a distinction is to be conceived as a connection, as Schönwälder-Kuntze et al. suggest, it makes sense to represent that distinction as a logical relation linking a token (indication) to a set of tokens with which it might relate (distinction), or perhaps even a concept to a semantic field.

Suppose our researcher is interested in discovering paradoxes in a corpus of social discourses collected over a period of time and ranging from politics to science, from economy to art, and so on. Let us define the relation form as follows:

(db-rel form x [a b & args])

The relation form links an element x to a vector of at least two elements.

Once one or more relations are defined, we assert some facts, thereby establishing constraints among the relations already declared.

(def concepts->contexts
  (-> (db)
   (db-fact form 'information '[not-information power technology redundancy meaning cognition power])
   (db-fact form 'power '[government opposition money public-affairs justice violence sex])
   (db-fact form 'money '[power economy crisis stagnation inflation deflation commerce market])))

Finally, a query is run against our knowledge base:

(run-db* concepts->contexts [q]
  (form 'information q)) 

Let us try something harder, like getting the marked states where the concept of power arises in their respective contexts:

(run-db* concepts->contexts [q]
  (fresh [a b]
    (form a b)
    (membero 'power b)
    (== q a)))

Let us try to define a self-referential structure:

(def concepts->contexts2
  (-> (db)
   (db-fact form 'information '[not-information power technology redundancy meaning cognition power])
   (db-fact form 'power '[government opposition money public-affairs justice violence sex])
    (db-fact form 'money '[power economy crisis stagnation inflation deflation commerce market])
    (db-fact form 'art '[(form 'art [(form 'art [_])])])))
(run-db* concepts->contexts2 [q]
   (== 'art q))
(run-db* concepts->contexts2 [q]
  (== q 'art))

Here we've got self-reference.
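We can also pull the self-referential context itself out of the knowledge base. Since what we stored is simply quoted data, the engine returns the expression unevaluated:

(run-db* concepts->contexts2 [q]
  (form 'art q))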

If trees turn out to be the most suitable data structure for our problem, it might be possible to leverage some of the pattern-matching relations defined above. Think of a semantic field represented as a tree hierarchy:

(def semantic-tree
  [[[:concept-a] [:concept-y]] [[:concept-b] [:concept-y]]
   [[:concept-c] :concept-y] [[:concept-d] :concept-y]])

In this case, :concept-y stands for the root of the tree. As it turns out, this structure is equivalent to consequence 9, namely, crosstransposition:

(run* [q]
   (crosstranspose semantic-tree q))

Final Remarks

The main purpose of this notebook has been to try out some ideas; specifically, logic programming and the calculus of forms inspire us to think of software tooling that could assist sociological research, concretely systems-theoretic Luhmannian sociology. In this regard it is too early to draw any conclusions; on the contrary, much remains to be done.

Nevertheless, there are some interesting findings to highlight:

  • Pattern matching algorithms seem well-suited for representing forms. It would be of great interest to explore further the theoretical, philosophical and pragmatic implications of this relationship.

  • Sometimes theoreticians take too much for granted, and reproducible research of the kind we are trying to pursue here can be of much help in keeping us critical of our own assumptions. In this case, it has been shown that although forms can be kept, information cannot escape the arrow of time; in other words, it is not possible to recover information from the void.

  • Understanding distinctions as relationships or connections might constitute a prolific conceptual bridge between the computer-science interest in knowledge representation and the sociological interest in semantics, codification, and culture.
