
Physicalist and Mathematical-Eliminativist Conceptions of Semantic Information

I have been following the publications of physicist and philosopher Carlo Rovelli for some years. I only recently came across this one about physical information and meaning, which is excellent.

In my own paper Information is Alethically Neutral and Intrinsically Semantic I cover many of the same metaphysical questions that Rovelli emphasises.

Noise. The opposite of information according to Shannon's theory.

Rovelli's paper is very close in some of its underlying metaphysics to many observations made in my still unsubmitted thesis, which is precisely about a physicalist metaphysics of information: a physicalist theory of the nature of information.

I have introduced in my thesis (since at least 2013) the concept of information as the physical-causal inducement of the configuration of physical structure (or of structure that existentially depends upon and reduces to physical structure). I also deploy in it and elsewhere the concepts of the veracity of a signal state and ubiquity of natural information.

Rovelli makes interesting use of Wolpert's formulation of semantic information (to which I do not defer).

My entire thesis was stolen from my bag in a library on a thumb drive about 6 months ago. Another was lost on a plane back from Thailand in 2014. Odd publishing mechanism - I know :) I am also fairly sure that email at my university is not overly secure. It's a gas. That's putting aside the nature of academic philosophers as selective information channels (if you share a good original idea, you can be sure that it will get passed off as someone else's, with the lame excuse that everyone does that and ideas are easy to have: both of which assertions are rubbish).

This paragraph from Rovelli's paper could almost be straight out of my existing thesis (from 2014):

"Intentionality is built into the definition. The information here is information that the system A has about the variable y of the system B. It is by definition information “about something external”. It refers to a physical configuration of A (namely the value of its variable x), insofar as this variables is correlated to something external (it ‘knows’ something external)."
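Rovelli's correlational notion has a standard quantitative counterpart: the information that A's variable x carries about B's variable y can be measured by their mutual information. A minimal sketch (the toy joint samples and function name are my illustrative assumptions, not from Rovelli's paper):

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Mutual information I(x;y) in bits, estimated from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                 # joint frequencies
    px = Counter(x for x, _ in pairs)    # marginal frequencies of x
    py = Counter(y for _, y in pairs)    # marginal frequencies of y
    return sum(
        (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Perfectly correlated variables: x 'knows' y completely.
correlated = [(0, 0), (1, 1)] * 50
# Independent variables: x carries no information about y.
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25

print(mutual_information(correlated))   # 1.0 bit
print(mutual_information(independent))  # 0.0 bits
```

On this measure the 'aboutness' is exhausted by correlation, which is exactly the feature Rovelli builds his definition on.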

However, I depart from Rovelli on the Darwinian-evolutionary and correlative organismic-functional basis for semantic content:

"The definition separates correlations of two kinds: accidental correlations that are ubiquitous in nature and have no effect on living beings, no role in semantic, no use, and correlations that contribute to survival. The notion of meaningful correlation captures the fact that information can have “value” in a darwinian sense. The value is defined here a posteriori as the increase of survival chances. It is a “value” only in the sense that it increases these chances".

I take this to reduce to what Rovelli calls direct semantic information (I think). Rovelli thinks that the semantic content of information has to be determined by evolutionarily determined function. It is a very interesting approach indeed, because it introduces evolutionary theory not just for a biological conception of information, but for a general conception of the nature of information and of its meaning.

Is the information in stem-cell DNA the same thing as the information scanned by a scanning tunnelling microscope? Mathematism about information says yes.

It is therefore what I would call a subjectivist/agentive, functionalist, and evolutionary conception of information. Although I do not think it is right as a general conception of information and its semantic content, I think it has promise for adaptations of what is called infotel semantics in the philosophy of information, specifically in molecular bioscience. On that conception, information in protein synthesis and DNA transcription is realised only when there are consumers of the structures that are supposed to express or encode it.

There have been other attempts in the increasingly vast literature on semantic theories of information to identify how information gets its meaning, but of course they largely depend in the first place on the theorist's conception of the nature of information. One of the most interesting offerings is that of Pieter Adriaans, who has argued for what I call a mathematical conception of semantic information. His view is that we do not need to add any further metaphysical apparatus to the mathematical conception of information in order to identify a conception of meaning and semantic content for information.

Adriaans' view is quite sophisticated. It is mathematical eliminativism about semantic information: he eliminates the need for a separate theory of semantic information by appealing to an interpretation of information theory, and specifically to the MDL (minimum description length) principle, according to which there is a quantitative measure of the minimal length of a description, or program, for conveying information.

"Specifically, I will defend the view that notions that are associated with truth, knowledge, and meaning all can adequately be reconstructed in the context of modern information theory and that consequently there is no need to introduce a concept of semantic information...One of the most elegant results of information theory in the past decades is the so-called minimum description length principle (Mitchell 1997; Grünwald 2007). It combines elements from statistics, Shannon information, and Kolmogorov complexity to formulate a strategy for optimal model selection...The derivation in fact tells us that if we want to find the best theory to explain a set of data then we have to find the theory that compresses optimally: the sum of the optimal Shannon coding of the theory in bits plus the optimal Shannon coding of the data given the theory in bits. If we let our possible models range over the class of all finite computer programs then the optimal theory can be expressed in terms of Kolmogorov complexity."

I will not expand on Adriaans' non-apodictic (one does not come to know something for certain or not know it) mathematical conception here. What is important is that he thinks that mathematical structures such as the MDL principle and Kolmogorov's measure of the complexity of a sequence of symbols provide a notion of truth and approximate knowledge on which information - the way he conceives of it mathematically - is intrinsically meaningful.
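Adriaans' appeal to MDL can be made concrete with a toy two-part code: the best 'theory' of a data string is the one minimising L(theory) + L(data given theory) in bits. The sketch below is only illustrative - the model class (repetitions of a single unit) and the bit-accounting are my assumptions, not Adriaans' formulation:

```python
from math import ceil, log2

def two_part_length(data: bytes, unit: bytes) -> int:
    """MDL two-part code length in bits for the theory 'repeat `unit`'.

    The theory costs the unit itself (8 bits per byte) plus a repeat
    count; if the theory reproduces the data exactly, nothing further
    needs encoding, otherwise we fall back to encoding the data literally.
    """
    k = len(data) // len(unit)
    theory_bits = 8 * len(unit) + ceil(log2(k + 2))
    if unit * k == data:
        return theory_bits                  # data fully explained
    return theory_bits + 8 * len(data)      # theory fails: pay for the data

data = b"ab" * 500

# Candidate theories; offering the data itself amounts to 'no compression'.
candidates = [b"a", b"ab", b"abc", data]
best = min(candidates, key=lambda u: two_part_length(data, u))
print(best)  # b'ab' - the shortest total description wins
```

The 'meaning' of the data, on this eliminativist picture, just is the optimally compressing model selected by the minimisation.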

One thing that these two very different conceptions of how information exists and how it has semantic content reveal is the motivation for eliminativist and nominalist conceptions of information and semantic information. If there are so many definitions, perhaps there is simply no such thing in the world/universe as information (eliminativism about information itself), or else the term 'information' is just one that scientists apply to many different measures, formulae, quantifications, dynamics, physical principles, and structures (nominalism).
