- Dr Bruce Long and Professor Pieter Adriaans
Dr Bruce Long interviews eminent philosopher of information and IIMx advisory council luminary, Professor Pieter Adriaans, about his work and his view of the philosophy of information today.
In the 1930s, Andrey Kolmogorov axiomatized probability theory, and even earlier R.A. Fisher had developed a measure of information based upon what data could reveal about an unknown statistical parameter. Building on the pre-war work of Bell Labs communications engineer R.V.L. Hartley, and working within the military-industrial complex during and after WWII and into the early Cold War period, cybernetics prodigy Norbert Wiener, Alan Turing, and mathematician and engineer Claude E. Shannon initiated the early stages of what is now called the information age. Within the next twenty years, the transistor-based digital computer and the Internet's precursor, the ARPANET, would be born. These advances were accompanied by Watson and Crick's 1953 discovery of the structure of DNA, and by the ongoing development of database technologies. The World Wide Web arrived in 1989.
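(A brief aside for readers unfamiliar with the two measures just mentioned: Shannon quantified the information produced by a source through its entropy, while Fisher's earlier measure quantifies how much an observed sample reveals about an unknown parameter. In standard notation,

H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],

where p is the source distribution and f(X;\theta) is the likelihood of the data given the parameter \theta.)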
These technological and scientific leaps were also accompanied by revolutionary advances in logic. In the late 1950s and early 1960s a young logician by the name of Saul Kripke revolutionised the field with a formal semantics for modal logic, now often called possible worlds semantics. By the early 1980s logicians were already trying to understand, in information-theoretic terms, how the concepts of information flow and transmission fitted with existing conceptions of deductive inference, that is, how conclusions are informed by premises. Meanwhile molecular bioscientists were trying to understand the role of information dynamics in DNA transcription and translation, morphogenesis, protein synthesis, and inheritance.
From the 1950s onward, philosophy, having long since handed the mantle of natural philosophy to science, and having become what W.V.O. Quine called 'the handmaiden of science', was strongly influenced by Turing's and Shannon's discoveries, by the effects of information and computer technology on the concept of knowledge, and by the informational turn in logic. The informational turn in philosophy had begun by the early 1950s, and is now commonly dated from Rudolf Carnap and Yehoshua Bar-Hillel's efforts to develop a theory of semantic information based upon Shannon's quantitative mathematical theory of communication and Carnap's logical probability theory.
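(Roughly sketched, and simplifying their presentation: where m(s) is the logical probability of a sentence s, Bar-Hillel and Carnap take the content of s to be cont(s) = 1 - m(s) and its information measure to be inf(s) = -\log_2 m(s), mirroring Shannon's formula but with logical rather than statistical probability.)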
I interviewed Professor Pieter Adriaans, one of the most prolific and inventive scholars of the period in which the informational turn in philosophy matured, from the 1980s through the 2000s. I asked him what he thought of the state of the art in the philosophy of information and the informational turn. Here is that brief but very interesting interview (with some excellent bonus video material courtesy of Professor Adriaans).
- Dr Bruce Long
Interviewer
Please tell us a little about your background as a philosopher and philosopher of information.
PA
I studied philosophy (and a bit of mathematics and physics) in Leiden from 1976-83, with a very broad scope: epistemology, metaphysics, logic, phenomenology, analytical philosophy, classical and medieval philosophy etc. etc. I had very good teachers such as Kees Van Peursen for theory of knowledge, Gabriel Nuchelmans for analytical philosophy and Bertus de Rijk for medieval philosophy. For a while I was part of the “young philosophers gang” around the now legendary Heidegger specialist Rob van Dijk. I soaked it all up like a sponge. Marcel Fresco introduced me to neo-Kantian philosophy and the work of the Dutch philosopher/poet Johan Andreas dèr Mouw, which has been a lifelong inspiration.
Forced by economic conditions, I had to find work in the emerging computer business, which I found an interesting playground for “applied” philosophy. Not happy with the state of logic research in Leiden, I decided to move to Amsterdam, where I was lucky enough to land in the middle of a research group that would change logic research fundamentally (then called ITLI, now ILLC), led by a young, brilliant logician, Johan van Benthem. I decided to do a PhD in theoretical computer science, because I felt that as a philosopher I lacked the formal training to really contribute to the emerging research domain. Again I was lucky that one of the cleverest guys in the field, Peter Van Emde Boas, wanted to be my supervisor. He shared a room with Paul Vitányi, who had just rediscovered Kolmogorov complexity. In the end my thesis dealt with the application of algorithmic complexity theory to language learning. My basic research questions have not changed over the last 40 years: Why is the world learnable? What is learning?
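(A gloss for readers who have not met the notion: the Kolmogorov complexity of a string x, relative to a fixed universal machine U, is the length of the shortest program that outputs it,

K_U(x) = \min\{\, |p| : U(p) = x \,\},

so that, on this approach, learning from data becomes closely tied to finding short descriptions, that is, to compression.)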
Interviewer
Johan van Benthem has been very influential and extremely prolific in his scholarly output. Could you tell us a little more about your work with him?
PA
The ILLC was boiling with new ideas about logic and computation at the end of the eighties: substructural logics, categorial logic, lambda calculus, object oriented computing, logic programming, dynamic logic, game theory.... At first Johan, at the center of this whirlwind of ideas, was a sort of unapproachable demi-god for me. After I had sold my AI company Syllogic in 1997 I was offered a chair in machine learning in Amsterdam, and we gradually developed a working relationship. When Johan and I started to think about the Handbook of Philosophy of Information, now about 20 years ago, we wanted to give the subject a solid grounding in logic and computer science. We organized a seminal workshop in Amsterdam in 2005 where a lot of the founding fathers of the subject were present, amongst others John McCarthy, Fred Dretske and Luciano Floridi. We had invited Ray Solomonoff but he did not feel fit enough to come. This was the basis for the Handbook.
Looking back, I think that we missed the connection with most of the philosophical community, simply because the material is so complicated for non-specialists. What has happened is that philosophers have filled the gap between classical philosophy and solid formal study of the concept of information with their own interpretation of the subject, which certainly has its value, but which I see mainly as a reprise of classical epistemological and metaphysical problems in a newer setting, a bit like reinventing the wheel.
Interviewer
What do you think of the state of the philosophy of information and the so-called ‘informational turn’ in philosophy now? There is a lot of work in the metaphysics of information, for example. What do you think about the status of this subdiscipline?
PA
The problem is that this research has only a superficial connection with existing research in the sciences. Reading the majority of current papers on the philosophy of information, one feels like stepping into a time machine that transports us to the mid-20th century. We read about semantic information, false information, sending messages, entropy etc., but none of the recent important developments like NP-completeness, algorithmic complexity, quantum computing, the anthropic principle, quantum gravity etc. etc. are touched upon. When young philosophers try to write about these issues, in most cases they lack the formal training to make a valid contribution, which wastes a lot of talent. The set of papers I have refereed and had to reject in the past decade is almost endless. The result is that much of what is currently promoted as philosophy of information has only limited value in my view, simply because it is not rooted in a deep understanding of the underlying problems. The big gain of the informational turn is that the issue has been put on the philosophical agenda. Information is without doubt a central category both in the sciences and the humanities, but so far we have only scratched the surface. I think there is room for an organization whose ambition it is to promote such research in a well-founded setting.
Interviewer
What do you think the best way forward is for the philosophy of information?
PA
One of the bigger open problems is the interaction between information and computation. How do processes generate information? One can teach a class on computers without mentioning the word information, and vice versa. That can’t be right. There seems to be a big gap in our understanding. This has been the focus of my research in the past years. The problem exists in classical computation, at the quantum level in the physics of black holes, and at the simple everyday cognitive level of problem solving and artistic creativity.
My ambition would be simply to promote philosophical research that is in constant dialogue with developments in physics, mathematics, logic and computer science at the highest level, with the intention of gaining a better understanding of the issues. I think that philosophy has a lot to offer in efforts to bridge such gaps, especially now in the 21st century, when problems of consciousness, the biology of the cell, creativity, nanotechnology, quantum physics, complexity, AI, neuroscience, biodiversity, resilience, gravity, feasible computation etc. seem to be harder and more wonderful than we ever imagined before.
Public lecture by John McCarthy at the 2005 workshop:
Public lecture by Keith Devlin at the 2005 workshop: