“Syntactical and Semantic Information Systems” by Reginald T. Cahill
Excerpted from: PROCESS PHYSICS: From QUANTUM FOAM to GENERAL RELATIVITY by Reginald T. Cahill
[The ongoing failure of physics to fully match all the aspects of the phenomena of time, apart from that of order, arises because physics has always used non-process models, as is the nature of formal or syntactical systems. Such systems do not require any notion of process – they are entirely structural and static. The new process physics overcomes these deficiencies by using a non-geometric process model for time, but process physics also argues for the importance of relational or semantic information in modelling reality. Semantic information refers to the notion that reality is a purely informational system in which the information is internally meaningful. Hence the information is ‘content addressable’, unlike the usual syntactical information modelling, in which the information is represented by symbols. This symbolic or syntactical mode is only applicable to higher-level phenomenological descriptions, and for that reason was discovered first.
A pure semantic information system must be formed by a subtle bootstrap process. The mathematical model for this has the form of a stochastic neural network (SNN), for the simple reason that neural networks are well known for their pattern or non-symbolic information processing abilities. The stochastic behaviour is related to the limitations of syntactical systems discovered by Gödel and more recently extended by Chaitin [9, 10, 11], but it also results in the neural network being innovative, in that it creates its own patterns. The neural network is self-referential, and the stochastic input, known as self-referential noise, acts both to limit the depth of the self-referencing and also to generate potential order.]
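The SNN described above can be sketched numerically. The update rule used below, B → B − α(B + B⁻¹) + w with w the self-referential noise, follows the iterator Cahill employs elsewhere in this work; the parameter values (α, noise scale, matrix size) are illustrative assumptions, not values taken from the text.

```python
import numpy as np

# Minimal sketch of a self-referential stochastic neural network (SNN):
# an antisymmetric connection matrix B is iterated with a self-referencing
# term B + B^{-1} plus additive self-referential noise (SRN).
# Parameters alpha=0.05, noise scale 0.1, n=8 are illustrative assumptions.

rng = np.random.default_rng(0)

def srn(n, scale=0.1):
    """Self-referential noise: random antisymmetric increments."""
    w = rng.normal(scale=scale, size=(n, n))
    return w - w.T  # antisymmetrise so B stays antisymmetric

def iterate(B, alpha=0.05, steps=100):
    """Noise-driven, self-referential update of the connection matrix."""
    for _ in range(steps):
        B = B - alpha * (B + np.linalg.inv(B)) + srn(B.shape[0])
    return B

n = 8
B = srn(n, scale=1.0)  # random antisymmetric start (generically invertible for even n)
B = iterate(B)
# B stays antisymmetric: the inverse of an antisymmetric matrix is
# antisymmetric, and the noise increments are antisymmetrised.
print(np.allclose(B, -B.T))  # True
```

The deterministic part of the update drives the singular values of B toward 1 (the fixed point of s → s − α(s − 1/s)), while the noise continually perturbs the pattern, so the network neither freezes nor diverges.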
In modelling reality with formal or syntactical information systems physicists assume that the full behaviour of a physical system can be compressed into axioms and rules for the manipulation of symbols. However, Gödel discovered that self-referential syntactical systems (and these include basic mathematics) have fundamental limitations, which amount to the realisation that not all truths can be compressed into an axiomatic structure: formal systems are much weaker than previously supposed. In physics such systems have always been used in conjunction with metarules and metaphysical assertions, all ‘outside’ the formal system and designed to overcome the limitations of the syntax. Fig.1 depicts the current understanding of self-referential syntactical systems. Here the key feature is the Gödel boundary demarcating the provable from the unprovable truths of some system. Chaitin, using Algorithmic Information Theory, has demonstrated that in mathematics the unprovable truths are essentially random in character. This, however, is a structural randomness, in the sense that the individual truths have no structure which could be exploited to condense them down to, or encode them in, axioms. This is unlike random physical events, which occur in time. Of course syntactical systems are based on the syntax of symbols, and this is essentially non-process or non-timelike.
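Chaitin's identification of randomness with incompressibility can be illustrated, by analogy only, with an ordinary compressor: a patterned string condenses to a short description, while a random string resists compression. zlib here is merely a computable stand-in for the uncomputable algorithmic complexity.

```python
import random
import zlib

# Analogy for Chaitin-style structural randomness: structured data can be
# 'compressed into axioms' (a short description), random data cannot.
random.seed(0)

structured = b"01" * 5000                                        # 10000 bytes, highly patterned
rand_bytes = bytes(random.getrandbits(8) for _ in range(10000))  # 10000 bytes, no pattern

c_structured = len(zlib.compress(structured, 9))
c_random = len(zlib.compress(rand_bytes, 9))

print(c_structured)  # tiny: the pattern '01 repeated' is the whole content
print(c_random)      # near 10000: no exploitable structure to condense
```

The structured string compresses to a few dozen bytes, while the pseudo-random string stays close to its original length; in Chaitin's sense, the latter cannot be "condensed down to axioms".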
There is an analogy between the structure of self-referential syntactical information systems and the present structure of quantum theory, as depicted in Fig.2.
There the formal and hence non-process mathematical structure is capable of producing many provable truths, such as the energy levels of the hydrogen atom, and these are also true in the sense that they agree with reality. But from the beginning of quantum theory the Born measurement metarule was introduced to relate this non-process modelling to the actual randomness of quantum measurement events. The individuality of such random events is not a part of the formal structure of quantum theory. Of course it is well known that the non-process or structural aspects of the probability metarule are consistent with the mathematical formalism, in the form of the usual ‘conservation of probability’ equation and the like. Further, the quantum theory has always been subject to various metaphysical interpretations, although these have never played a key role for practitioners of the theory. This all suggests that perhaps the Born metarule is bridging a Gödel-type boundary, that there is a bigger system required to fully model quantum aspects of reality, and that the boundary is evidence of self-referencing in that system.
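The division of labour described above can be made concrete in a few lines: conservation of probability is a theorem of the formal unitary structure, while the individual random measurement outcome must be injected from outside it. The pseudo-random sampler below is an illustrative stand-in for that extra-formal event, not part of quantum theory's formalism.

```python
import numpy as np

# Formal (non-process) part: a normalised state and unitary evolution.
# Born metarule part: |psi_i|^2 are outcome probabilities; the individual
# outcome is supplied from *outside* the formalism (here, by a sampler).

rng = np.random.default_rng(1)

psi = np.array([1.0, 1.0j, -1.0]) / np.sqrt(3.0)  # normalised 3-level state
probs = np.abs(psi) ** 2
print(probs.sum())  # approximately 1: 'conservation of probability'

# A random unitary via QR decomposition; U @ psi is again normalised,
# so the formal structure preserves total probability.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
psi2 = U @ psi
probs2 = np.abs(psi2) ** 2
print(probs2.sum())  # still approximately 1 after unitary evolution

# The contingent measurement event itself: a single random outcome,
# not derivable from the unitary formalism above.
outcome = rng.choice(3, p=probs2)
```

The formalism proves that `probs2` always sums to 1; which single `outcome` occurs on a given run is exactly what the Born metarule adds from outside the formal structure.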
Together the successes and failures of physics suggest that a generalisation of the traditional use of syntactical information theory is required to model reality, and that this has now been identified as a semantic information system which has the form of a stochastic neural network.
Fig.3 gives a graphical depiction of the bootstrapping of a pure semantic information system, showing the stochastic neural-network-like process system from which the semantic system is seeded or bootstrapped. Via a Self-Organised Criticality Filter (SOCF) this seeding system is removed or hidden. From the process system, driven by Self-Referential Noise (SRN), there are emergent truths, some of which are generically true (ensemble truths) while others are purely contingent. The ensemble truths are also reachable from the Induced Formal System as theorems, but the contingent truths, because of the non-process nature of the induced formal system, cannot be reached from it. In this manner there arises a Gödel-type boundary. The existence of this boundary leads to induced metarules that enhance the induced formal system, if that system alone is to be used in higher-order phenomenology.
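The distinction between ensemble truths and contingent truths can be illustrated with an invented toy system (an assumption for illustration, not Cahill's actual model): run a noise-driven process many times and separate the features stable across the whole ensemble of runs from those that differ from run to run.

```python
import numpy as np

# Toy noise-driven process: a complex variable z is pulled toward the unit
# ring |z| = 1 while being kicked by noise. Across many runs, the radius
# near 1 is an 'ensemble truth'; the final phase of any one run is
# 'contingent'. The dynamics is invented for illustration only.

rng = np.random.default_rng(2)

def run(steps=2000, dt=0.01, noise=0.05):
    z = rng.normal() + 1j * rng.normal()  # contingent initial condition
    for _ in range(steps):
        # deterministic pull toward |z| = 1, plus stochastic input
        z += dt * z * (1.0 - abs(z) ** 2) \
             + noise * np.sqrt(dt) * (rng.normal() + 1j * rng.normal())
    return z

finals = [run() for _ in range(200)]
radii = np.abs(finals)
phases = np.angle(finals)

print(radii.mean())  # ensemble truth: every run settles near radius 1
print(phases.std())  # contingent: phases are scattered around the circle
```

A formal system induced from the ensemble could state "the radius is 1" as a theorem, but it has no way to reach the particular phase realised in any single run, mirroring the Gödel-type boundary in the figure.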
Western science and philosophy have always been dominated by non-process thought. This ‘historical record’ or being model of reality has been with us since Parmenides of Elea and his student Zeno, and is known as the Eleatic model (c. 500 BCE). Nevertheless, and for the dubious reason of generating support for his supervisor’s being model of reality, Zeno gave us the first insights into the inherent problems of comprehending motion, a problem long forgotten by conventional non-process physics, but finally explained by process physics. The becoming or processing model of reality dates back to Heraclitus of Ephesus (540-480 BCE), who argued that common sense is mistaken in thinking that the world consists of stable things; rather the world is in a state of flux. The appearances of ‘things’ depend upon this flux for their continuity and identity. What needs to be explained, Heraclitus argued, is not change, but the appearance of stability. With process physics western science and philosophy are now able to move beyond the moribund non-process mindset. It was Gödel who demonstrated beyond any doubt that the non-process system of thought had fundamental limitations; implicit in his work is the realisation that the whole reductionist mindset, which goes back to Thales of Miletus, could not in the end offer an effective account of reality. However the notion that there were limits to syntactical or symbolic encoding is actually very old; Priest has given an account of that history. In the East the Buddhists in particular were amazingly advanced in their analysis and comprehension of reality. Stcherbatsky, writing about the extraordinary achievements of Buddhist logic in the 6th and 7th centuries CE, noted that:
Reality according to Buddhists is kinetic, not static, but logic, on the other hand, imagines a reality stabilized in concepts and names. The ultimate aim of Buddhist logic is to explain the relation between a moving reality and the static constructions of logic.
In the West the process system approach to reality was developed, much later, by such process philosophers as Peirce, James, Bergson and Whitehead, to name a few, although their achievements were very limited and substantially flawed, constrained as they were by the physical phenomena known to them. A collection of their writings is available in . Perhaps a quote from Charles Peirce, writing in 1891, gives the sense of their thinking:
The one intelligible theory of the universe is that of objective idealism, that matter is effete mind, inveterate habits becoming physical laws. But before this can be accepted it must show itself capable of explaining the tridimensionality of space, the laws of motion, and the general characteristics of the universe, with mathematical clearness and precision; for no less should be demanded of every philosophy.
With process physics we have almost achieved this end, and Wheeler has already expressed this notion of inveterate habits as “law without law”. As the astute reader will note, the self-referentially limited neural network model that underpins process physics is remarkably akin to Peirce’s effete mind. But it is the limitations of syntax, and the need for intrinsic or semantic information ‘within’ reality at all levels (the insight that reality is not imposed from outside), that drive us to this approach. Einstein, the modern-day Eleatic thinker, realised all too well the limitations of non-process thinking, but was unable to move out of the non-process realm that the West had created for itself, for according to Carnap:
Once Einstein said that the problem of the Now worried him seriously. He explained that the experience of the Now means something special for man, something essentially different from the past and the future, but that this important difference does not and cannot occur within physics. That this experience cannot be grasped by science seems to him a matter of painful but inevitable resignation. I remarked that all that occurs objectively can be described in science: on the one hand the temporal sequence of events is described in physics; and, on the other hand, the peculiarities of man’s experiences with respect to time, including his different attitude toward past, present and future, can be described and (in principle) explained in psychology. But Einstein thought that scientific descriptions cannot possibly satisfy our human needs; that there is something essential about the Now which is just outside of the realm of science.