BioSystems 56 (2000) 95-120  www.elsevier.com/locate/biosystems

Classical and quantum dynamics on p-adic trees of ideas

Andrei Khrennikov
Department of Mathematics, Statistics and Computer Sciences, University of Växjö, S-35195 Växjö, Sweden

Received 11 June 1999; received in revised form 27 December 1999; accepted 17 January 2000
Abstract

We propose mathematical models of information processes of unconscious and conscious thinking (based on a p-adic number representation of mental spaces). Unconscious thinking is described by classical cognitive mechanics (which generalizes Newton's mechanics). Conscious thinking is described by quantum cognitive mechanics (which generalizes the pilot wave model of quantum mechanics). The information state and motivation of a conscious cognitive system evolve under the action of classical information forces and a new quantum information force, namely, the conscious force. Our model might provide mathematical foundations for some cognitive and psychological phenomena: collective conscious behavior, the connection between physiological and mental processes in a biological organism, Freud's psychoanalysis, hypnotism, homeopathy. It may be used as the basis of a model of conscious evolution of life. © 2000 Elsevier Science Ireland Ltd. All rights reserved.

Keywords: p-Adic number representation; Classical information force; Quantum information force; Conscious evolution of life

This investigation was supported by the grant 'Strategical investigations' of Växjö University and by visiting professor fellowships at the University of Clermont-Ferrand and Tokyo Science University.
1. Introduction

It seems that modern physics can in principle explain (or at least describe) all phenomena which are observed in reality: the motion of classical and quantum systems, classical and quantum fields, ..., physiological processes in biological organisms. This incredible power of physics induced the common opinion that all biological processes could be reduced to some physical processes. This concerns not only primary physiological processes in biological organisms such as, for example, the functioning of the blood system, but even biological processes of the highest level of complexity, namely, cognitive processes. The idea that by studying physiological processes in the brain we could explain (probably after many years of intensive research) the functioning of the brain quickly propagated throughout the biological community (see, for example, Skinner, 1953; Lorenz, 1966; Dawkins, 1976; Clark, 1980, for reductionist psychological theories). Hence it is widely supposed that the phenomenon of consciousness can be reduced to some (probably still unknown) physical phenomena. Such an idea seems natural and attractive (at least at the present time). However, I do not support this viewpoint. I think that the phenomenon of consciousness will never be
reduced to ordinary physical phenomena. And modern neurophysiological activity gives some evidence of this. Numerous investigations have been performed to study the processes of the exchange of electric signals in the brain (see, for example, Eccles, 1974; Amit, 1989) and to study the localization of these processes in different domains of the brain (see, for example, Cohen et al., 1997; Courtney et al., 1997, for fascinating experiments (based on functional magnetic resonance imaging) on memory neuron configurations, or Hoppensteadt, 1997, for frequency domain models of the exchange of signals in the brain). Nevertheless, despite all of these investigations and the large amount of new information on physical processes in the brain, we do not now understand the phenomenon of consciousness much better than 100 years ago.

In the present paper we propose a new physical-mathematical model of brain functioning (see Khrennikov, 1998a). This model is not based on the modern (Newton-Einstein) picture of physical reality (in particular, we do not use the real space R³ as the mathematical basis of our model). We consider a new type of reality, namely, the reality of information. Cognitive systems are interpreted as transformers of information. For transformers of information we develop the formalism of classical mechanics on mental space (the space of ideas). In particular, this theory describes the evolution of human ideas. The general formalism of classical cognitive mechanics is developed by analogy with the formalism of ordinary Newtonian mechanics, which describes the motion of material systems. We propose cognitive analogues of Newton's laws of classical mechanics. Mathematically these laws can be described by differential equations (on mental spaces)1.
1 At first sight it is quite surprising that motions of material systems and mental systems (ideas) are described by the same mathematical equations (Newton or Hamilton equations). The only difference is that these objects evolve in different spaces (Newton's real space and mental space, respectively). However, if we consider, instead of the motion of real material objects, the motion of information about these objects, then such a coincidence of the equations of motion for material and mental systems would not seem so surprising.
Starting with the initial idea x0 we can obtain the trajectory q(t) in mental space. However, the classical cognitive mechanics is not obtained as just a copy of ordinary classical mechanics. First of all, in cognitive models the time t (a parameter of the evolution of ideas) cannot always be identified with the physical time t_phys which is used in ordinary physical models. This is the internal time of a cognitive system (we can call it mental or psychological time). The velocity v(t) of the evolution of an idea (calculated, as in Newton's mechanics, as the derivative v(t) = dq(t)/dt) has the meaning of the motivation (to change the information state q(t) of a cognitive system). Forces f(t, q) and potentials V(t, q) are information (mental) forces and potentials which are applied to the information states of cognitive systems. An information force changes the motivation, and this change of motivation implies a change of the information state q of a cognitive system.

The mathematical formalization of the classical cognitive mechanics cannot be done in the framework of real analysis. The real line R and the Euclidean space R³ (and even real manifolds) are not directly related to cognitive information processes. We use another number system, namely, the system of so-called p-adic numbers (integers) Z_p (see Borevich and Shafarevich, 1966; Schikhov, 1984; Khrennikov, 1994; Vladimirov et al., 1994), as the mathematical basis of our model. Here p > 1 is a prime number which is the parameter of the model. Mathematical details can be found in Section 7. This section also contains some biological motivations (namely, the ability to form associations) for choosing Z_p as the mathematical basis of the model. Geometrically we can imagine Z_2 as a tree starting with some symbol (a root of the 2-adic tree, which can be interpreted as the signal to start the creation of the space of ideas of a cognitive system). This root-symbol generates two branches 0 and 1 (the first level of the tree); each vertex of the first level generates two branches to two new vertices (the second level of the tree). Thus there are now four branches 00, 01, 10, 11. Such a process is continued for an infinite number of steps. As a result, there appears an infinite 2-adic tree with branches x = a0 a1 ... which are
identified with 2-adic numbers x = Σ_{j=0}^{∞} a_j 2^j. The 2-adic algebraic structure on this tree gives the possibility of adding, subtracting and multiplying branches of this tree (Fig. 1). We use p-adic trees for prime numbers p only for mathematical reasons (see Section 7). The same information model can be developed for any homogeneous tree with m branches at each level. It is even possible to consider trees in which the number of branches m_j depends on the level.

The information processes in the brain described by the classical cognitive mechanics are closely connected with neurophysiological processes. Roughly speaking, neurophysiology describes the 'hardware' of the brain and the classical mechanics on mental spaces describes the 'software' of the brain. Some mathematical models of this software have been presented in Khrennikov, 1997; Albeverio et al., 1998; Khrennikov, 1998b; Albeverio et al., 1999; Dubischar et al., 1999. The models of Khrennikov, 1997; Albeverio et al., 1998; Khrennikov, 1998a; Albeverio et al., 1999; Dubischar et al., 1999, were discrete time models, namely, it was assumed that the time parameter t for the evolution of ideas is discrete: t = 0, 1, 2, ... (thus chains of ideas x0, x1, ... were studied in these models). In the present paper we study 'continuous time' evolution. On the one hand, this gives the possibility to apply (at least formally) the scheme of the standard formalism of classical mechanics. On the other hand, in the present model we can discuss carefully the meaning of 'mental time' (and 'mental velocity'). The classical cognitive mechanics describes unconscious cognitive processes.
Fig. 1. The 2-adic tree.
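To make the tree construction concrete, the following small sketch (ours, not part of the paper) generates the first levels of a p-adic tree and identifies a finite branch a0 a1 ... with the integer Σ_j a_j p^j; the function names are purely illustrative.

```python
from itertools import product

def tree_level(n, p=2):
    """All branches a0 a1 ... a_{n-1} at level n of the p-adic tree."""
    return ["".join(str(d) for d in digits) for digits in product(range(p), repeat=n)]

def branch_to_int(branch, p=2):
    """Identify a finite branch a0 a1 ... a_{k-1} with the integer sum_j a_j * p**j."""
    return sum(int(a) * p**j for j, a in enumerate(branch))

print(tree_level(1))          # ['0', '1']
print(tree_level(2))          # ['00', '01', '10', '11']
print(branch_to_int("011"))   # 0*1 + 1*2 + 1*4 = 6
```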
The phenomenon of consciousness cannot be explained by the formalism of the classical cognitive mechanics. To explain this phenomenon, we develop a variant of quantum cognitive mechanics. In this model an idea moves in mental space not only due to classical information forces (which can in principle be reduced to the functioning of the brain's 'hardware' (neurophysiological processes in the brain)), but also due to a new information force, namely, the quantum information force. This quantum information force (which will be called a conscious force and denoted by f_C(q)) is induced by an additional information potential (a quantum potential on mental space, or conscious potential, C(q)). The conscious potential C(q) cannot be reduced to neurophysiological processes in the brain; it is induced by mental processes. The conscious potential is induced by a wave function of a cognitive system (by the same relation as in the ordinary pilot wave theory for material systems). In our model this wave function is nothing other than an information field (a conscious field). In the mathematical formalism this field is described as a function C: X_men → X_men, where X_men is a mental space. The evolution of the C-function is described by an analogue of the Schrödinger equation on mental space. In fact, our formalism of conscious forces and fields is a (natural) extension of the well known pilot wave theory (developed by Bohm, 1951; De Broglie, 1964; Bell, 1987 and many others) to cognitive phenomena. Even in the pilot wave theory for material systems (especially in the variant developed in the book of Bohm and Hiley (1993)) the quantum wave function is merely an information field; but it is defined on the real space R³ of localization of material systems, it acts on material objects, and the problem of information-matter interaction is not clear in this framework. In our model a conscious field (C-function) is associated with purely mental processes and it acts on mental objects, ideas. By our model, each classical (unconscious) information state of a cognitive system (the collection of ideas and the mental processes in which these ideas are involved) produces a new (non-classical) field, the conscious field C. This field induces a new information force f_C which produces a permanent
perturbation of the evolution of an idea in the mental space. This C-function is nothing other than the human consciousness. Of course, our formalism is just a first step towards describing the phenomenon of consciousness on the basis of a model of information reality. However, even this formalism implies some consequences which might be interesting for neurophysiology, psychology, artificial intelligence (complex information systems), evolutionary biology and the social sciences. Here we present briefly some of these consequences. Flows of cognitive information in the brain (and in other cognitive systems) can be described mathematically in a manner similar to the classical Newtonian mechanics for motions of material systems. Therefore the motion of ideas (notions, images) in the brain has a deterministic character (of course, such a motion is perturbed by numerous information noises; see Dubischar et al., 1999, for details). This motion in mental space is not an evolution with respect to physical time t_phys, but with respect to mental time t. Information potentials can connect different thinking processes (in a single brain as well as in a family of brains). Consciousness cannot be induced by the physical activity of material structures (for example, groups of neurons). It is induced by groups of evolving ideas. These dynamical groups of ideas produce a new information field, the conscious field, which induces a new information force, the conscious force, which is the direct analogue of the quantum force in the pilot wave theory for quantum material systems. This conscious force plays a great role in the information dynamics of the human brain (and of other conscious cognitive systems). As in the classical cognitive mechanics, in quantum cognitive mechanics conscious potentials can connect thinking processes in different cognitive systems (even in the absence of physical potentials and forces). Therefore it is possible to speak about a collective consciousness for a group of cognitive systems (in particular, human individuals). We also note that different conscious potentials (conscious C-fields) induce conscious forces f_C of different (information) strength. The magnitude of the consciousness can be measured (at least theoretically). Thus
different cognitive systems (in particular, different human individuals) may have different levels of the conscious C-field. By our model we need not suppose that consciousness is a feature of the human brain only. Other cognitive systems (in particular, animals and even nonliving systems) induce conscious C-fields which (via conscious forces f_C) control (or at least change) their cognitive behaviours. From this point of view human individuals and animals differ only in the behaviours of their conscious C-fields. As one application of our formalism to psychology, we try to explain Freud's psychoanalysis on the basis of our model as the process of the reconstruction of the conscious field of an individual i having some mental disease via an information coupling with a psychoanalyst p (on the level of the collective C-function of the system (i, p)).
2. Classical cognitive mechanics

First we recall some facts from Newton's classical mechanics. In Newton's model motions of material systems are described by trajectories in the space X_mat of localization of material systems2. Thus, starting with the initial position q0, a material object A evolves along the trajectory q(t) in X_mat (where t is physical time). The main task of Newton's mechanics is to find the trajectory q(t) in the space X_mat. Let us restrict our considerations to the case in which A has mass 1 (this can always be done via the choice of the unit of mass). In this case the momentum p(t) of A is equal to the velocity v(t) of motion. In the mathematical model the velocity can be found as v(t) = dq(t)/dt = q̇(t). The velocity need not be a constant. Thus it is useful to introduce the acceleration a(t) of A, which is the velocity of the velocity. The second Newton law says:

a(t) = f(t, q),   (1)
2 In the mathematical model X_mat = R³ or some real manifold.
where f(t, q) is the force applied to A. As the mass m = 1 and the momentum p = mv is equal to the velocity, we have

ṗ(t) = f(t, q), p(0) = p0, t, q, p ∈ X_mat.   (2)
By integrating this equation we find the momentum p(t) at each instant t of time (if the initial momentum p0 is known). Then by integrating the equation

q̇(t) = p(t), q(0) = q0, t, q, p ∈ X_mat,   (3)
we find the position q(t) of A at each instant t of time (if the initial position q0 is known).

We develop the formalism of the classical cognitive mechanics by analogy with Newton's mechanics. Instead of the material space X_mat we consider a mental space X_men (see Section 7 for the mathematical model). A cognitive system τ is a transformer of information: an information state q ∈ X_men (the collection of all ideas of τ) is in the process of continuous evolution; τ makes transformations q → q′ → q″ → .... The time parameter of this evolution is also an information parameter (the mental time of τ), t ∈ X_men. Thus the activity of τ generates a trajectory q(t) in the mental space X_men. Our deterministic cognitive postulate (which is a generalization of Newton's deterministic mechanical postulate) is that the trajectory q(t) of the evolution of ideas is determined by initial conditions and forces. As in Newton's mechanics, we introduce the velocity v(t) of the change of the idea q(t). This is again an information quantity (a new idea). It can be calculated as the derivative (in the mental space X_men) of q(t) with respect to mental time t. We start with the development of the formalism for a cognitive system τ with information mass 1. Here we can identify the velocity v with the momentum p = mv. We shall call p a motivation to change the information state q(t). We postulate that the cognitive dynamics in X_men is described (at least for some cognitive processes) by an information analogue of Newton's second law. Thus the trajectory p(t) of the motivation of τ is described by the equation

ṗ(t) = f(t, q), p(0) = p0, t, q, p ∈ X_men,   (4)
where f(t, q) is an information force (generated by external flows of information; in particular, by
other cognitive systems). Thus if the initial motivation p0 and the information force f(t, q) are known, then the motivation p(t) can be found at each instant of mental time t by integration of Eq. (4). The trajectory q(t) of the evolution of ideas can be found by integration of the equation

q̇(t) = p(t), q(0) = q0, t, q, p ∈ X_men,   (5)
(if the initial idea q0 is known).

We recall that in Newton's mechanics a force f(q), q ∈ X_mat, is said to be potential if there exists a function V(q) such that f(q) = −dV(q)/dq. The function V(q) is called a potential. We use the same terminology in the cognitive mechanics. Here both a force f and a potential V are functions defined on the space of ideas X_men. The potential V(q), q ∈ X_men, is an information potential, an information field, which interacts with a cognitive system τ. Such fields are classical (unconscious) cognitive fields.

As we have already mentioned, mental time t need not coincide with physical time t_phys. Mental time corresponds to the internal scale of an information process. For example, for a human individual τ, the parameter t describes the 'psychological duration' of mental processes. Our conscious experience demonstrates that periods of the mental evolution which are quite extended on the t_phys-scale can be extremely short on the t-scale, and vice versa. In general, instants of mental time are ideas which denote stages of the information evolution of a cognitive system. We remark that t_phys can also be interpreted as a chain of ideas (about counts n = 1, 2, ..., for discrete t_phys, and about counts s ∈ R, for continuous t_phys). In principle, physical time t_phys can be considered as a special representation of mental time t. However, it is impossible to reduce all mental times to physical time (even if t_phys is defined up to a transformation, t_phys = u(s_phys)). Different mental systems τ1, ..., τN (and even different mental processes in a single cognitive system τ) have different mental times t1, ..., tN. The use of physical time t_phys can be considered as an attempt to construct a unique time-scale for all mental processes. However, as we have already mentioned, this is impossible. In particular, we cannot claim that in general there is an order structure for t. It can be that
instants t1 and t2 of mental time are incompatible. Thus the set of mental times cannot be imagined as a straight line. The notion of mental time can be illustrated by the following example.
2.1. Example 2.1, reading of a book

Suppose that a human individual τ is reading a book B on the history of ancient Egypt, E. The process of reading, π, is not continuous; τ interrupts π for periods of different duration. Denote by q the state of information of τ on E. In principle, the information evolution of τ can be considered as an evolution with respect to physical time s = t_phys (mechanical clocks): q = f(s). However, the physical parameter s is not directly related to the information process π. For example, the velocity v_s of the information state q with respect to s has nothing to do with the cognitive evolution of τ. Moreover, as a consequence of the jump structure (with respect to s) of π, v_s is not well defined. Denote by Δ_r1 = [s0, s1), Δ_i1 = [s1, s2), ..., the intervals of reading and of interruption of reading. Thus the information process π induces the following splitting of physical time s: Δ_r1, Δ_i1, ..., Δ_rM, Δ_iM, .... The intervals Δ_i1, ..., Δ_iM, ... must be eliminated. A new time parameter s̄ = f(s) is defined by setting s̄ = s on Δ_r1 and deleting the interruption intervals, so that the reading intervals are glued together. The parameter s̄ can be considered as (one of the possible) mental (information) scales for the process π. The use of the time s̄ essentially improves the mathematical description of π. However, there is still no large difference with standard physics3. Suppose now that the intervals Δ_rk, Δ_ik depend on the information that τ obtains in the process π: Δ_rk(a_k), Δ_ik(b_k), where a_k, b_k ∈ X_men are information strings, 'ideas'. Here s̄ = f(s, c) and q = h(s, c), where s ∈ R, c ∈ X_men. The next natural step is to eliminate the real parameter s from the description of the information process π and to consider the evolution of the information state q (on the subject E) with respect to a purely mental parameter t. This is the information on E which is obtained by τ from the corresponding part of B. In the simplest model we can describe t as the text of the book: t = (Ancient Egypt ...) (see Section 8 for the mathematics). So q = q(t) is a transformation of the information t from B into the state of knowledge of τ on E. The trajectory q(t) ∈ X_men depends on the initial information state q0 (on E) of τ, the initial motivation p0 of τ to perform the information process π, and an information force F(t, q) that changes the motivation. For example, if F ≡ 0 and p0 = 0, then q(t) ≡ q0. Thus the reading of the book B does not change the state of knowledge of τ on ancient Egypt. This example demonstrates that the information force F(t, q) which 'guides' the information state q of τ cannot be reduced to external information forces f(t, q) (for example, information from radio, TV and other books). There exists some additional information force, f_C(t, q), which crucially changes the trajectory q(t) ∈ X_men. Even if p0 = 0 and f ≡ 0 (τ is totally isolated from external sources of information and initially has no motivation to change his information state on ancient Egypt), in general q(t) ≢ q0 (the conscious force f_C(t, q) can generate a nonzero motivation to study this subject). Concrete mathematical representations of mental time t by so-called m-adic integers (branches of trees), as well as some other examples, will be given in Section 8.

3 Of course, as f is non-invertible, there are some differences with the standard formalism.
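A minimal numerical sketch of a discrete-time analogue of Eqs. (4) and (5) may help fix ideas. It is ours, not the author's: mental states and motivations are represented as truncated p-adic integers (residues mod p^K), and the information force below is a purely hypothetical choice.

```python
P, K = 5, 8                      # prime p and truncation depth: states live in Z_p mod p**K
MOD = P ** K

def padic_digits(x, n=6, p=P):
    """First n p-adic digits a0, a1, ... of a truncated p-adic integer x."""
    out = []
    for _ in range(n):
        out.append(x % p)
        x //= p
    return out

def evolve(q0, p0, force, steps):
    """Discrete analogue of Eqs. (4)-(5): p_{n+1} = p_n + f(n, q_n), q_{n+1} = q_n + p_{n+1}."""
    q, mom = q0 % MOD, p0 % MOD
    trajectory = [q]
    for n in range(steps):
        mom = (mom + force(n, q)) % MOD      # the force changes the motivation, Eq. (4)
        q = (q + mom) % MOD                  # the motivation changes the idea, Eq. (5)
        trajectory.append(q)
    return trajectory

q_star = 1234                                # a hypothetical 'target' idea
f = lambda n, q: q_star - q                  # toy information force pulling q towards q_star
for q in evolve(q0=7, p0=0, force=f, steps=4):
    print(padic_digits(q))
```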
3. Quantum cognitive mechanics, conscious field

First we recall some facts on quantum mechanics for material systems. The formalism of quantum mechanics was developed for describing motions of physical systems which deviate from the motions described by Newton's Eqs. (2) and (3). For example, let us consider the well known two slit experiment. There is a point source of light O and two screens S and S′. The screen S has two slits h1 and h2. Light passes through the slits of S and finally reaches the screen S′, where we observe interference rings. Let us consider light as a flow of particles, photons. Newton's equations of motion (Eqs. (2) and (3)) cannot explain the interference phenomenon: 'classical forces' f involved in this experiment could not rule photons
in such a way that they concentrate in some domains of S′ and practically cannot appear in some other domains of S′. The natural idea (see Bohm, 1951; De Broglie, 1964) is to assume that there appears some additional force f_Q, the quantum force, which must be taken into account in Newton's equations. Thus, instead of Eq. (2), we have to consider the perturbed equation

ṗ(t) = f(t, q) + f_Q(t, q), p(0) = p0, t, q, p ∈ X_mat.   (6)
It is natural to assume that this new force, f_Q(t, q), is induced by some field C(t, q). This field C(t, q) can be found as a solution of the Schrödinger equation

(h/i) ∂C/∂t (t, q) = (h²/2) ∂²C/∂q² (t, q) − V(t, q) C(t, q).   (7)
Thus each quantum system propagates together with a wave which 'guides' this particle. Such an approach to quantum mechanics is called pilot wave theory. Formally there are two different objects: a particle and a wave. In reality there is one physical object: a particle which is guided by the pilot wave4. The C-field associated with a quantum system has some properties which imply that C(q) cannot be interpreted as an ordinary physical field (such as, for example, the electromagnetic field). The quantum force f_Q(q) is not connected with C(q) by the ordinary relation f = −dC(q)/dq. The ordinary relation between a force f and a potential V implies that the scaling V → cV, where c is a constant, implies the same scaling for the force, namely f → cf. In contrast to this classical relation, the quantum force f_Q is invariant with respect to the scaling C → cC of the C-function. Thus the magnitude of the C-function ('quantum potential') is not directly connected with the magnitude of the quantum force f_Q.

4 The pilot wave theory does not give the standard interpretation of quantum mechanics, namely, the orthodox Copenhagen interpretation. By the latter interpretation it is impossible to describe individual trajectories of quantum particles. Probably an analogue of the orthodox Copenhagen interpretation could also be interesting for quantizing the classical cognitive mechanics. However, in the present paper we shall concentrate on an analogue of the pilot wave formalism.
According to Bell (1987) and Bohm and Hiley (1993), C(q) is merely an information field on the material space X_mat. For example, in Bohm and Hiley (1993) C(q) is compared with a radio signal which rules a large ship with the aid of an autopilot. Here the amplitude of the signal is not important; only the information carried by this signal is taken into account.5

From the introduction to this paper it is clear how we can transform the classical cognitive mechanics into quantum cognitive mechanics, conscious mechanics. The main motivation for such a development of the classical cognitive mechanics is that the behavior of conscious systems cannot be described by a 'classical information force' f. The behavior of a conscious cognitive system strongly differs from the behavior of an unconscious cognitive system (even if both these systems are ruled by the same classical information force f). Thus our information generalization (4) of the second Newton law is violated for conscious cognitive systems. As in the case of material systems, it is natural to suppose that there exists some additional information force f_C(q), a conscious force, associated with a cognitive system. This force changes the trajectory of a cognitive system in the space of ideas X_men. The new 'quantum' conscious trajectory is described by the equation

ṗ(t) = f(t, q) + f_C(t, q), p(0) = p0, t, q, p ∈ X_men.   (8)
The conscious force f_C(t, q) is connected with a C-field, a conscious field, by the same relation as in the pilot wave formalism for material systems. An information Schrödinger equation (see Section 10) describes the evolution of the conscious C-field.

5 The pilot wave theory does not give a clear answer to the question: is some amount of physical energy transmitted by the C-field or not? The book of Bohm and Hiley (1993) contains an interesting discussion of this problem. It seems that, despite their general attitude toward the information interpretation of C, they still suppose that C must carry some physical energy. Compared with the energy of a quantum system, this energy is negligible (as in the example with the ship). Another interesting consequence of the Bohm-Hiley considerations is that quantum systems might have a rather complex internal structure (roughly speaking, a quantum system must contain some device to transfer information obtained from the C-field).
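The scale invariance mentioned above (f_Q does not change under C → cC) is easy to check numerically. The sketch below is ours and uses the standard Bohmian expression Q = −(h²/2m) R''/R, with R = |C|, for a toy Gaussian packet on a real grid; it only illustrates this one property, not the p-adic formalism itself.

```python
import numpy as np

h, m = 1.0, 1.0
q = np.linspace(-4.0, 4.0, 801)
dq = q[1] - q[0]

def quantum_potential(psi):
    """Bohmian quantum potential Q = -(h**2 / 2m) * R'' / R with R = |psi| (finite differences)."""
    R = np.abs(psi)
    d2R = np.gradient(np.gradient(R, dq), dq)
    return -(h**2 / (2.0 * m)) * d2R / R

psi = np.exp(-q**2 / 2.0)                     # a Gaussian packet as a toy C-field
Q1 = quantum_potential(psi)
Q2 = quantum_potential(7.0 * psi)             # rescaled field c * C

# The quantum potential, and hence the quantum force -dQ/dq, ignores the amplitude of the field.
print(np.allclose(Q1, Q2))                    # True
```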
In ordinary quantum mechanics the origin of the C-field is not clear. It seems natural to me that C(q) is generated by a quantum particle. However, this assumption is too speculative for material quantum systems, because there is no experimental evidence that a quantum particle has a complex internal structure which could generate the C-field.6 In the pilot wave formalism it is supposed that the C-field is created simultaneously with a quantum particle (this field is only formally treated as separate from the particle). In quantum cognitive theory the assumption of a complex internal structure of a quantum (conscious) cognitive system is quite natural. In principle we may suppose that the conscious field C(q), q ∈ X_men, is generated by classical information processes in a cognitive system τ. Moreover, it is natural to suppose that a higher information complexity of τ implies that the C-field of τ induces an information force f_C of larger information magnitude. We recall that in the pilot wave formalism (both for material and mental systems) the magnitude of C is not directly related to the magnitude of f_C. At the present stage of knowledge on cognitive phenomena the idea that the C-field is generated by τ seems to be the most natural.7
6 Even the Bohm-Hiley considerations on the complex internal structure of quantum particles do not go so far as to assume that a quantum particle is a generator of the C-field. The Bohm-Hiley complexity is merely the complexity of a receiver of radio signals on a ship.
7 On the other hand, if we try to generalize the ideas of material quantum mechanics to cognitive phenomena, then we have to suppose that the C-field is created simultaneously with the creation of a cognitive system τ. Such a viewpoint on the origin of the conscious field implies the great mystery of the act of creation of a conscious cognitive system. Here the conscious field is ignited (by whom?) in a cognitive system. Thus it would seem to be impossible to create artificial cognitive systems by just a 'mechanical' increase of their information complexity.

4. Collective unconscious and conscious cognitive phenomena
In the previous two sections we have studied the classical and quantum mechanical formalisms for individual cognitive systems. In this section we consider collective classical (unconscious) and quantum (conscious) cognitive phenomena. We start with the classical (unconscious) cognitive mechanics. Let τ1, ..., τN be a family of cognitive systems with mental spaces X_men,1, ..., X_men,N. We introduce the mental space X_men of this family of cognitive systems by setting X_men = X_men,1 × ... × X_men,N. Elements of this space are vectors of information states q = (q1, ..., qN) of the individual cognitive systems τj. We assume that there exists an information potential V(q1, ..., qN) which induces information forces f_j(q1, ..., qN). The potential V is generated by information interactions of the cognitive systems τ1, ..., τN as well as by external information fields. The evolution of the motivation p_j(t) and the information state q_j(t) of the jth cognitive system τj is described by the equations:

ṗ_j(t) = f_j(t, q1, ..., qN), p_j(0) = p_0j,   (9)
q̇_j(t) = p_j(t), q_j(0) = q_0j, t, q, p ∈ X_men.   (10)
In general for different j these evolutions are not independent.
4.1. Example 4.1

Let V(q1, q2) = a(q1 − q2)², where a is some information constant (given by a p-adic number in the mathematical model). The motions of the cognitive systems τ1 and τ2 in mental space are not independent; the (information) magnitude of the coupling constant a gives the strength of this dependence. On the other hand, if, for example, V(q1, q2) = q1² + q2², then the motions of τ1 and τ2 are independent.

This model can be used not only for the description of collective cognitive phenomena for a group of different cognitive systems τ1, ..., τN, but also for a family of thinking processes in one fixed cognitive system. For example, it is natural to suppose that the brain contains a large number of dynamical thinking processors (see Khrennikov (1997) for a mathematical model), π1, ..., πN, which produce ideas q1(t), ..., qN(t) related to different domains of human activity.8

8 For example, π1 produces ideas on food, π2 produces ideas on sex, π3 writes poems.
We can apply our classical cognitive (collective) mechanics to describe the simultaneous functioning of the thinking modules π1, ..., πN. The main consequence of our model is that the ideas q1(t), ..., qN(t) and the motivations p1(t), ..., pN(t) do not evolve independently. Their simultaneous evolution is controlled by the information potential V(q1, ..., qN). It must be underlined that an interaction between thinking modules π1, ..., πN has a purely informational origin. The potential V(q1, ..., qN) need not be generated by a physical field (for example, the electromagnetic field). A change of the information state q_j → q_j′ (or of the motivation p_j → p_j′) of one of the thinking processors π_j will automatically imply (via the information interaction V(q1, ..., qN)) a change of the information states (and motivations) of all other thinking blocks. In principle no physical energy is involved in this process of collective cognitive evolution. In some sense this is a process of cognitive (but still unconscious) self-regulation. Different cognitive systems can have different information potentials V(q1, ..., qN) which give different types of connections between the thinking blocks π_j.
4.2. Example 4.2

Let thinking processors π1, π2 and π3 be responsible for science, food and sex, respectively. Let

V(q1, q2, q3) = a1 q1² + a2 q2² + a3 q3² + a12 (q1 − q2)² + a23 (q2 − q3)² + a13 (q1 − q3)².   (11)

If the information constant a1 strongly dominates over all other information constants, then the scientific thinking block π1 works practically independently of the blocks π2 and π3. If a12 (or a13) dominates over all other constants, then there is a strong connection between science and food (or science and sex). Moreover, the information potential V can depend on the mental time of a cognitive system, V = V(t, q1, ..., qN). Thus at different instants of mental time t a cognitive system can have different information connections between the thinking blocks π1, ..., πN.

We are now going to describe collective quantum (conscious) phenomena. Let τ1, ..., τN be a family of cognitive systems. The classical information motion is described by classical (unconscious) information forces9 f_j(t, q1, ..., qN) via the cognitive second Newton law (Eq. (9)). However, as in the pilot wave formalism for many particles, for any family τ1, ..., τN of cognitive systems there exists a C-field, C(q1, ..., qN), of this family. This field is defined on the mental space X_men = X_men,1 × ... × X_men,N. This field generates additional information forces (conscious forces) f_{j,C}(t, q1, ..., qN), and Newton's (classical/unconscious) cognitive dynamics must be changed to the (quantum/conscious) cognitive dynamics

ṗ_j(t) = f_j(t, q1, ..., qN) + f_{j,C}(t, q1, ..., qN), j = 1, 2, ..., N.   (12)

In general the conscious force f_{j,C} = f_{j,C}(t, q1, ..., qN) depends on all information coordinates q1, ..., qN (the information states of the cognitive systems τ1, ..., τN). Thus the consciousness of each individual cognitive system τj depends on information processes in all cognitive systems τ1, ..., τN. The level of this dependence is determined by the form of the collective C-function. As in the ordinary pilot wave theory, in our cognitive model the factorization

C(t, q1, ..., qN) = ∏_{j=1}^{N} C_j(t, q_j)

of the C-function implies that the conscious force f_{j,C} depends only on the coordinate q_j. Thus the factorization of C eliminates the collective conscious effect. As in the classical cognitive mechanics, the above considerations can be applied to a system of thinking blocks π1, ..., πN of an individual cognitive system τ (for example, the human brain). The conscious field C of τ depends on the information states q1, ..., qN of all thinking blocks.
9 Throughout this paper we use ‘classical’ and ‘quantum’ as synonyms of ‘unconscious’ and ‘conscious.’ In fact, it would be better to use only the biological terminology. But we prefer to use also the physical terminology to underline the parallel development of mechanical formalisms for material and mental systems.
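The effect of the coupling constants in Examples 4.1 and 4.2 can be illustrated by a small simulation (ours; real numbers are used only as a convenient stand-in for the p-adic mental space, and all parameter values are arbitrary).

```python
def max_gap(a, steps=200, dt=0.05):
    """Largest separation |q1 - q2| for two blocks coupled by V(q1, q2) = a*(q1 - q2)**2."""
    q1, q2 = 0.0, 0.0
    p1, p2 = 0.4, -0.4                  # opposite initial motivations
    worst = 0.0
    for _ in range(steps):
        f1 = -2.0 * a * (q1 - q2)       # f_j = -dV/dq_j, as in Example 4.1
        f2 = +2.0 * a * (q1 - q2)
        p1, p2 = p1 + dt * f1, p2 + dt * f2
        q1, q2 = q1 + dt * p1, q2 + dt * p2
        worst = max(worst, abs(q1 - q2))
    return worst

print(max_gap(a=0.0))   # ~8.0 : uncoupled blocks drift apart
print(max_gap(a=2.0))   # ~0.3 : the coupling keeps the two information states together
```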
5. Information connection between mental and physiological processes

Classical and quantum fields, V(q1, ..., qN) and C(q1, ..., qN), induce a dependence between the individual thinking blocks π1, ..., πN of a cognitive system τ, or between individuals τ1, ..., τN belonging to a social group G. In particular, this implies that all physiological systems of the organism are closely connected on the information level. Therefore a disease in one of these systems may have an influence on other systems (even if they have no close connection on the physiological level). Of course, this is not a new fact for medicine. But we now have a mathematical model (see Sections 7-9). And, in principle, we could (at least after development of the model) compute some effects of the information influence on physiological processes. Moreover, purely mental processes in the brain (which are not directly related to physiological processes) are connected on the information level with physiological processes. For example, let the mental block π1 control the functioning of the heart, let the block π2 control some psychological process (for example, relations with some person), and let the classical information potential be V(q1, q2) = a q1 q2, where a is an information coupling constant (given by a p-adic number in our mathematical model). Then the purely mental process in π2 has an influence on the functioning of the heart. The classical information force f(q1, q2) applied to π1 is equal to −a q2. Thus it depends on the evolution q2(t) of the psychological process. The presence of the conscious field C(q1, ..., qN) makes the connection between physiological and purely mental processes more complicated. There is the possibility of conscious control of human physiological systems. In principle, if a person could change his or her conscious field C(q1, ..., qN), he or she could change (by just an information influence) the functioning of some physiological systems.

Our model explains well the origin of homeopathy. In fact, by a homeopathic treatment it is possible to change the information potential V(q1, ..., qN) of the organism. The microscopic quantities of medicines which are used in the homeopathic treatment are just sources of information. In principle, a homeopathic medicine need not be applied directly to the ill physiological system π_k (described by the information state q_k). The information concentrated in the homeopathic medicine could be applied to some other information state q_j, j ≠ k. The change of q_j, q_j → q_j′, will imply a change of the trajectory q_k(t) (via the change of the information force f_k(t, q1, ..., q_k, ..., q_j, ..., qN) → f_k(t, q1, ..., q_k, ..., q_j′, ..., qN)).

6. Freud's psychoanalysis as a reconstruction of the conscious field

By Freud's theory, Freud (1933), the mental space X_i of a human individual i is split into two domains: (1) a domain of conscious ideas X_i^c; (2) a domain of unconscious ideas X_i^u. Thus X_i = X_i^c ∪ X_i^u. In our information model Freud's idea is represented in the following way. Let f: X_i → X_i be some function. As usual, we define the support of f as the set supp f = {x ∈ X_i : f(x) ≠ 0}. Let C be the conscious field generated by the individual i and f_C be the corresponding conscious force. Then supp f_C is the set of conscious ideas (ideas which can interact with the C-field), X_i^c = supp f_C. The set X_i \ supp f_C is the set of unconscious ideas X_i^u (ideas which cannot interact with the C-field).10 The motion of i in the space of ideas X_i is described by the dynamical system:

ṗ_i(t) = f(t, q_i) + f_C(t, q_i), q_i ∈ X_i,   (13)
where f = −∂V_i/∂q_i is the classical (unconscious) force generated by the classical information potential V_i of i, and f_C = −∂C_i/∂q_i is the quantum (conscious) force generated by the conscious information potential C_i of i.

10 We remark that the sets of ideas supp f_C and supp C do not coincide. It can be that supp f_C is a proper subset of supp C.

In the subspace X_i^u of unconscious ideas this dynamical system reduces to the system:

ṗ_i(t) = f(t, q_i), q_i ∈ X_i^u.   (14)
Let D be some domain in X_i^u and let the classical information potential V_i(t, q_i), q_i ∈ X_i^u, have a form such that the dynamical system (14) has the domain D as a domain of attraction of trajectories. Thus, starting with any initial idea q0 ∈ X_i^u, the information state q_i(t) of i will always evolve towards D. The dynamical system (14) is located in the space of unconscious ideas. Here the conscious force f_C is equal to zero. Therefore i cannot consciously change the dynamics (14). Suppose now that D is some domain of 'bad ideas'. For example, if D is a domain of 'black ideas', then i has a depression; if D is a domain of ideas connected with alcohol, then i has problems with alcohol; if D is a domain of aggressive ideas, then i will demonstrate aggressive behavior (this behavior looks totally unmotivated: starting with an arbitrary unconscious idea q0, the individual i will always arrive at aggression). The aim of psychoanalysis is to extend the domain of conscious ideas X_i^c = supp f_C. This extension will perturb the dynamics (14) by the action of a conscious force f_C. This perturbation may change the evolution of ideas in such a way that the domain D will no longer be a domain of attraction for the whole space of unconscious ideas X_i^u. Starting with q0 ∈ X_i^u, the individual i can have trajectories q_i(t) which will never be attracted by the domain of 'bad ideas' D.

The pair consisting of a cognitive system i and a psychoanalyst p can be considered as a coupled system of transformers of information. The information coupling between i and p will generate a new classical information potential V_{i,p}(t, q_i, q_p) which is defined on the mental space X = X_i × X_p, where X_i and X_p are the spaces of ideas of the individual i and the psychoanalyst p, respectively. The dynamics of the conscious field C_i(t, q_i) of i is described by the Schrödinger equation

(h/i) ∂C_i/∂t (t, q_i) = (h²/2) ∂²C_i/∂q_i² (t, q_i) − V_i(t, q_i) C_i(t, q_i).   (15)
The dynamics of the conscious field C_{i,p}(t, q_i, q_p) of the system (i, p) is described by the Schrödinger equation

(h/i) ∂C_{i,p}/∂t (t, q_i, q_p) = (h²/2) (∂²/∂q_i² + ∂²/∂q_p²) C_{i,p}(t, q_i, q_p) − V_{i,p}(t, q_i, q_p) C_{i,p}(t, q_i, q_p).   (16)
If now the conscious force

f̃_C(t, q_i, q_p) = −∂C_{i,p}(t, q_i, q_p)/∂q_i ≠ 0   (17)
for some ideas q_i ∈ X_i^u and at least for some ideas q_p ∈ X_p, then the motion of i in the domain of unconscious ideas X_i^u can be controlled consciously (here the C_{i,p}(t, q_i, q_p) appearing in (17) is the conscious potential induced by the conscious field C_{i,p} of Eq. (16)). In fact, this means that X_i^u is reduced and X_i^c is extended. The aim of the psychoanalyst p is to find ideas q_p ∈ X_p such that (17) takes place for unconscious ideas q_i ∈ X_i^u of i. As the process of psychoanalysis is a conscious process (at least for p), it is natural to assume that the ideas q_p used by p to induce condition (17) are conscious: q_p ∈ X_p^c. Typically such ideas are represented in the form of special questions to i. In some sense this is a kind of conscious intervention of the psychoanalyst p in the unconscious domain of the individual i. If p finds a domain O ⊂ X_i^u in which condition (17) is satisfied, then in this domain the dynamics (14) is transformed into the conscious dynamics

ṗ_i(t) = f̃(t, q_i, q_p) + f̃_C(t, q_i, q_p), q_i ∈ O.   (18)
Under some circumstances this dynamical system can be free from the 'pathological features' of the dynamical system (14). Of course, even the change of the classical potential V_i(t, q_i) to a new classical potential V_{i,p}(t, q_i, q_p) changes the motion of i: the new dynamics is ruled by the classical force f̃(t, q_i, q_p) instead of the classical force f(t, q_i). However, it is not easy to strongly change the classical force f(t, q_i) on the domain of unconscious ideas X_i^u by just changing the classical potential. Typically V_{i,p}(t, q_i, q_p) = V_i(q_i) + V_p(q_p) + G(q_i, q_p), where the (information) magnitude of G(q_i, q_p) is small for ideas q_i ∈ X_i^u. On the other hand,
this minor change of the classical potential may induce a strong change of the quantum potential.
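The mechanism described in this section, a domain D that attracts every unconscious trajectory until a force with a larger support is switched on, can be imitated by a deliberately crude toy model (ours; the first-order dynamics and all numbers are arbitrary, and nothing here is specific to psychoanalysis).

```python
def final_state(q0, conscious_on, steps=400, dt=0.02, d=3.0):
    """Toy dynamics: an 'unconscious' force pulls every state towards the bad domain
    D = {q : |q - d| < 0.5}; an extra force with a larger support destroys this attraction."""
    q = q0
    for _ in range(steps):
        f = -(q - d)                                              # attraction towards d
        f_c = 2.0 * (q - d) if conscious_on and abs(q - d) < 2.0 else 0.0
        q += dt * (f + f_c)
    return q

for q0 in (0.0, 1.5, 5.0):
    print(q0, round(final_state(q0, False), 2), round(final_state(q0, True), 2))
# Without the extra force every trajectory ends near d = 3.0 (inside D);
# with it, trajectories settle outside D.
```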
6.0.1. Conclusion

Freud's psychoanalysis is nothing other than the change of the information dynamics of an individual i (having some mental disease) via an extension of the support of the quantum force. Such an extension is an extension of the domain of conscious ideas X_i^c (and a reduction of the domain of unconscious ideas X_i^u). This extension is realized by the information coupling between an individual i and a psychoanalyst p. By a minor change of the classical information potential, p strongly changes the conscious force acting on i. The dynamics of ideas in the unconscious domain of i is changed. This change eliminates the mental disease.

In the same way we can describe the information processes which take place in hypnotism. Here, by the conscious information coupling (described by the Schrödinger equation (16)) between an individual i and a hypnotizer p, the conscious dynamics (13) is changed in such a way that the conscious force f_C(t, q_i) is practically totally eliminated by the action of the conscious force f̃_C(t, q_i, q_p). For example, let f̃_C(t, q_i, q_p) = −f_C(q_i) + f_C(q_p). Then the information behavior of i is 'ruled' by the conscious force f_C(q_p) of p.

7. Mathematical models of material and mental spaces; real and p-adic numbers

From our viewpoint, real spaces (Newton's absolute space or the spaces of general relativity) give only a particular class of information spaces. These real information spaces are characterized by a special system for the coding of information and a special distance on the space of vectors of information. Any natural number m > 1 can be chosen as the basis of the coding system. Each x ∈ [0, 1] can be presented in the form:

x = a0 a1 ... an ...,   (19)
where a_j = 0, 1, ..., m − 1, are digits. We denote the set of all sequences of the form (19) by the symbol X_m. For example, let us fix m = 10. One of the main properties of the real coding system is the identification of the form:

10...0... = 09...9...; 010...0... = 009...9...; ....   (20)

In fact, this identification is closely connected with the order structure on the real line R (and the metric related to this order structure). For each x there exist 'right' and 'left' hand side neighborhoods; there exist arbitrarily small right and left shifts. The identification (20) is connected with the description of left hand side neighborhoods.
7.1. Example 7.1

Let x = 10...0.... Then x can be approximated from the left hand side with arbitrary precision by numbers of the form y = 09...90.... The following description of right hand side neighborhoods will be very important in our further considerations.

(AS) Let x = a0 ... am .... Then the numbers (vectors of information) which are close to x from the right hand side have the form y = b0 ... bm ..., where a0 = b0, ..., am = bm for a sufficiently large m.

This nearness has a natural information (cognitive) interpretation: (AS) implies the ability to form associations for cognitive systems which use this nearness to compare vectors of information. By (AS), two communications (two ideas in a model of human thinking, Khrennikov (1997)) which have the same codes in a sufficiently large number of the first (the most important) positions of their coding sequences are identified by a comparator of a cognitive system. Numbers (vectors of information) which are close to x from the left hand side cannot be characterized in the same way (see Example 7.1, where x and y are very close but their codes differ strongly).
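A small sketch (ours) of the nearness (AS) viewed as a metric on coding sequences: the distance is m^(−k), where k is the length of the common initial segment. It contrasts the real identification (20) with the association-based comparison.

```python
def as_distance(x, y, m=10):
    """(AS)-style distance between coding sequences: m**(-k), k = length of the common prefix."""
    k = 0
    for a, b in zip(x, y):
        if a != b:
            break
        k += 1
    if k == len(x) == len(y):
        return 0.0
    return float(m) ** (-k)

# 0.1000... and 0.0999... are identified as real numbers (Eq. (20)), but their coding
# sequences differ already in the first digit, so in the (AS) sense they are far apart.
print(as_distance("1000000", "0999999"))   # 1.0
# Two codes that agree on a long initial segment are associated, i.e. (AS)-close.
print(as_distance("1000000", "1000999"))   # 0.0001 = 10**-4
```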
7.1.1. Conclusion

The system of real numbers has been created as a coding system for information which the consciousness receives from reality. The main properties of this coding system are the order structure on the set of information vectors and the restricted ability (see (AS)) to form associations.
Finally, we pay attention to the 'universal coding property' of the real system: any natural number m > 1 can be used as the basis of this system. Thus any information process can be equivalently described by using, for example, 2-bit coding or 1997-bit coding. All these properties of the real coding system were incorporated in every physical model. I do not think that all information processes (especially cognitive ones) have an order structure. On the other hand, the scale of the coding system, m > 1, may play an important role in the description of an information process. Let us 'modify' the real coding system. We eliminate the identification (20). From now on there is no order structure on the set X_m of information vectors. We consider on X_m the nearness defined by (AS)11. This nearness can be described by a metric. The corresponding (complete) metric space is isomorphic to the ring of so-called m-adic integers Z_m (see Schikhov, 1984). Therefore it is natural to use m-adic numbers for a description of information (at least cognitive) processes. Mathematically it is convenient to use prime numbers m = p > 1 (see Schikhov (1984)). We thus arrive at the domain of an extended mathematical formalism, p-adic analysis.

We now present some facts about p-adic numbers. The field of real numbers R is constructed as the completion of the field of rational numbers Q with respect to the metric ρ(x, y) = |x − y|, where |·| is the usual valuation given by the absolute value. The fields of p-adic numbers Q_p are constructed in a corresponding way, but using other valuations. For a prime number p, the p-adic valuation |·|_p is defined in the following way. First we define it for natural numbers. Every natural number n can be represented as a product of prime numbers, n = 2^{r_2} 3^{r_3} ... p^{r_p} ..., and we define |n|_p = p^{−r_p}, writing |0|_p = 0 and |−n|_p = |n|_p. We then extend the definition of the p-adic valuation |·|_p to all rational numbers by setting |n/m|_p = |n|_p / |m|_p for m ≠ 0. The completion of Q with respect to the metric ρ(x, y) = |x − y|_p is the locally compact field of p-adic numbers Q_p. The number fields R and Q_p are unique in a sense, since by Ostrovsky's theorem (see Schikhov (1984)) |·| and |·|_p are the only possible valuations on Q, but they have quite distinct properties.

11 Thus here all information is considered from the viewpoint of associations.
Unlike the absolute value |·|, the p-adic valuation satisfies the strong triangle inequality

|x + y|_p ≤ max(|x|_p, |y|_p), x, y ∈ Q_p.
Write U_r(a) = {x ∈ Q_p : |x − a|_p ≤ r} and U_r^−(a) = {x ∈ Q_p : |x − a|_p < r}, where r = p^n and n = 0, ±1, ±2, .... These are the 'closed' and 'open' balls in Q_p, while the sets S_r(a) = {x ∈ Q_p : |x − a|_p = r} are the spheres in Q_p of such radii r. These sets (balls and spheres) have a somewhat strange topological structure from the viewpoint of our usual Euclidean intuition: they are both open and closed at the same time, and as such are called clopen sets. Another interesting property of p-adic balls is that two balls have nonempty intersection if and only if one of them is contained in the other. Also, we note that any point of a p-adic ball can be chosen as its center, so such a ball is thus not uniquely characterized by its center and radius. Finally, any p-adic ball U_r(0) is an additive subgroup of Q_p, while the ball U_1(0) is also a ring, which is called the ring of p-adic integers and is denoted by Z_p. Any x ∈ Q_p has a unique canonical expansion (which converges in the |·|_p-norm) of the form

x = a_{−n}/p^n + ... + a_0 + ... + a_k p^k + ...,

where the a_j ∈ {0, 1, ..., p − 1} are the 'digits' of the p-adic expansion. The elements x ∈ Z_p have the expansion

x = a_0 + a_1 p + ... + a_k p^k + ...

and can thus be identified with the sequences of digits x = a_0 ... a_k .... The p-adic exponential function is defined by the series exp x = Σ_{n=0}^{∞} x^n/n!. The series converges in Q_p if |x|_p ≤ r_p, where

r_p = 1/p, p ≠ 2, and r_2 = 1/4.   (21)

The p-adic trigonometric functions sin x and cos x are defined by the standard power series. These series have the same radius of convergence r_p as the exponential series.
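The p-adic valuation and the strong triangle inequality can be checked directly; the following sketch (ours) implements |·|_p for rational numbers exactly as defined above.

```python
from fractions import Fraction
import random

def padic_abs(x, p):
    """|x|_p for a rational x: p**(-r), where p**r is the exact power of p dividing x."""
    x = Fraction(x)
    if x == 0:
        return 0.0
    r, num, den = 0, x.numerator, x.denominator
    while num % p == 0:
        num //= p
        r += 1
    while den % p == 0:
        den //= p
        r -= 1
    return float(p) ** (-r)

p = 5
print(padic_abs(75, p), padic_abs(Fraction(7, 25), p))   # 0.04 (=5**-2) and 25.0 (=5**2)

# Strong triangle inequality |x + y|_p <= max(|x|_p, |y|_p) on a random sample of rationals.
for _ in range(1000):
    x = Fraction(random.randint(-50, 50), random.randint(1, 50))
    y = Fraction(random.randint(-50, 50), random.randint(1, 50))
    assert padic_abs(x + y, p) <= max(padic_abs(x, p), padic_abs(y, p)) + 1e-12
print("strong triangle inequality holds on the sample")
```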
If, instead of a prime number p, we start with an arbitrary natural number m > 1, we construct the system of so-called m-adic numbers Q_m by completing Q with respect to the m-adic metric ρ_m(x, y) = |x − y|_m, which is defined in a similar way to the above.

8. Hamiltonian dynamics on p-adic mental space

The rings of p-adic integers Z_p can be used as mathematical models for mental spaces. Each element x = Σ_{j=0}^{∞} a_j p^j can be identified with a sequence x = a0 a1 ... aN ..., a_j = 0, 1, ..., p − 1. Such sequences are interpreted as coding sequences (in the alphabet A_p = {0, 1, ..., p − 1} with p letters) for some amounts of information. The p-adic metric ρ_p(x, y) = |x − y|_p on Z_p corresponds to the nearness (AS) for information sequences. We choose the space X = Z_p (or the multidimensional spaces X = Z_p^N) for the description of information. Everywhere below we shall use the abbreviation 'I' for the word information.

We use an analogue of Hamiltonian dynamics on mental spaces. As usual, we introduce the quantity p(t) = q̇(t) = dq(t)/dt, which is the information analogue of the momentum, a motivation. The space Z_p × Z_p of points z = (q, p), where q is the I-state and p is the motivation, is said to be a phase mental space. As in the ordinary Hamiltonian formalism, we assume that there exists a function H(q, p) (an I-Hamiltonian) on the phase mental space which determines the motion of τ in the phase mental space:

q̇(t) = ∂H/∂p (q(t), p(t)), q(t0) = q0,
ṗ(t) = −∂H/∂q (q(t), p(t)), p(t0) = p0.   (22)
The I-Hamiltonian H(q, p) has the meaning of an I-energy (or mental, or psychical, energy; compare Freud (1933)). In principle, I-energy is not directly connected with the usual physical energy. The simplest I-Hamiltonian, H_f(p) = a p², a ∈ Z_p, describes the motion of a free cognitive system τ, i.e. a cognitive system which uses only self-motivations for changing its I-state q(t). Here, by solving the system of Hamiltonian equations, we obtain: p(t) = p0, q(t) = q0 + 2 a p0 (t − t0). The motivation p is a constant of this motion. Thus the free cognitive system 'does not like' to change its motivation p0 in the process of its motion in the mental space. If we change coordinates, q′ = (q − q0)/k, k = 2 a p0, then we see that the dynamics of the free cognitive system coincides with the dynamics of its mental time. In the general case the I-energy is the sum of the I-energy of motivations H_f = a p² (which is an analogue of the kinetic energy) and the potential I-energy V(q):

H(q, p) = a p² + V(q).

The potential V(q) is determined by fields of information. We now consider examples which illustrate the notion of mental time.
8.1. Example 8.1, reading of a book

We consider again the example of Section 2. Let us enumerate the words in the language of the book B by 1, 2, ..., m − 1 (including the blank symbol). Denote by 0 the words which have zero information value for τ (for example, special terms which are not known to τ). The text of B can be represented as an information string: x = (a0, a1, ..., aN, ..., aM). This string can be identified with an element of Z_m by setting a_j = 0 for j ≥ M + 1: x = (a0, a1, ..., aN, ..., aM, 0, ..., 0, ...). Counts of such a mental time are given by initial blocks of x: t = (a0, a1, ..., ak, 0, 0, ...). Suppose now that the information state (knowledge) q is coded in the following way: q = (b0, b1, ..., bj, ...), b_j = 0, 1, ..., k − 1, where b0 is the dynasty, b1 is the number of wars during the dynasty b0, and so on. The symbol 0 is again used to denote zero knowledge12. The dynamics q(t), b_j = c_j(a0, ..., aN, ...), of the knowledge of τ is described by the Hamiltonian equations13.

12 In this example the use of the homogeneous k-adic tree is not so natural. It would be more natural to use a nonhomogeneous tree in which the number k_j of branches depends on the information characteristic b_j; see, for example, Fig. 2.
13 In fact, the mathematical formalism developed in this paper describes only the case m = k. To study dynamics for t ∈ Z_m, q ∈ Z_k, k ≠ m, we need a more complicated mathematical analysis.
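Example 8.1 amounts to a very simple encoding of a text into an m-adic digit string. The sketch below is ours; the vocabulary, the value of m and the sample sentence are all hypothetical.

```python
def encode_text(words, vocabulary):
    """m-adic coding sequence (a0, a1, ...) of a text: known words get their dictionary digit
    1..m-1, words of zero information value get the digit 0."""
    return [vocabulary.get(w, 0) for w in words]

vocab = {"ancient": 1, "egypt": 2, "dynasty": 3, "pharaoh": 4, "war": 5}   # toy dictionary, m = 8
text = "ancient egypt saw the pharaoh start a war".split()
digits = encode_text(text, vocab)
print(digits)                    # [1, 2, 0, 0, 4, 0, 0, 5]
# Counts of mental time are initial blocks (a0, ..., ak, 0, 0, ...) of this string:
print(digits[:3] + [0] * 5)      # the count reached after reading the first three words
```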
8.2. Example 8.2, evolution of scientific psychology

We introduce a mental time t = t_ps which is used for describing the evolution of the psychological state of a scientist τ. Let t = (a_0, a_1, …, a_N, …), where a_j = 0, 1, …, m − 1 is the number of publications of τ in journals of weight j. Journals with j = 0 are the most important, journals with j = 1 are less important, and so on. For example, let m = 10. Such a mental time has no order structure. For example, take l_1 = (2, 0, …) = 2, l_2 = (0, 8, 0, …) = 80, l_3 = (1, 1, 2, 0, …) = 211. These instances of mental time cannot be ordered according to their importance. The evolution of the psychological state q(t) of τ is described by a trajectory in the mental space (in the simplest case X_men = Z_m). If this trajectory is continuous, then τ will have similar psychological states q(t_1), q(t_2) for close instances of mental time t_1, t_2.

In the Hamiltonian framework we can consider interactions between cognitive systems τ_1, …, τ_N. These cognitive systems have mental times t_1, …, t_N and I-states q_1(t_1), …, q_N(t_N). In our model we can describe interactions between these cognitive systems only in the case in which it is possible to choose the same mental time t for all of them. In this case we can consider the evolution of the system of cognitive systems τ_1, …, τ_N as a trajectory in the mental space Z_p^N = Z_p × ... × Z_p, q(t) = (q_1(t), ..., q_N(t)). We think that the condition of consistency

t_1 = t_2 = ... = t_N = t

plays the crucial role in many psychological experiments. We cannot obtain sensible observations for interactions between arbitrary individuals. There must be a process of learning for the group τ_1, …, τ_N which reduces the mental times t_1, …, t_N to the unique mental time t.

[Fig. 2. The factorial tree Z_M for m_1 = 2, m_2 = 3, m_3 = 4, …]

Thus, let us consider a group τ_1, …, τ_N of cognitive systems with the internal time t. The dynamics of I-states and motivations is determined by the I-energy H(q, p), q ∈ Z_p^N, p ∈ Z_p^N. It is natural to assume that

H(q, p) = Σ_{j=1}^N a_j p_j² + V(q_1, ..., q_N),   a_j ∈ Z_p.   (23)

Here H_f(p) = Σ_{j=1}^N a_j p_j² is the total energy of motivations for the group τ_1, …, τ_N and V(q) is the potential energy. As usual, to find a trajectory in the phase mental space Z_p^N × Z_p^N we need to solve the system of Hamiltonian equations: q̇_j = ∂H/∂p_j, ṗ_j = −∂H/∂q_j, q_j(t_0) = q_{0j}, p_j(t_0) = p_{0j}.
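To make the 'no order structure' point of Example 8.2 concrete, here is a small sketch (ours, not from the paper) computing m-adic distances for m = 10 between the instances 2, 80 and 211 given above. The extra instance 511 is our own addition: it agrees with 211 in the publication counts for the two most important journal classes and is therefore 10-adically close to it.

```python
from fractions import Fraction

def madic_distance(x, y, m):
    """rho_m(x, y) = |x - y|_m for integers x, y and an arbitrary base m > 1."""
    d = abs(x - y)
    if d == 0:
        return Fraction(0)
    k = 0
    while d % m == 0:   # count the powers of m dividing the difference
        d //= m
        k += 1
    return Fraction(1, m ** k)

# Mental times from Example 8.2 (m = 10): 2, 80, 211; plus 511 = (1, 1, 5, 0, ...).
times = [2, 80, 211, 511]
for i, a in enumerate(times):
    for b in times[i + 1:]:
        print(a, b, madic_distance(a, b, 10))
# 2, 80, 211 are pairwise at distance 1 (they already differ in the most
# important journal class), while 211 and 511 are at distance 1/100.
```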
8.3. Remark 8.1, active information

Our ideas about information and information fields are similar to the ideas of Bohm and Hiley (1993) (see especially pp. 35–38). Like Bohm and Hiley, we do not follow 'Shannon's ideas that there is a quantitative measure of information that represents the way in which the state of a system is uncertain for us' (Bohm and Hiley, 1993). We also consider information as active information. Such information interacts with cognitive systems. As a consequence of such interactions cognitive systems produce new information. The only distinguishing feature is that material objects are not involved in our formalism. According to Bohm and Hiley, active information interacts with material objects (for example, a ship guided by radio waves). Bohm and Hiley assume that information fields have nonzero physical energy that directs other (probably very large) physical energy. However, physical energies are not involved in our model. Thus we need not assume that I-fields have some physical energy. In particular, we need not try to find (as Bohm and Hiley (1993), p. 38) an origin of such an energy.
We also remark that Bohm and Hiley (1993) discuss only quantum ψ-fields. We shall use both classical and quantum I-fields. Bohm and Hiley discussed a difference between 'active' and 'passive' information. In fact, our model supports their conclusion that 'all information is at least potentially active and that complete passivity is never more than an abstraction …' (Bohm and Hiley (1993), p. 37). If a cognitive system τ moves in a field of forces (classical or quantum) with potential V, then the information x ∈ supp V is active for τ and the information x ∈ Z_p^m \ supp V is passive for τ. Let V = V(t, x) be a time-dependent potential. Then the set of active information X(t) = supp V(t) evolves in the mental space. Thus some passive information becomes active and vice versa.

9. Inertia of information

We have considered the dynamics of cognitive systems of unit mass. There the coefficient v of proportionality between the variation dq of the I-state and the variation dt of mental time t, dq = v dt, was considered as a motivation. In the general case the motivation p need not coincide with v. Let us assume that the motivation p is proportional to v, p = mv, m ∈ Z_p. The coefficient m of proportionality is called an I-mass. We also call v an I-velocity. Thus dq = (p/m) dt.

Let τ_1 and τ_2 be two cognitive systems with I-masses m_1 and m_2 and let |m_1|_p > |m_2|_p. Let τ_1 and τ_2 have variations dt_1, dt_2 of mental time of the same p-adic magnitude, |dt_1|_p = |dt_2|_p, and let these variations generate variations dq_1 and dq_2 of their I-states of the same p-adic magnitude, |dq_1|_p = |dq_2|_p. To make such a change of the I-state, τ_1 needs a larger motivation:

|p_1|_p = |m_1|_p |dq/dt|_p > |p_2|_p = |m_2|_p |dq/dt|_p.

Thus the I-mass is a measure of the inertia of information. We define the kinetic I-energy by T = p²/(2m). A variation dt of mental time t implies also a variation dp of the motivation p: dp = f dt. The coefficient f of proportionality is called an I-force. Thus any change of the motivation is due to the action of an I-force f. If f = 0, then dp = 0 for any variation dt of t. Thus a cognitive system cannot change its motivation in the absence of I-forces. By analogy with the usual physics, we call the coefficient a of proportionality between the variation dv of the I-velocity v and the variation dt of the mental time t, dv = a dt, an I-acceleration. Thus dp = ma dt. This relation can be rewritten in the form of an information analogue of the second Newton law:

ma = f   or   ṗ = f.   (24)

An I-force f is said to be a potential force if there exists a function V(q) such that

f = −∂V/∂q,

where V is called the potential, or potential energy. The total I-energy H is defined as the sum of the kinetic and the potential I-energies,

H(q, p) = p²/(2m) + V(q).

The Hamiltonian equation ṗ = −∂H/∂q coincides with the Newton equation ṗ = f.

In p-adic analysis the condition f′ ≡ 0 does not imply that a differentiable function f is a constant; see Schikhof (1984) or Escassut (1995). There exist complicated continuous motions (q(t), p(t)) in the I-phase space for cognitive systems with zero I-energy (q̇ ≡ 0 or ṗ ≡ 0). In psychological models these motions can be interpreted as motions without any motivation. Such motions do not need an information force. On the other hand, we can consider an I-potential V(q) such that ∂V/∂q ≡ 0. Here the potential I-energy V(q) can have complicated behavior on the mental space X = Z_p, while at the same time the I-force f ≡ 0. Thus there may exist I-fields which do not induce any I-force.
All mathematical pathologies can be eliminated by the consideration of analytic functions. If f′ ≡ 0 and f is analytic, then f = constant.¹⁴
10. p-adic model for the conscious (quantum) mechanics

It is quite natural to quantize classical mechanics on information spaces over Z_p. We give the following reasons for such a quantization. Observations of I-quantities are statistical observations. We have to study statistical ensembles of cognitive systems (instead of studying an individual cognitive system). Such statistical ensembles are described by quantum states φ. As usual in the quantum formalism, we can assume that a value λ of an I-quantity A can be measured in the state φ with some probability P_φ(A = λ). This ideology is nothing other than the application of the statistical (ensemble) interpretation of quantum mechanics (see, for example, Ballentine (1989)) to information theory. By this interpretation any measurement process has two steps: (1) a preparation procedure E; (2) a measurement of a quantity B in the states φ which were prepared with the aid of E. Let us consider these steps in the information framework. By E we have to select a statistical ensemble φ of cognitive systems on the basis of some I-characteristics. Typically in quantum physics a preparation procedure E is realized as a filter based on some physical quantity A, i.e. we select elements which satisfy the condition A = m, where m is one of the values of A.

¹⁴ In psychological models we can interpret analytic trajectories in the phase mental space as 'normal behavior', i.e. an individual needs a motivation for the change of a psychological state. Here we can observe some psychological (information) force that induces this change. There is a psychological (information) field that generates this force. Trajectories (non-analytic) with zero motivation are interpreted as abnormal psychological behavior (probably such trajectories correspond to mental diseases; on the other hand, they may explain anomalous phenomena). Here an individual changes his psychological state without any motivation, in the absence of any psychological (information) force. Here, in fact, a p-adic generalization of the Hamiltonian formalism does not work. We need to propose a new physical formalism to describe such phenomena.
We can do the same in quantum I-theory. An I-quantity A is chosen as a filter, i.e. cognitive systems for the statistical ensemble φ are selected by the condition A = m, where m ∈ Z_p is some information. For example, we can choose A = p, the motivation, and select a statistical ensemble φ = φ(p = m) of cognitive systems which have the same motivation m ∈ Z_p. Then we realize the second step of a measurement process and measure some information quantity B in the state φ(p = m). For example, we can measure the I-state q of cognitive systems belonging to the statistical ensemble described by φ(p = m). We shall obtain a probability distribution P(q = l | p = m), l, m ∈ Z_p (the probability that a cognitive system has the I-state q = l under the condition that it has the motivation p = m). It is also possible to measure the I-energy E of cognitive systems. We shall obtain a probability distribution P(E = l | p = m), l, m ∈ Z_p. On the other hand, we can prepare a statistical ensemble φ(q = m) by fixing some information m ∈ Z_p and selecting all cognitive systems which have the I-state q = m. Then we can measure the motivations of these cognitive systems and we shall obtain a probability distribution P(p = l | q = m).

Another possibility is to use a generalization of the individual interpretation of quantum mechanics. By this interpretation a wave function ψ(x), x ∈ R^n, describes the state of an individual quantum particle. In the same way we may assume that a wave function ψ(x), x ∈ Z_p^n, on the mental space describes the state of an individual cognitive system τ.¹⁵

In fact, a mathematical model for the quantum I-formalism has already been constructed. This is quantum mechanics with p-adic valued functions; see Khrennikov (1994, 1997) and Albeverio and Khrennikov (1998). We present this model briefly. The space of quantum states is realized as a p-adic Hilbert space K (see Khrennikov (1994, 1997) and Albeverio and Khrennikov (1998) for the theory of such spaces).

¹⁵ The problem of interpretations is an important problem of ordinary quantum mechanics on real space. The same problem arises immediately in our quantum I-theory. We do not want to start our investigation with a hard discussion on the right interpretation. We can be quite pragmatic and use both interpretations according to our convenience.
This is a Q_p-linear space which is a Banach space (with the norm ‖·‖) on which a symmetric bilinear form (·, ·): K × K → Q_p is defined. This form is called an inner product on K. It is assumed that the norm and the inner product are connected by the Cauchy–Bunyakovsky–Schwarz inequality: |(x, y)|_p ≤ ‖x‖ ‖y‖, x, y ∈ K. By definition a quantum I-state φ is an element of K such that (φ, φ) = 1; a quantum I-quantity A is a symmetric bounded operator A: K → K, i.e. (Ax, y) = (x, Ay), x, y ∈ K.¹⁶ We discuss a statistical interpretation of quantum states in the case of a discrete spectrum of A. Let {λ_1, …, λ_n, …}, λ_j ∈ Z_p, be the eigenvalues of A, Aφ_n = λ_n φ_n, φ_n ∈ K, (φ_n, φ_n) = 1. The eigenstates φ_n of A are considered as pure quantum I-states for A, i.e. if the system of cognitive systems is described by the state φ_n then the I-quantity A has the value λ_n ∈ Z_p with probability 1. Let us consider a mixed state
φ = Σ_{n=1}^∞ q_n φ_n,   q_n ∈ Q_p,   (25)

where (φ, φ) = Σ_{n=1}^∞ q_n² = 1.¹⁷ By the statistical interpretation of φ, if we realize a measurement of the I-quantity A for cognitive systems belonging to the statistical ensemble described by φ, then we obtain the value λ_n with probability P(A = λ_n | φ) = q_n². The main problem (or the advantage?) of this quantum model is that these probabilities belong to the field of p-adic numbers Q_p. The simplest way is to eliminate this problem by considering only finite mixtures (25) for which q_n ∈ Q (the field of rational numbers Q is a subfield of Q_p). In this case the quantities P(A = λ_n | φ) = q_n² can be interpreted as usual probabilities (Kolmogorov (1956)). Therefore we may assume that there exist (can be prepared) quantum I-states φ which have the standard statistical interpretation: when the number N of experiments tends to infinity, the frequency ν_N(A = λ_n | φ) of an observation of the information λ_n ∈ Z_p tends to the probability q_n².
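A sketch (with invented coefficients and eigenvalues) of the finite-mixture case q_n ∈ Q described above: the squares q_n² are ordinary probabilities, and simulated relative frequencies stabilize to them in the usual way.

```python
import random
from fractions import Fraction

# Invented mixed state phi = sum_n q_n * phi_n with rational q_n (cf. Eq. (25)).
q = [Fraction(3, 5), Fraction(4, 5)]           # q_1, q_2
assert sum(c * c for c in q) == 1              # (phi, phi) = sum q_n^2 = 1

probs = [float(c * c) for c in q]              # P(A = lambda_n | phi) = q_n^2
eigenvalues = [7, 12]                          # invented I-values lambda_n

# Standard statistical interpretation: relative frequencies tend to q_n^2.
random.seed(0)
N = 100_000
counts = {lam: 0 for lam in eigenvalues}
for _ in range(N):
    lam = random.choices(eigenvalues, weights=probs)[0]
    counts[lam] += 1
for lam, pr in zip(eigenvalues, probs):
    print(lam, pr, counts[lam] / N)
```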
¹⁶ In p-adic models we do not need to consider unbounded operators, because all quantum quantities can be realized by bounded operators (see Albeverio and Khrennikov, 1998).
¹⁷ As in the usual theory of Hilbert spaces, eigenvectors corresponding to different eigenvalues of a symmetric operator are orthogonal.
However, we can take a more general viewpoint on this problem. In the book Khrennikov (1994) a (non-Kolmogorov) probability model with p-adic probabilities was developed. If we use a p-adic generalization of the frequency approach to probability (see von Mises, 1957), then p-adic probabilities are defined as limits of relative frequencies ν_N with respect to the p-adic topology. By using the p-adic frequency probability model for the statistical interpretation of quantum I-states, we may assume that there exist I-states φ (ensembles of cognitive systems) such that the relative frequencies ν_N(A = λ_n | φ) have no limit in R, i.e. we cannot apply the standard law of large numbers in this situation. Hence, if we realize measurements of an I-quantity A for such a quantum I-state and study the observed data by using standard statistical methods (based on real analysis), then we shall not obtain a definite result. There will be only random fluctuations of relative frequencies; see Khrennikov (1994).¹⁸

The evolution of a p-adic wave function is described by an I-analogue of the Schrödinger equation:

(h_p / i) ∂ψ/∂t (t, x) = (h_p² / 2m) ∂²ψ/∂x² (t, x) − V(t, x) ψ(t, x),   (26)

where m is the I-mass of a quantum cognitive system. Here the constant h_p plays the role of the Planck constant. For purely mathematical reasons (related to the convergence of p-adic exponential and trigonometric series) it is convenient to choose h_p = 1/p. We may also present some physical arguments for such a choice. In ordinary quantum mechanics the Planck constant is related to the measure of discretization. The constant h_p = 1/p is related to the level of discretization of information.
¹⁸ Such behavior can be related to psychological experiments. Here the possibility of using p-adic probability models has an important consequence for scientists doing experiments with statistical I-data: the absence of statistical stabilization (random fluctuation of frequencies) does not imply the absence of an I-phenomenon. This statistical behavior may mean that the I-phenomenon cannot be described by the standard Kolmogorov probability model.
The use of i implies the consideration of the extension Q_p(i) = Q_p × iQ_p of Q_p. Elements of this extension have the form z = a + ib, a, b ∈ Q_p. This extension is well defined for p ≡ 3 (mod 4). As usual, we introduce the conjugation z̄ = a − ib; here we have z z̄ = a² + b². In what follows we assume that wave functions take values in Z_p(i) = Z_p × iZ_p.
10.1. Example 10.1, a free conscious system

Let the potential V ≡ 0. Then the solution of the Schrödinger equation corresponding to the I-energy E = p²/2m has the form¹⁹:

ψ_p(t, x) = e^{i(px − Et)/h_p}.   (27)

By the choice h_p = 1/p this function is well defined for all x ∈ Z_p and t ∈ Z_p. As ψψ̄ ≡ 1, this wave function describes the uniform (p-adic probability) distribution (see Khrennikov, 1994) on the ring of p-adic integers Z_p. Thus a cognitive system τ in the state ψ can be observed with equal probability in any state x ∈ Z_p. In this sense the behavior of a free cognitive system is similar to the behavior of the ordinary free quantum particle. On the other hand, there is no analogue of oscillations:

ψ_p(t, x) = cos((px − Et)/h_p) + i sin((px − Et)/h_p),

and |cos((px − Et)/h_p)|_p = 1, |sin((px − Et)/h_p)|_p = |(px − Et)/h_p|_p. We will consider this example again in Section 12.
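The absence of oscillations can be checked term by term. With h_p = 1/p the argument z = (px − Et)/h_p = p(px − Et) satisfies |z|_p ≤ 1/p for x, t ∈ Z_p, so the p-adic series for cos and sin converge and the first term of each series dominates all later ones. The sketch below (our own helpers, using Legendre's formula for v_p(k!)) computes the p-adic valuation of the k-th term z^k/k!.

```python
def vp_factorial(n, p):
    """Legendre's formula: v_p(n!) = sum_{i>=1} floor(n / p^i)."""
    v, q = 0, p
    while q <= n:
        v += n // q
        q *= p
    return v

def term_valuation(k, vz, p):
    """p-adic valuation of z^k / k! when v_p(z) = vz (i.e. |z|_p = p**(-vz))."""
    return k * vz - vp_factorial(k, p)

p, vz = 3, 1                     # |z|_p = 1/p, as for z = p*(px - Et), x, t in Z_p
cos_vals = [term_valuation(k, vz, p) for k in range(0, 12, 2)]   # even powers
sin_vals = [term_valuation(k, vz, p) for k in range(1, 12, 2)]   # odd powers
print(cos_vals)   # minimum (= 0) only at k = 0  ->  |cos z|_p = 1
print(sin_vals)   # minimum (= 1) only at k = 1  ->  |sin z|_p = |z|_p
```

Since the valuations are strictly minimized by the first term of each series, the ultrametric inequality gives |cos z|_p = 1 and |sin z|_p = |z|_p, as stated above.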
11. The p-adic pilot wave theory for cognitive systems

Let us consider a system of N cognitive systems τ_1, …, τ_N, with the Hamiltonian

H = Σ_k p_k²/(2m_k) + Σ_{k>i} V_{ki}(x_k − x_i).

The wave function ψ(t, x), x = (x_1, …, x_N), x_k ∈ Z_p^m (where m is the dimension of the mental space which
¹⁹ We note that formal expressions for analytic solutions of p-adic differential equations coincide with the corresponding expressions in the real case (in fact, we can consider these equations over an arbitrary number field; see Khrennikov, 1994). However, the behaviors of these solutions are different.
is used for the description of the I-state of τ_j) evolves according to the Schrödinger equation (Eq. (26)). A purely mathematical consequence of this is that

∂ρ/∂t (t, x) + Σ_k ∂j_k/∂x_k (x, t) = 0,   (28)

where ρ(t, x) = ψ(t, x)ψ̄(t, x) is a probability density on the configuration mental space Z_p^{mN} and

j_k(x, t) = m_k⁻¹ Im( ψ̄(t, x) ∂ψ/∂x_k (t, x) ).

As in the ordinary Bohmian formalism, we assume that a quantum cognitive system τ_k has at any mental time²⁰ a well defined I-state x_k and motivation p_k. The I-state x_k evolves according to

ẋ_k(t) = j_k(t, x) / ρ(t, x).   (29)
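For orientation, here is a sketch of the guiding rule (29) in the familiar real-number analogue (the paper's formalism is p-adic; this sketch, with invented parameters, only illustrates the shape of the recipe ρ = ψψ̄, j = m⁻¹ Im(ψ̄ ∂ψ/∂x), ẋ = j/ρ for a toy superposition of two plane waves).

```python
import numpy as np

# Real-number analogue of Eqs. (28)-(29): guidance velocity j/rho for a toy
# superposition of two plane waves. All parameters are invented.
m, k1, k2 = 1.0, 1.0, 3.0

def psi(x):
    return np.exp(1j * k1 * x) + 0.5 * np.exp(1j * k2 * x)

def velocity(x, h=1e-6):
    """x_dot = j / rho with j = Im(conj(psi) * dpsi/dx) / m and rho = |psi|^2."""
    dpsi = (psi(x + h) - psi(x - h)) / (2 * h)     # numerical derivative of psi
    rho = np.abs(psi(x)) ** 2
    j = np.imag(np.conj(psi(x)) * dpsi) / m
    return j / rho

xs = np.linspace(0.0, 2 * np.pi, 5)
print(velocity(xs))   # guidance velocity at a few sample points
```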
This model gives a natural description of the evolution of the I-states of a system of cognitive systems τ_1, …, τ_N. We now consider a system S of brains τ_1, …, τ_N. The wave function ψ(t, x) of S depends on the I-states of all brains in S. Thus the motions of these brains in the phase mental space Z_p^{2m} are not independent. At the same time there might be no classical potential V which induces such a dependence. Of course, if (as in the ordinary real formalism) ψ(t, x) = Π_{j=1}^N ψ_j(t, x_j), then the I-motions of the brains τ_j are independent. There are no correlations between the consciousness of different brains.
11.1. Remark 11.1, non-locality

This is a good place to discuss the problem of non-locality of the pilot wave formalism. Non-locality is often considered as one of the main difficulties of the pilot wave formalism. However, non-locality is not a difficulty in our pilot I-wave formalism. This is non-locality in the mental space. Such non-locality can be natural for some I-systems. For cognitive systems, I-non-locality

²⁰ Of course, we assume that the mental times t_1, …, t_N of the cognitive systems τ_1, …, τ_N satisfy the condition of consistency (Eq. (23)).
means that ideas which are separated in the p-adic space can be correlated. However, p-adic separation means only that there are no strong associations between ideas or groups of ideas. But this absence of associations does not imply that these ideas cannot interact.

By our model each human society S has a wave function ψ(t, x). The same considerations can be applied to animals and plants. The only difference is probably that here quantum I-potentials are not so strong. Thus we arrive at the conclusion that there may exist a wave function ψ_liv(t, x) of all living organisms. The wave function ψ_liv(t, x) can be represented in the form:

ψ_liv(t, x) = Σ_f ψ_f(t, x),   (30)
where ψ_f(t, x) is the wave function of the living form f. An observable F (a living form) can be realized as a symmetric operator in a p-adic Hilbert space, Fψ_f = f ψ_f, where f ∈ Z_p is the code of the living form f in the alphabet {0, 1, …, p − 1}. By Eq. (29) the evolution of a fixed form f_0 depends on the evolutions of all living forms f. The process of the evolution of living forms is thus not just a process based on Darwin's natural selection. It is a process of quantum I-evolution in which the conscious field of all living forms plays an important role. This model might be used to explain some phenomena that could not be explained by Darwin's theory. For example, the beauty of the colors of animals, insects and fishes could not be a consequence only of the process of natural evolution. It is a consequence of the structure of the conscious field ψ_liv(t, x).²¹ For the same reasons we can explain some aspects of the relations between predators and prey. It seems that in nature there is a well organized system which gives predators the possibility to eat their prey. This system is nothing else than a result of the evolution due to Eq. (29).

²¹ Of course, at the moment we cannot find such a function ψ_liv(t, x) which induces the real distribution of colors. In any case our model implies the existence of such a field. Therefore, the process of color evolution is a process of the simultaneous evolution of the colors of numerous living forms. These colors do not serve only the 'convenience' of concrete forms (as they should according to Darwin's theory), but were produced by the correlated evolution of all living forms.
The field ψ(t, x) can be considered as a new cognitive system, T (compare with Bohm and Hiley, 1993). T gets information from the cognitive systems τ_1, …, τ_N (via the classical field V) and T changes the I-states of τ_1, …, τ_N (via Eq. (29)). The only distinguishing feature is that the I-state of T cannot be identified with a point in the mental space Z_p^m for a finite m. However, if we extend the I-formalism by using infinite-dimensional mental spaces over Z_p, then T can be considered as a cognitive system. Thus each cognitive system, or a group of cognitive systems, induces a new cognitive system T (the consciousness) which evolves in an infinite-dimensional mental space. In principle, T may induce a new field F(t, ψ(·)). This field determines a quantum potential for T. The field F(t, ψ(·)) can again be considered as a cognitive system T¹ (which evolves in an infinite-dimensional mental space). It is the consciousness of the consciousness T. If such a construction can be repeated many (or infinitely many?) times, then there is a 'conscious tower' T, T¹, …, Tⁿ, …. We might speculate that the motion of a cognitive system in mental space is determined by the hierarchic conscious system T, T¹, …, Tⁿ, …. Of course, the effect of T¹ is not present in the linear Schrödinger equation. If we accept the hypothesis of the hierarchic conscious structure, then the linear Schrödinger equation has to be changed to a nonlinear equation. Hence the cognitive considerations support de Broglie's ideas about nonlinear perturbations of Schrödinger's equation (see De Broglie, 1964).
12. Conscious field as memory activation field

The reader may ask: 'Why do we need to use an I-analogue of quantum mechanics to describe conscious phenomena? Why is it not sufficient to use only the classical I-dynamics (with Newton's I-equation (4))?' There are two reasons to introduce a conscious field ψ(t, q):

(D) Our conscious experience (see Section 2.1) demonstrates that the I-motion of a conscious cognitive system τ_cons cannot be described by the classical Newton I-equation (4). Classical I-forces do not provide the right balance of 'guiding' forces
for the I-state q(t) of τ_cons. One possibility to perturb the classical I-equation (4) of I-motion is to use an analogy with material objects (which can be considered as a particular class of transformers of information; see Khrennikov 1997, 1999) and introduce a new (conscious) force via the ψ-function which satisfies the Schrödinger I-equation (Eq. (26)). Of course, such an approach would be strongly improved if we could explain the mechanism of generation of the conscious ψ-field by τ_cons. We note that such a mechanism is still unknown in the ordinary pilot wave theory for material objects; see Bohm and Hiley, 1993.

(S) It is impossible to perform a measurement of the I-state q, the motivation p, or some other I-quantity of a conscious system τ_cons without disturbing this system.²² In particular, we cannot perform a measurement of both q and p for the same τ_cons. As in ordinary quantum mechanics, we have to use large statistical ensembles S of conscious systems and perform statistical measurements on such ensembles to find probabilities for realizations of I-quantities. One possibility is to use again an analogy with the ordinary quantum formalism and introduce a field of probabilities ψ(t, q) (which describes statistical properties of the ensemble S) such that the square of its amplitude, |ψ(t, q)|² = ψ(t, q)ψ̄(t, q), gives the probability of finding τ_cons ∈ S in the I-state q (at the instant t of mental time). One of the main features of the pilot wave theory for material objects is that the pilot wave field coincides with the probability field: ψ(t, x) = ψ_pilot(t, x) = ψ_prob(t, x) (in fact, there is no clear explanation of such a coincidence). We also postulate that the conscious ψ-field (which generates the conscious force f_C) coincides with the probability field. Let τ_cons be a conscious system.

²² The process of an I-measurement is an interaction of τ_cons with some I-potential V(t, q). Any τ_cons is extremely informationally unstable. Even weak information potentials V(t, q) disturb τ_cons. Such an I-instability is a common feature of conscious and quantum systems.
A performance of the I-state q = q(t) of τ_cons can be viewed as a performance of an image on some screen S. This screen continuously displays the ideas of τ_cons. Thus τ_cons is a self-observer of these ideas. As we have already noted, our conscious experience says that a new I-state q(t + Δt) (an 'image on S' at the instant t + Δt) is generated not only on the basis of the previous I-state q(t) (an 'image on S' at the instant t) with the aid of I-forces f(t, q) which are generated by external I-potentials. The main feature of τ_cons is that q(t + Δt) depends on all information which is collected in τ_cons.²³ Thus Newton's I-equation (Eq. (4)) must be modified to describe such a unity of information in the process of the I-evolution of τ_cons.

Denote by D(t) = D_{τ_cons}(t) the domain in X_men corresponding to the information which is contained in the memory of τ_cons at the moment t. Then q(t + Δt) = F(q(t), D(t)). The main problem is to find a transformation F that can provide an adequate description of the conscious evolution of the I-state. We propose the following model. For each idea x ∈ D(t), we define an I-quantity ψ(t, x) which describes the I-activity of the idea x. In this way we introduce a new I-field ψ(t, x), x ∈ D(t), a field of memory activation. It is postulated that the field of memory activation gives the pilot wave ψ(t, x), the conscious field, which guides the I-state q(t) of τ_cons. Let t and t′ be two different instances of time. The Schrödinger I-equation (Eq. (26)) implies that ψ(t′, x) can be found as the result of integration of ψ(t, x) over the memory domain D(t):

ψ(t′, x) = ∫_{D(t)} K(t′, t, x, y) ψ(t, y) dy,   (31)

where the kernel K(t′, t, x, y) describes the time propagation of memory activation; K(t′, t, x, y) describes the influence of an idea y ∈ D(t) on the idea x ∈ D(t′). One of the main consequences of the quantum I-formalism is that memory activation fields at different instances of time are connected by a linear transformation (a discrete sketch is given below).

²³ Hiley and Pylkkänen (1997) call such a property 'the unity of consciousness' (see also Searle, 1992).
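A toy discretization of Eq. (31), entirely our own construction: finitely many ideas in D(t), the kernel K as a complex matrix, and the integral as a matrix–vector product, which makes the linearity of the propagation explicit.

```python
import numpy as np

# Discrete toy version of Eq. (31): psi(t', x) = sum_y K(t', t, x, y) * psi(t, y).
# Five "ideas" in the memory domain D(t); kernel entries are invented.
rng = np.random.default_rng(1)
n_ideas = 5
K = rng.normal(size=(n_ideas, n_ideas)) + 1j * rng.normal(size=(n_ideas, n_ideas))
psi_t = rng.normal(size=n_ideas) + 1j * rng.normal(size=n_ideas)   # activation at t

psi_t_next = K @ psi_t        # activation at t': a linear transformation
print(np.round(psi_t_next, 3))

# Linearity check: propagating a superposition equals the superposition
# of the separately propagated fields.
phi = rng.normal(size=n_ideas) + 1j * rng.normal(size=n_ideas)
a, b = 2.0, -1.5j
print(np.allclose(K @ (a * psi_t + b * phi), a * (K @ psi_t) + b * (K @ phi)))
```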
The Bohmian mechanics implies that the quantum force f_Q does not depend on the amplitude R of the quantum field ψ. Roughly speaking, the strength of f_Q depends on the variation of R: f_Q = −Q′, where the quantum potential Q is defined as Q = −R″/R. For example, if R ≡ Const, then f_Q = 0; if R(x) = x (rather slow variation), then f_Q is still equal to 0; if R(x) = x², then f_Q ≠ 0.²⁴

The differential calculus on the mental space X_men = Z_p, where p > 1 is a prime number, reproduces the same analytic properties (in particular, the form dependence) for conscious potentials C(x). Moreover, we provide an explanation of this form (in fact, variation) dependence. In our model the pilot information ψ-function is interpreted as a memory activation field. Our conscious experience says that a uniformly high activation of all ideas in the memory of τ_cons cannot imply conscious behavior. Uniform activation (C(x) = Const, x ∈ D_{τ_cons}) eliminates the memory effect from the evolution of the I-state q(t) of τ_cons altogether. Only a rather strong variation of the field C(x), x ∈ D_{τ_cons}, of memory activation produces a conscious perturbation of the I-motion of τ_cons.
12.0.1. Conclusion

The variation dependence of the conscious (quantum) force f_C on the information pilot wave ψ is a consequence of the fact that ψ is nothing other than a field of memory activation of a conscious system.

Of course, the information pilot wave theory predicts essentially more than we can extract from our vague conscious experience. In fact (as in the Bohmian mechanics), f_C = (C‴C − C″C′)/C². Thus the conscious force f_C depends not only on the first variation dC of the amplitude of the memory activation field, but also on the second and third variations, d²C, d³C.
24 We remark that the ordinary pilot wave theory could not provide a reasonable explanation of such a connection between the field and force. In fact, D. Bohm and B. Hiley understood that this is due to the information nature of the pilot wave. However, they still tried to reduce such a new field to some ordinary physical field.
These predictions must be verified experimentally.
12.1. Remark 12.1, neurophysiologic links

In spatial neurophysiologic models of memory, spatially extended groups of neurons represent human images and ideas. Thus it is possible (at least in principle) to construct a transformation J_s: B_s → B_i, where B_s ⊂ X_mat = R³ is the spatial domain of a 'physical brain' and B_i ⊂ X_men = Z_k^m is the I-domain of an 'information brain'. Thus the memory activation field ψ(x), x ∈ B_i, can be represented as an I-field on the spatial domain B_s: φ(u) = ψ(J_s(u)). We now consider the simplest model with 0/1-coding (nonactivated/activated) of the activation of ideas x in the memory. Roughly speaking, the information pilot wave theory predicts that only large spatial variations of φ(u) can imply conscious behavior. Of course, this is a vague application of our (information) pilot wave formalism. The transformation J_s can disturb the smooth structure of the mental space Z_k^m, and variations with respect to a neuron's location u ∈ B_s ⊂ R³ may have no links to variations with respect to an idea's location x ∈ B_i ⊂ Z_k^m. Moreover, purely mathematical experience says that this is probably the case. The most natural transformations a: Z_m → R have images of fractal type (Vladimirov et al., 1994); see the sketch below. This can be considered as an argument in favor of the frequency domain model (Hoppensteadt, 1997), in which dust-like configurations of neurons in B_s (oscillating with the same frequency) seem to correspond to the same idea (image). In the latter model it is possible to construct a transformation J_f: B_f → B_i, where B_f ⊂ R³ is the domain of a 'frequency brain'. Thus the field ψ(x), x ∈ B_i, can be represented as an I-field on the frequency domain B_f: φ(ν) = ψ(J_f(ν)). The above experimental predictions can also be tested for variations of φ(ν).

We now discuss briefly the connection between the memory activation field and the probability field.
Let us consider a large statistical ensemble S of conscious systems τ having (at least approximately) the same memory activation field ψ(x). Suppose that the initial I-states q⁰_τ of τ ∈ S are uniformly distributed. After some period ΔT of the conscious evolution via (Eq. (8)), where f_C is defined by the stationary field ψ(x), the probability P(q_τ(t) = q) of finding τ ∈ S in the I-state q (at the instant t) is equal to C²(q). Thus the memory activation field coincides with the probability field due to the stationary (statistical) stabilization of the information states of conscious systems τ ∈ S. Intervals ΔT of such stabilization to a stationary distribution are relatively small. Therefore only stationary distributions are observed.
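Remark 12.1 above notes that the most natural maps from Z_m into R have images of fractal type. A standard illustration of this (our own choice of map, not taken from the paper): sending the binary digits of a 2-adic integer to the ternary digits 0/2 maps Z_2 continuously and injectively onto the middle-thirds Cantor set, so 2-adically close points land close together in [0, 1], while the image itself is full of gaps.

```python
def cantor_image(x, N=20):
    """Map a 2-adic integer (given by its first N binary digits) into [0, 1]:
    digit a_j -> 2*a_j at the j-th ternary place; the image is the Cantor set."""
    x %= 2 ** N
    value, scale = 0.0, 1.0 / 3.0
    for _ in range(N):
        value += 2 * (x % 2) * scale
        x //= 2
        scale /= 3.0
    return value

# 2-adically close integers (many shared leading digits) land close in [0, 1],
# but no image point ever falls in a removed middle third.
for x in [0, 1, 2, 3, 2 ** 10, 2 ** 10 + 1]:
    print(x, cantor_image(x))
```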
12.2. Remark 12.2, a quantum particle as a complex I-system

The Bohmian mechanics cannot explain the origin of the pilot wave field. We can try to use an analogy between conscious and quantum systems to clarify this point. Let us consider (following Bohm and Hiley, 1993) a quantum particle s as a complex I-system. Suppose that such a system has a kind of memory. This memory contains not only information on the contemporary 'properties' of s, but also information based on the previous experience of s. If we use the anthropological principle and apply the conscious I-model to s, then the Bohmian pilot wave is nothing else than a field of memory activation of s. The I-state q(t) of s is given by the coordinates of the location of s in R³. Thus s is guided not only by classical potentials (which can be considered as just a particular class of classical I-potentials), but also by the potential of the memory activation field of s. As a consequence of the statistical stabilization to a stationary distribution, the memory activation field coincides with the field of probabilities. In fact, such an approach unifies the Bohmian and Copenhagen interpretations of quantum mechanics. By the Copenhagen interpretation s has no definite position before a position measurement; s is in a 'superposition' of different positions. By the Bohmian theory s has a definite position at each instant of time. In our model the 'superposition' of positions is nothing else than
the memory of these positions; the 'real' position of s is the I-state of s.
12.3. Example 12.1, a free conscious system

Let us consider the memory activation field of a free conscious system τ_cons; see Section 10.1.²⁵ Suppose that the memory of τ_cons is activated by some concrete motivation p = a (which is not mixed with other motivations). Then ψ_a(x) = e^{iax/h_p}, where h_p = 1/p and p > 1 is a prime number (the basis of the coding system of τ_cons). Such a field can be called a motivation wave. This motivation wave propagates via the I-Schrödinger equation: ψ_a(t, x) = exp{i(ax − E_a t)/h_p}, where E_a = a²/2m is the information ('psychical') energy of the motivation p = a. Here S(t, x) = ax − E_a t and C(t, x) ≡ 1. Thus the conscious force f_C ≡ 0.²⁶

Suppose now that two different motivations, p = a and p = b, activate the memory of τ_cons. The corresponding motivation waves are ψ_a(x) = e^{iax/h_p} and ψ_b(x) = e^{ibx/h_p}. Suppose that these waves have amplitudes d_a, d_b ∈ Q_p. Suppose also that there exists a phase shift θ between these two waves of motivations in the memory of τ_cons. One consequence of the quantum I-formalism is that the total memory activation field ψ(t, x) is a linear combination of these motivation waves: ψ(x) = d_a e^{iθ/h_p} ψ_a(x) + d_b ψ_b(x). The presence of the phase θ implies that the motivation p = a started to activate the memory earlier than the motivation p = b (at the instant s = −θ/E_a). If the motivation p = a has a small I-energy, namely |E_a|_p << 1, then a nontrivial phase shift θ can be obtained for a rather large time shift s. The I-motion in the presence of two different ('competitive') motivation waves in the memory of τ_cons is quite complicated. It is
²⁵ Such a τ_cons is an extremely idealized conscious system. A conscious system could not be totally isolated from external I-fields. In any case τ_cons must continuously receive information on the physiological processes in his body.
²⁶ As we have already noticed, the uniformly strong activation of all ideas in the memory of a conscious system implies unconscious behavior.
guided by a nontrivial conscious force f_C(t, x). We omit the rather complicated mathematical expression, which formally coincides with the standard one (Holland, 1993). As the memory activation field coincides with the field of probabilities, the probabilities of observing the motivations p = a and p = b are equal to d_a² and d_b². As already mentioned, in general these are not rational numbers. Thus in general these probabilities cannot be interpreted as ordinary limits of relative frequencies. There might be violations of the law of large numbers (the stabilization of frequencies) in measurements on conscious systems (see Khrennikov, 1999, for the details). The complexity of the I-motion increases essentially if ψ is determined by k ≥ 3 different motivations. Finally we remark that (in contrast to the Bohmian mechanics) the waves ψ_a and ψ_b, a ≠ b, are not orthogonal: the covariation ⟨ψ_a, ψ_b⟩ ≠ 0. Thus all motivation waves in a conscious system are correlated.
12.4. Example 12.2, conscious evolution of complex biosystems

Let τ be a biosystem having a high I-complexity and let l_1, …, l_M be different living forms belonging to τ. We consider the biosystem τ as an I-object (a transformer of information) with the I-state q_τ(t). It is supposed that τ has a kind of collective memory. Let ψ_τ(t, x) be the memory activation field of τ. The I-dynamics of τ depends not only on 'classical' information fields, but also on the activation of the collective memory of τ. Suppose that τ can be considered (at least approximately) as an I-isolated biosystem. Each living form l_j has a motivation a_j (to change the total information state q_τ). The pilot I-formalism implies that the total motivation p_τ of the biosystem τ cannot be obtained via the summation of the motivations a_j. The mechanism of
27 For some biosystems, amplitudes dj, j= 1, … , M, can be chosen as sizes of populations of lj, j=1, … , M.
generation of p_τ is more complicated. Each a_j activates in the collective memory of τ the motivation wave ψ_{a_j}. A superposition of these waves gives the memory activation field ('conscious field') of τ:

ψ_τ(t, x) = Σ_j d_j e^{iθ_j/h_p} ψ_{a_j}(t, x)
(compare with Eq. (30)).²⁷ The field ψ_τ(t, x) induces a rather complicated conscious potential C_τ(t, x) which guides the motivation p_τ and the I-state q_τ of τ.

Finally we discuss the correspondence between states of the brain and states of mind. The thesis that to every state of the brain there corresponds a unique state of mind is often called the materialistic axiom of cognitive science; see Bergson, 1919. In fact, this axiom is the basis of modern neurophysiologic investigations. However, there are some reasons to suppose that the brain does not determine the content of the mind (see Hautamäki, 1997, for an extended analysis of this question). Here we refer only to Putnam's theory of meanings. According to Putnam, 1988, 'meanings are not in the head'; they are rather in the world, and reference is a social phenomenon. We shall prove that in our mathematical model for mental processes the materialistic axiom is violated.

Let us consider again a free conscious system τ_cons. It will be shown that motivations of τ_cons cannot be identified with waves of memory activation. Suppose that there exists a fixed motivation p = a which activates the memory of τ_cons. We know that the field ψ_a(x) = e^{iax/h_p} is an eigenfunction of the motivation operator p̂. Thus, for an ensemble S_a of free systems with the same memory activation field ψ_a(x), observations of the motivation will give the value p = a with probability 1. Let l(x) and u(x) be arbitrary differentiable functions, Z_p → Z_p, having zero derivatives. Set ψ_a^{l,u}(x) = R(x) e^{iS(x)/h_p}, where R(x) = e^{l(x)} and S(x) = ax − u(x). The function ψ_a^{l,u}(x) is also an eigenfunction of the motivation operator. Thus observations of the motivation for an ensemble S_a^{l,u} of conscious systems
with memory activation field ψ_a^{l,u}(x) will also give the motivation p = a. However, the fields ψ_a^{l,u}(x) and ψ_a(x) can have extremely different distributions of the activation of ideas in the memory. By any 'social' (external) observer all these fields (states of the brain) are interpreted as the same state of mind, namely the motivation p = a.
Appendix A

A.1. p-adic differential calculus

The system of p-adic numbers Q_p is a number field. Thus the operations of addition, subtraction, multiplication and division are well defined. The derivative of a function f: Q_p → Q_p is defined (as usual) as

f′(x) = lim_{|Δx|_p → 0} [f(x + Δx) − f(x)] / Δx.

The main distinguishing feature of p-adic analysis is the existence of non-locally-constant functions with zero derivative. We present the following well known example (see Schikhof, 1984, p. 74). The function f: Z_p → Z_p is defined as f(x) = Σ_{n=0}^∞ a_n p^{2n} for x = Σ_{n=0}^∞ a_n p^n. This function is injective (f(x_1) ≠ f(x_2) for x_1 ≠ x_2) and f′ ≡ 0.
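A numerical sketch of the appendix example (our own helpers, computed on truncated digit expansions): spreading the digit a_n of x to position 2n yields a function for which |f(x + Δx) − f(x)|_p ≤ |Δx|_p², so the difference quotient tends to zero p-adically and f′ ≡ 0, even though f is injective and far from constant.

```python
def f_square_positions(x, p, N):
    """Schikhof-type example: the digit a_n of x goes to position 2n.
    Computed on the first N digits; the result is an integer modulo p**(2N)."""
    x %= p ** N
    y = 0
    for n in range(N):
        y += (x % p) * p ** (2 * n)
        x //= p
    return y

def vp(n, p):
    """p-adic valuation of a nonzero integer."""
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    return k

p, N = 3, 10
x = 7
for dx in [p, p ** 2, p ** 3, p ** 4]:          # |dx|_p = p^-1, p^-2, p^-3, p^-4
    num = f_square_positions(x + dx, p, N) - f_square_positions(x, p, N)
    print(vp(dx, p), vp(num, p))                # valuation of f-increment is doubled
# The difference quotient num/dx therefore has valuation v_p(dx), i.e. its
# p-adic absolute value tends to 0 as dx -> 0, giving f' = 0.
```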
References

Albeverio, S., Khrennikov, A.Yu., De Smedt, S., Tirozzi, B., 1998. p-adic dynamical systems. Theor. Math. Phys. 114 (3), 349–365.
Albeverio, S., Khrennikov, A.Yu., 1998. A regularization of quantum field Hamiltonians with the aid of p-adic numbers. Acta Appl. Math. 50, 225–251.
Albeverio, S., Khrennikov, A.Yu., Kloeden, P.E., 1999. Human memory as a p-adic dynamical system. BioSystems 49, 105–115.
Amit, D.J., 1989. Modeling Brain Functions. Cambridge University Press, Cambridge.
Ballentine, L.E., 1989. Quantum Mechanics. Englewood Cliffs, New Jersey.
Bell, J., 1987. Speakable and Unspeakable in Quantum Mechanics. Cambridge University Press, Cambridge.
Bergson, H., 1919. Energie Spirituelle. Alcan, Paris.
Bohm, D., 1951. Quantum Theory. Prentice-Hall, Englewood Cliffs, New Jersey.
Bohm, D., Hiley, B., 1993. The Undivided Universe: an Ontological Interpretation of Quantum Mechanics. Routledge and Kegan Paul, London.
Borevich, Z.I., Shafarevich, I.R., 1966. The Number Theory. Academic Press, New York.
Clark, A., 1980. Psychological Models and Neural Mechanisms. An Examination of Reductionism in Psychology. Clarendon Press, Oxford.
Cohen, J.D., Perlstein, W.M., Braver, T.S., Nystrom, L.R., Noll, D.C., Jonides, J., Smith, E.E., 1997. Temporal dynamics of brain activation during a working memory task. Nature 386 (April 10), 604–608.
Courtney, S.M., Ungerleider, L.G., Keil, K., Haxby, J.V., 1997. Transient and sustained activity in a distributed neural system for human working memory. Nature 386 (April 10), 608–611.
Dawkins, R., 1976. The Selfish Gene. Oxford University Press, New York.
De Broglie, L., 1964. The Current Interpretation of Wave Mechanics. A Critical Study. Elsevier, Amsterdam.
Dubischar, D., Gundlach, V.M., Steinkamp, O., Khrennikov, A.Yu., 1999. A p-adic model for the process of thinking disturbed by physiological and information noise. J. Theor. Biol. 197, 451–467.
Eccles, J.C., 1974. The Understanding of the Brain. McGraw-Hill, New York.
Escassut, A., 1995. Analytic Elements in p-adic Analysis. World Scientific, Singapore, p. 80.
Freud, S., 1933. New Introductory Lectures on Psychoanalysis. Norton, New York.
Hiley, B., Pylkkänen, P., 1997. Active information and cognitive science — a reply to Kieseppä. In: Pylkkänen, P., Pylkkö, P., Hautamäki, A. (Eds.), Brain, Mind and Physics. IOS Press, Amsterdam.
Holland, P.R., 1993. The Quantum Theory of Motion. Cambridge University Press, Cambridge.
Hoppensteadt, F.C., 1997. An Introduction to the Mathematics of Neurons: Modeling in the Frequency Domain, second ed. Cambridge University Press, New York.
Hautamäki, A., 1997. The main paradigms of cognitive science. In: Pylkkänen, P., Pylkkö, P., Hautamäki, A. (Eds.), Brain, Mind and Physics. IOS Press, Amsterdam.
Khrennikov, A.Yu., 1994. p-adic Valued Distributions in Mathematical Physics. Kluwer Academic Publishers, Dordrecht.
Khrennikov, A.Yu., 1997. Non-Archimedean Analysis: Quantum Paradoxes, Dynamical Systems and Biological Models. Kluwer Academic, Dordrecht.
Khrennikov, A.Yu., 1998a. Human subconscious as a p-adic dynamical system. J. Theor. Biol. 193, 179–196.
Khrennikov, A.Yu., 1998b. p-adic information dynamics and the pilot wave theory for cognitive systems. Report N. 9839, Växjö University, Department of Mathematics.
Khrennikov, A.Yu., 1999. Interpretations of Probability. VSP International Scientific Publishers, Utrecht/Tokyo.
Kolmogorov, A.N., 1956. Foundations of Probability. Chelsea, New York.
Lorenz, K., 1966. On Aggression. Harcourt, Brace and World, New York.
von Mises, R., 1957. Probability, Statistics and Truth. Macmillan, London.
Putnam, H., 1988. Representation and Reality. A Bradford Book, MIT Press, Cambridge.
Schikhof, W.H., 1984. Ultrametric Calculus. Cambridge University Press, Cambridge.
Searle, J., 1992. The Rediscovery of the Mind. MIT Press, Cambridge.
Skinner, B.F., 1953. Science and Human Behaviour. Macmillan, New York.
Vladimirov, V.S., Volovich, I.V., Zelenov, E.I., 1994. p-adic Analysis and Mathematical Physics. World Scientific, Singapore.