
Entropy based approximation to cell monolayer development

T. Náhlík, J. Urban, P. Císař, J. Vaněk and D. Štys
Department of Bioengineering, Institute of Physical Biology, University of South Bohemia, Zamek 136, 373 33 Nove Hrady, Czech Republic
Email: [email protected]

Abstract— Our analysis is based on the assumption that the metabolic and signal pathways substantiate the non-linear dynamic processes which are responsible for the asymptotic stability of biological systems. Observed events, not the pathways themselves, are the elementary asymptotically stable objects to be studied. This assumption is theoretically supported by structures observed in a relatively simple pseudo-chemical agent-based model. The state space is given by coefficients of the Rényi information entropy. Each combination of Rényi coefficients gives us one subtraction of the whole trajectory. Once the trajectory is constructed, the single-cell trajectory should be divided into several clusters. Each cluster of the trajectory represents an event or a subset of the states of the cell. The size and position of the clusters depend on the trajectory and on the number of clusters. We propose a recipe, using the rules of the generalized stochastic system theory, for how to extract individual trajectories of the multifractal system – the biological stochastic attractor. This set of trajectories is then utilized for a thermodynamic representation of the biological chaotic attractor. Our analysis is strictly based on experiment with all its practical limits. We present an implementation of the analysis of the development of a cell monolayer observed by microphotography.

Keywords— Cell state, state trajectory, information entropy, Rényi entropy, state space

I. INTRODUCTION

In physical chemistry, we assume that the Gibbs (1876) [1] or Helmholtz (1882) [2] energy (later the free energy) is the potential that characterizes the thermodynamic equilibrium. After the advent of quantum mechanics, this potential was identified with the maximum of the Boltzmann (or Gibbs) entropy (Boltzmann 1896) [3], which is the true beginning of statistical physics. In experimental practice, the free energy is often approximated by the logarithm of concentration. This approximation – from the experimental chemist's point of view – comes from the ideal gas equation and brings about endless problems in any system obeying more general equations of state. For this reason, the activity and the activity coefficient were introduced. In the field of statistical thermodynamics, the use of the Gibbs-Boltzmann-Shannon entropy is based on several assumptions on the equality of the probability distributions of individual states and is not general. Details of the mathematical approaches and terminology should be sought in Rényi [4], Havrda and Charvát [5], Tsallis [6] and Jizba and Arimitsu [7].

In biology, models are based on a multitude of observations from biochemistry and molecular biology. The metabolic and signal pathways substantiate the non-linear dynamic processes [8] which are responsible for the asymptotic stability of biological systems. The cellular membranes and intracellular structures are traditionally expected to be “complex lipid vesicles” in which lateral domains are formed by “immiscible” mixtures of lipids and proteins, eventually forming the complicated structures which are observed in the microscope. However, the formation of complicated three-dimensional structures of various shapes was also obtained using a relatively simple pseudo-chemical agent-based non-equilibrium model [9]. Not surprisingly, these structures are formed under the conditions of maximization of the Rényi entropy [10]. In analogy to equilibrium thermodynamics, the pathways are analogues of concentrations, while the spatial objects are analogues of phases. Phase equilibrium is the most typical case of equilibrium, based on the identity of the Gibbs energy in each of the components.

In this article we deal mainly with the information content of data obtained from experiments relevant to systems biology. Time-lapse microscopy has two major branches: fluorescence microscopy, which intends to follow a singular process in the cell (intracellular biochemistry), and contrast-enhancement-based observation of cell fates [11, 12]. In this article we summarize the results of the objective analysis of critical steps in both above-mentioned types of experiments and propose a generic approach.

II. GENERIC APPROACH TO STOCHASTIC SYSTEMS ANALYSIS

A. General stochastic system

For the analysis of experiments, the generalized stochastic systems theory developed for cybernetic purposes by Zampa (2004) [13] may naturally be used. In this approach, the system is characterized by its attributes A, for which it holds

$$ A = \{\, a_i \mid i \in I \,\}, \qquad (1) $$

where I is an adequate non-empty index set. We expect that $a_i \neq a_j$ for each $i \neq j$, $i, j \in I$. The system is measured in an ordered series of time instants

$$ t_k \in T, \quad k \in K, \qquad (2) $$

where K is again an adequate index set. A good abstract model of an attribute is a variable $v_i$,

$$ v_i \in V_i, \quad i \in I. \qquad (3) $$

This approach enables one to recognize the dissection of the “true” system parameters, the system attributes, from the measured system parameters, the variables. In the above-mentioned case of equilibrium chemical thermodynamics the system attribute may be the activity, the system variable the concentration. We may then define the system trajectory as the mapping

$$ z : T \times I \to \bigcup_{i \in I} V_i, \qquad (4) $$

such that $z(t, i) \in V_i$, $i \in I$, where the set of all system variables is defined as the Cartesian product of the sub-sets corresponding to the individual attributes. The stochastic systems theory by Zampa [13] is actually highly general and considers broad classes of measurements of natural phenomena.

B. Elements and structure of the stochastic causal systems

In a dynamic system we consider causality. Let us consider the trajectory of the system z on the definition set D defined as

$$ D = T \times I \qquad (5) $$

and a reduced mapping $z|_{D_{i,j}}$ defined on a subset $D_{i,j} = T_i \times I_j$ [14]. This accounts for the fact that we generally do not measure all system variables and that the system evolves in continuous time while we measure only at certain time instants. $T_i$ is one time interval from the ordered decomposition $\mathcal{T}$ of the set T. We may define the complete immediate cause $z|_{C_{k,l}}$ defined on a subset of the state trajectory

$$ C_{k,l} = T_k \times I_l, \qquad (6) $$

$$ C_{k,l} \subset \bigcup_{j=1}^{m} D_{k-1,\, l_j}, \qquad (7) $$

which is a recipe for how to determine the complete causal relation in the system. We simply have to determine the variables and extend the time interval of the considered preceding trajectory until we are able to determine the future state fully from the preceding state. Then and only then we may write the causal relation of the system C as

$$ C = \{\, (C_{k,l}, D_{k,l}) \mid D_{k,l} \in \mathcal{D} \,\}, \qquad (8) $$

where $\mathcal{D}$ is the ordered decomposition of the definition set D which naturally comes out from the causality search defined by equation (7). Schematically, the principle of the search for the causal relation is given in Figure 1.

Figure 1. Definition of the complete causal relation. The piece of the trajectory $D_{k,l}$ is fully determined by the values of several preceding variables. The number of variables and the length of the decisive interval has no relation to the experiment time. If all the state variables are time independent, this state definition is identical with the classical thermodynamic definition.

In analogy, we may define the causal probability that a trajectory segment $D_{k,l}$ will be realized if a segment $C_{k,l}$ occurs,

$$ P_{k,l}\left( z|_{D_{k,l}} : z|_{C_{k,l}} \right), \qquad (9) $$

and the set P of all probabilistic mappings

$$ P = \{ P_{k,l} \}. \qquad (10) $$

Finally, we may define the probabilistic causal system PCS,

$$ PCS = (T, V, C, P). \qquad (11) $$

For cases in which all variables are time-independent (stationary, non-inertial) the system definition is identical with the classical thermodynamic definition. In addition, it is possible to break (uncouple) a causal relation; such a relation is an information bond. The information bond is oriented and we may thus unequivocally determine the input and the output of the system. In each system it is possible to find parts which either do not communicate or communicate only by information, not energetic, bonds. Let us consider the causal system

$$ CS = (T, V, C). \qquad (12) $$

If we then uncouple all these causal relations, the system breaks into several isolated systems. A causal sub-system which does not contain any further information bonds is then called an element of the causal system and denoted

$$ \pi_i = (T, V_i, C_i); \qquad (13) $$

the set of all elements

$$ \Pi = \{\, \pi_i \mid i \in J \,\} \qquad (14) $$

is called the universum of the system, and the whole causal system may be described as

$$ CS = (\Pi, C). \qquad (15) $$

Obviously, we may equally well determine the probabilistic causal system [14]

$$ PCS = (\Pi, C, P). \qquad (16) $$
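As an illustration of how such probabilities might be estimated in practice, the following Python sketch (not part of the original work; the data and function names are assumptions) counts how often a given preceding segment is followed by a given state, an empirical analogue of the causal probabilities of equation (9), and extends the length of the preceding segment until the next state is fully determined, in the spirit of the recipe of equation (7). It assumes the trajectory has already been reduced to a sequence of discrete state labels, e.g. cluster indices.

```python
# Minimal sketch (not the authors' implementation): the trajectory is assumed to be
# already reduced to a sequence of discrete state labels, e.g. cluster indices.
from collections import Counter, defaultdict

def causal_probabilities(states, history=1):
    """Relative frequencies P(next state | preceding segment of given length),
    an empirical analogue of the causal probabilities P_{k,l} of eq. (9)."""
    counts = defaultdict(Counter)
    for t in range(history, len(states)):
        cause = tuple(states[t - history:t])   # preceding segment, cf. C_{k,l}
        counts[cause][states[t]] += 1          # following state, cf. D_{k,l}
    return {c: {s: n / sum(eff.values()) for s, n in eff.items()}
            for c, eff in counts.items()}

def minimal_history(states, max_history=10):
    """Extend the preceding interval until the future state is fully determined."""
    for h in range(1, max_history + 1):
        probs = causal_probabilities(states, h)
        if all(max(p.values()) == 1.0 for p in probs.values()):
            return h                            # deterministic causal relation found
    return None                                 # remains probabilistic within the limit

# Toy usage with a periodic sequence of hypothetical cluster labels:
print(minimal_history([0, 1, 2, 0, 1, 2, 0, 1, 2]))   # -> 1
```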

C. Rényi information entropy and clustering of the state trajectory

Cells are living, unrepeatable objects. If we want to know how they live and evolve in time, we cannot do it through invasive techniques, which modify or kill the cell. In non-invasive techniques, the content of the data is given and we have to maximize the information gain. Acquisition of images is one of the broadly used non-invasive techniques. We extract information from the image by an information entropy approach. A general image, acquired by any generic microscopic technique, is subjected to a transformation which evaluates the information contribution of each point in the image. The resulting values may also be used for the characterization of the cell or, differently said, for an objective assessment of the individual cell state.

For plotting the state trajectory of the cell we need to know the state space variables. But the whole cell is a complex object with many states, and nobody knows the exact number of all events. We decided to use a method developed at our institute which is called Point Information Gain (PIG) [14]. It is based on the Rényi entropy

$$ H_\alpha(X) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^{n} p_i^{\alpha} \right), \qquad (17) $$
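A minimal sketch of equation (17) and of a point-wise information value in the spirit of PIG is given below. It assumes 8-bit color channels and the natural logarithm, and it takes the contribution of a pixel to be the change of the Rényi entropy of the intensity histogram when that pixel is discarded, which is one plausible reading of the PIG idea; the exact definition used in [14] may differ.

```python
import numpy as np

def _renyi_from_hist(hist, alpha):
    p = hist / hist.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_entropy(channel, alpha):
    """H_alpha of eq. (17) computed from the intensity histogram of one
    8-bit color channel (alpha != 1)."""
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    return _renyi_from_hist(hist, alpha)

def point_information_gain(channel, y, x, alpha):
    """Entropy change caused by discarding the examined pixel (y, x)
    from the histogram - a PIG-like point-wise information value."""
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    h_full = _renyi_from_hist(hist, alpha)
    hist[channel[y, x]] -= 1
    return _renyi_from_hist(hist, alpha) - h_full

# Hypothetical usage on a single 8-bit channel:
channel = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
print(renyi_entropy(channel, alpha=0.7))
print(point_information_gain(channel, 10, 20, alpha=4.0))
```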

where α is called the Rényi entropy coefficient [4, 7]. The color channels and different Rényi entropy coefficients may be combined to best discriminate individual events.

The construction of the state trajectory of the cell is not a simple task. It must be decided what the state variables are. Since we work with colored images, it sounds logical to start with the trajectory in the RGB color space. Each axis is then processed via several Rényi entropies. The value of the Rényi coefficient spreads or collapses the trajectory along the corresponding axis of the state space. Each combination of Rényi coefficients gives us one subtraction of the whole trajectory. The major issue is how to choose the proper combination; however, all subtractions are correct.

Once the trajectory is constructed, the single-cell trajectory should be divided into several clusters. Each cluster of the trajectory represents an event or a subset of the states of the cell. The size and position of the clusters depend on the trajectory and on the number of clusters. We propose eight clusters as a first estimation and are satisfied by the results: the clusters are well separated, and images in the transitions between clusters show some changes. Changes are also observable in images at the borders of the clusters. If we compare the content of the clusters in different trajectories, some images stay in the same cluster but some images may change cluster. This is caused by the method of acquisition of the trajectory: a different coefficient in the Rényi equation highlights a different part of the image, so the trajectory can be slightly different, as can the size, position and content of the clusters.

The next step was to divide the trajectory into regions with common properties. The analysis was based on the hierarchical clustering method in MATLAB®, which clusters points together by computing the distance between each pair of points. Since we obtain color images from the microscope, for example from phase contrast microscopy, it is possible to construct a state trajectory. We started with only one organelle from the cell and after that we tried to construct the trajectory for the whole cell. The first attempt at constructing the state trajectory was plotting the red vs. green vs. blue channel gained from the photos. This 3D plot shows that individual attractors may be discriminated and that many state trajectories may be constructed. But for more complex cells, e.g. MG63 – human bone cells – we need to transform the color channels via the Rényi entropy. This gives us better discriminated regions in the state trajectory.

Figure 3: MG63 state trajectory. The X axis is the red channel with α = 0.7, the Y axis is the green channel with α = 1.3, and the Z axis is the blue channel with α = 4; the bottom graph is clustered into 8 regions.
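The trajectory construction and clustering described above could be sketched as follows. The original analysis used hierarchical clustering in MATLAB®; the sketch below uses SciPy's agglomerative clustering instead, takes the Rényi coefficients of Figure 3 as defaults, reuses the renyi_entropy() helper from the previous sketch, and the way the image series is loaded is only an assumption.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
# renyi_entropy() is the helper defined in the previous sketch

def state_trajectory(frames, alphas=(0.7, 1.3, 4.0)):
    """One 3-D state point per frame: Rényi entropies of the R, G and B channels,
    each channel evaluated with its own coefficient (values taken from Figure 3)."""
    return np.array([[renyi_entropy(frame[:, :, c], a)
                      for c, a in enumerate(alphas)]
                     for frame in frames])          # frame: H x W x 3 uint8 array

def cluster_trajectory(points, n_clusters=8):
    """Divide the state trajectory into a fixed number of regions by
    agglomerative (Ward) clustering of the state points."""
    tree = linkage(points, method='ward')
    return fcluster(tree, t=n_clusters, criterion='maxclust')

# Hypothetical usage on a time-lapse series loaded elsewhere:
# frames = [imageio.imread(name) for name in sorted(glob.glob('series/*.png'))]
# labels = cluster_trajectory(state_trajectory(frames))
```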

III. CONCLUSION

In this paper we propose a generic approach for the construction of experimentally inferred probabilities of the appearance of signals originating from different sources – sub-systems – in a total signal – the system. The general stochastic systems theory created by Zampa [13] is a careful analysis of the dynamic behavior of real-life systems under real experimental conditions. The resulting models account for the internal processes in the system and for the inherently probabilistic nature of physical processes. This is done in a very conservative way, in which a distinction is made between the idealized system attribute and the system variable. The structure of the system is then described in terms of system elements, system information bonds, the causal relation and the causal probabilities. While for the analysis of technical systems this may seem to be an unnecessary theoretical complication, this type of description looks fully justified for biological systems.

ACKNOWLEDGMENT

We thank Petr Jizba for valuable discussions which opened new avenues of thinking to us, and Pavel Žampa († 2006), who invented the visionary framework of thinking which we use. This work was supported and co-financed by the ERDF and made possible by the INTERREG IVC programme, project Innovation 4 Welfare, subproject PICKFIBER; by the South Bohemian Research Center of Aquaculture and Biodiversity of Hydrocenoses (CZ.1.05/2.1.00/01.0024); by the Ministry of Education, Youth and Sports of the Czech Republic under the grant MSM 6007665808; and by the University of South Bohemia grant GA JU 152/2010/Z.

REFERENCES

1. W. Gibbs, "On the Equilibrium of Heterogeneous Substances", Transactions of the Connecticut Academy, III, pp. 108-248, Oct. 1875 - May 1876, and pp. 343-524, May 1877 - July 1878.
2. H. v. Helmholtz, Wissenschaftliche Abhandlungen, Vol. I, Leipzig, 1882.
3. L. Boltzmann, Vorlesungen über Gastheorie, 2 volumes, Leipzig, 1895/98.
4. A. Rényi, "On measures of information and entropy", Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, pp. 547-561, 1960.
5. J. Havrda, P. Charvát, "Quantification method of classification processes: Concept of structural a-entropy", Kybernetika, 1, 30-35, 1967.
6. C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics", Journal of Statistical Physics, vol. 52, pp. 479-487, 1988.
7. P. Jizba and T. Arimitsu, "The world according to Rényi: Thermodynamics of multifractal systems", Ann. Phys., 312, 17-59, 2004.
8. K. Kaneko, Life: An Introduction to Complex Systems Biology, Springer, 2006.
9. O. Cybulski and R. Holyst, "Three-dimensional space partition based on the first Laplacian eigenvalues in cells", Phys. Rev. E, 77, 05601, 2008.
10. O. Cybulski, D. Matysiak, V. Babin and R. Holyst, "Pattern formation in nonextensive thermodynamics: Selection criterion based on the Renyi entropy production", J. Chem. Phys., 122, 174105, 2008.
11. F. Zernike, "Phase-contrast, a new method for microscopic observation of transparent objects, Part I", Physica, 9, 686-698, 1942.
12. F. Zernike, "Phase-contrast, a new method for microscopic observation of transparent objects, Part II", Physica, 9, 974-986, 1942.
13. P. Žampa, "The principle and the law of causality in a new approach to system theory", Cybernetics and Systems, Austrian Society for Cybernetic Studies, Vienna, pp. 3-8, 2004.
14. J. Urban, J. Vaněk, D. Štys, "Preprocessing of microscopy images via Shannon's entropy", in Proc. of Pattern Recognition and Information Processing, pp. 183-187, Minsk, Belarus, ISBN 978-985-476-704-8, 2009.
