Defining Synergy Thermodynamically using Quantitative Measurements of Entropy and Free Energy

KLAUS JAFFE 1 AND GERARDO FEBRES 2

1 Universidad Simón Bolívar, Caracas, Venezuela; and 2 Departamento De Procesos Y Sistemas, USB, Venezuela

Received 17 March 2016; revised 25 May 2016; accepted 26 May 2016

Correspondence to: Gerardo Febres, Departamento De Procesos Y Sistemas, USB, Venezuela. E-mail: [email protected]

Synergy is often defined as the creation of a whole that is greater than the sum of its parts. It is found at all levels of organization in physics, chemistry, biology, social sciences, and the arts. Synergy occurs in open irreversible thermodynamic systems, making it difficult to quantify. Negative entropy or negentropy (N) has been related to order and complexity, and so have work efficiency, information content, Gibbs free energy in equilibrium thermodynamics, and useful work efficiency in general (W). To define synergy in thermodynamic terms, we use quantitative estimates of changes in N and W in seven different systems that undergo processes described as synergistic. The results show that synergistic processes are characterized by an increase in N coupled to an increase in W. Processes not associated with synergy show a different pattern. The opposite of synergy are dissipative processes such as combustion, where both N and W decrease. The synergistic processes studied showed a relatively greater increase in N compared to W, opening ways to quantify energy or information dissipation due to the second law of thermodynamics in open irreversible systems. As a result, we propose a precise thermodynamic definition of synergy and show the potential of thermodynamic measurements in identifying, classifying, and analysing in detail synergistic processes. © 2016 Wiley Periodicals, Inc. Complexity 000: 00–00, 2016

Key Words: synergy; entropy; negentropy; evolution

INTRODUCTION

Synergy is often defined as the creation of a whole that is greater than the sum of its parts. It is found at all levels of organization in physics, chemistry, biology, social sciences, and the arts. The term, we believe, is equivalent to emergence [1], self-organization [2], hierarchical organization [3], and other concepts [4, 5] that have, over the last few decades, been related to phenomena traditionally described as synergy. The term synergy was first used in neuromuscular physiology [6] and was fully developed as a process involved in self-organization by Haken [7]. No unambiguous definition of synergy exists, but several nonadditive phenomena might qualify as synergy.


Wikipedia defines it as the creation of a whole that is greater than the simple sum of its parts; that is, 1 + 1 > 2. This phenomenon seems to be quite common in nature [8–10], but a precise way of tackling its study has yet to be developed. An instructive example is the transformation of oxygen and hydrogen atoms to form a molecule of water, which has often been presented as a synergistic process. Certainly, the properties of water cannot be deduced from those of oxygen and hydrogen atoms: they differ radically. It seems a clear-cut case of 1 + 1 > 2. The advent of quantum chemistry has allowed us to understand this phenomenon in great detail, making the reference to synergy unnecessary. Thus, the presence of synergy in natural phenomena might just reveal our ignorance about them, pinpointing areas where fundamental quantitative empirical research is needed. Thermodynamics helps to understand these processes better. Some thermodynamic studies [11] place synergistic interactions at the base of self-organization in complex systems; others [12] make equivalent efforts in analysing the working of our brains. Synergistic interactions have been shown to be fundamental in understanding biological evolution [36–39] and economic dynamics [9, 10, 13]. In thermodynamics, chemical reactions have properties that relate to their complexity. Quantitative measurements of Helmholtz free energy, Gibbs free energy (G), or enthalpy have proven very useful in understanding chemical reactions. A similar approach should help us understand synergy in complex systems. Thermodynamics focuses on the energy that produces useful work (W). Under specific conditions of temperature and pressure, W is equivalent to Gibbs free energy. W can be estimated by measuring changes in the complexity of the system triggered by a given synergistic process. Energy and information are separate entities: energy is tangible in terms of heat, and information is a statistical measure. However, they share several thermodynamic properties [14]. In classical thermodynamics, energy can be related to work. Here we want to explore empirically whether information can also be related to work, using approaches such as those that estimate W based on Bayesian information criteria [15]. Estimates of entropy can be made directly, or indirectly through estimates of its information content [16, 17]. The concept of Shannon's entropy is fundamental to understanding the emergence of synergy [18]. In information theory, Shannon's entropy is the expected value (average) of the information contained in each message or structure that contains information. The entropy of a message is its amount of uncertainty; it increases when the message is closer to random, and decreases when it is organized in recognizable patterns. Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X as:


H(X) = −Σ_i P(x_i) log_b P(x_i),

where b is the base of the logarithm used and P is a probability function over states. H is often related to the complexity of a system, especially when entropy is related to orderliness: the lower H is, the more complex a structure the system possesses. The assessment of complexity depends on how it is regarded. If complexity C is regarded as the effort needed to make a description of the system, then it can be measured as Shannon's information entropy H. However, in information terms, the entropy of a given description depends on the suitability of the language or metric used for that specific description; Kolmogorov [19] stated that complexity equals the information needed assuming the language or metric used is the best suited for the description. In the examples we want to analyse, it is impossible to ensure that we are using the most suited language. Thus, we have to be content with alternative estimates of Kolmogorov's complexity KC using Shannon's information entropy, and we assume that the relationship is KC ≈ H. Several authors have reached the same conclusion using different paths of thought [20]. Complexity can also be regarded as the effort needed to describe the internal activity the system has to perform to keep itself structured and under control and, at least temporarily, achieve the objective of limiting the natural and inexorable tendency of its entropy to increase. Thus, a way to assess this estimation of complexity is to contrast the Kolmogorov complexity KC of the system with the "distance" the system places itself from total disorder, which is 1 − KC. Quantitative empirical science is much better than other heuristic protocols, including pure theoretical analysis, at advancing detailed knowledge [21, 22]. Here, we explore the few examples we could find where an empirical and quantitative approach to the study of synergy was possible, allowing for quantitative estimates of the relevant thermodynamic variables.
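As a concrete illustration (our own sketch, not the authors' code; the sample text and function names are invented for this example), the following Python fragment computes Shannon's entropy from the symbol (word) frequencies of a message, together with a normalized entropy whose complement plays the role of the "distance from total disorder" discussed above:

import math
from collections import Counter

def shannon_entropy(symbols, base=2.0):
    # H = -sum_i P(x_i) * log_b P(x_i), over the observed symbol frequencies.
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total, base) for c in counts.values())

def normalized_entropy(symbols, base=2.0):
    # Entropy divided by its maximum, log_b(number of distinct symbols),
    # so 0 means perfectly ordered and 1 means maximally disordered.
    counts = Counter(symbols)
    h_max = math.log(len(counts), base)
    return shannon_entropy(symbols, base) / h_max if h_max > 0 else 0.0

# Hypothetical example: word-level entropy of a short text.
words = "the whole is greater than the sum of the parts".split()
h = shannon_entropy(words)
h_rel = normalized_entropy(words)
print(f"H = {h:.3f} bits, H_rel = {h_rel:.3f}, 1 - H_rel = {1 - h_rel:.3f}")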

METHODS

We searched the literature for papers that presented quantitative thermodynamic data on systems that underwent a synergistic transformative process, or that were the product of two processes with different synergistic inputs. For each article, we identified two different stable states with different levels of complexity. The selected articles had data on entropy or on information content for each of the systems, which allowed us to calculate differences in H, and data or references to data for the total work output of the two stable states of the system. All data found were recalculated in terms of H, as explained in detail in the Appendix.


TABLE 1
Impact of Synergy Estimated for Several Types of Systems, Using a Proxy Estimate of Free Energy or Useful Information to Perform Work (W) and the Negentropy of the System (N)

Example | Entropy Measure                      | W, Measure of Work Output               | Ω(W) | Ω(N) | Ω(W)/Ω(N)
1a      | Social complexity in Myrmicinae ants | Efficiency in energy consumption        | 1.70 | 2.00 | 0.85
1b      | Social complexity in Attini ants     | Efficiency in energy consumption        | 2.00 | 2.20 | 0.91
2       | Social complexity in aggregates      | Exponent of energy efficiency function  | 2.50 | 2.60 | 0.96
3       | Scientific development               | Economic development                    | 2.60 | 3.10 | 0.84
4       | Division of labor                    | Economic efficiency                     | 2.96 | 3.00 | 0.99
5a      | Brain complexity                     | Polymorphism                            | 2.70 | 4.00 | 0.68
5b      | Brain complexity                     | Log colony size                         | 3.00 | 4.00 | 0.75
6a      | Spanish text                         | Readability                             | 1.05 | 1.14 | 0.92
6b      | English text                         | Readability                             | 0.93 | 1.06 | 0.88
6c      | Spanish text                         | Nobel Prize/average                     | >1   | 1.14 | -
6d      | English text                         | Nobel Prize/average                     | >1   | 1.06 | -
7a      | Entropy in music                     | Popularity                              | >1   | 1.04 | -
7b      | Entropy in music                     | Number of instruments                   | >1   | 1.10 | -

Data are presented as unit-free scalars calculated as the proportional change (Ω) in the system after the synergistic process divided by the value before this process.

We compared H before (H1) and after (H2) the synergistic process studied, using the ratio Omega (Ω), by estimating the difference between state 1 and state 2. This unit-free scalar was calculated as Ω(H) = H1/H2. In some examples, the data were given as information content, which allowed us to compare the ratio of the total information content between the two states. If so, we calculated the scalar Ω(N), the unit-free estimate of negentropy, so that Ω(N) ≈ 1/Ω(H). Estimates of the amount of useful work or useful information a system can generate, or estimates of the efficiency of a parameter that relates to the quantity or quality of the output, were calculated for each example as detailed in the Appendix. These values were used to calculate the ratio of total work output between the two states of the system by dividing the values associated with the lowest entropy in the final state, after the synergistic process occurred (N2), by those associated with the highest entropy before the synergistic process started (N1). We called this ratio Ω(W).
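A minimal sketch of this bookkeeping, with invented numbers rather than data from any of the seven systems, may make the conventions concrete; here each Ω is simply the value of a quantity after the synergistic process divided by its value before, as in Table 1:

def omega(after, before):
    # Unit-free ratio: value after the synergistic process / value before it.
    return after / before

# Hypothetical system: normalized entropy drops (more order) while work output rises.
H_before, H_after = 0.90, 0.60    # entropy proxy before and after the process
W_before, W_after = 5.0, 8.0      # useful-work proxy before and after

omega_N = omega(1.0 - H_after, 1.0 - H_before)   # negentropy taken here as 1 - H
omega_W = omega(W_after, W_before)
print(f"Omega(N) = {omega_N:.2f}, Omega(W) = {omega_W:.2f}, "
      f"Omega(W)/Omega(N) = {omega_W / omega_N:.2f}")
# In the terms of this article, a synergistic process shows Omega(N) > 1 and
# Omega(W) > 1, with Omega(W)/Omega(N) < 1 reflecting dissipation.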

RESULTS

To be able to compare completely different systems that use different units to measure W and N, we calculated the ratios of these variables before and after a synergistic transformation has occurred. We used studies for which we had detailed data to calculate W and N. We could identify seven completely different systems.


For each of these systems, we estimated the actual energy when possible or, if the data allowed it, the information content associated with two different states of the system. These estimates led us to calculate Ω(W). For each of these two states, we estimated N, which allowed us to calculate Ω(N). Details of how we calculated W and N using the published data in each case are given in the Appendix. The systems studied were:

1. Social complexity of ant societies, measured as colony size and polymorphism in worker castes, and the minimal cost of its maintenance [23].
2. Size of social aggregates of ant colonies and of human cities and their relation to social synergy measured as energy consumption [21, 22].
3. Quantitative data on the complexity of the scientific system of a nation and economic growth [24, 25].
4. The working of division of labour and the invisible hand in simulations of virtual economies [9, 10].
5. The relationship between brain complexity and the size of ant societies and their complexity regarding worker polymorphism [26].
6. The entropy content of texts and their literary quality and readability in two languages [27, 28].
7. The entropy content of pieces of classical music and the popularity they achieved [29].

Table 1 summarizes the measures in proxies of Gibbs free energy or work output of the system (W), and those for the negentropy of the system (N), described above. To be able to compare completely different systems that use different units to measure W and N, we calculated the ratios of these variables before and after a synergistic transformation has occurred.


Thus, Table 1 depicts the ratio of W or N at the end of the process divided by those values before the process occurred. Ω(W) and Ω(N) are then unit-free values related to W and N. The ratio Ω(W)/Ω(N) is a ratio of ratios, showing whether Ω(N) changes more or less than Ω(W).
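As a worked instance taken directly from Table 1 (example 1a, Myrmicinae ants): Ω(W)/Ω(N) = 1.70/2.00 = 0.85 < 1; that is, negentropy increased proportionally more than useful work output.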

CONCLUSIONS

In all seven cases examined, when synergy was clearly known to be at work, the system showed an increase in W in addition to an increase in N (a decrease in the information entropy of the system). In one unexpected case this correlation was not found: for languages such as English and Spanish, the impact of information content on the output was a reduction of entropy, making the system's structure more orderly and creating the conditions that achieve higher aesthetic efficiencies. Here, however, the efficiency of work output, if measured as readability, did not increase, or even decreased, with increases in negentropy (examples 6a and 6b). This result suggests that readability, as measured here, is a very poor measure of communication efficiency and should be dropped, as it does not reflect a more efficient performance of the language. Communicational efficiency associated with the writings of Nobel Prize winners, in contrast, did relate to negentropy (6c and 6d). This example shows that measures of W and N allow us to differentiate between phenomena that are related to a synergistic effect and those that are not. The results suggest that synergistic processes are associated with a decrease in entropy (increase in negentropy) coupled to an increase in W. This feature can be used to define synergy. Alternatively, synergy can be defined as the opposite of dissipative processes such as combustion [30], which are associated with decreases in negentropy and decreases in free energy or W. Of course, in many systems, changes in entropy associated with synergistic processes may show patterns different from the two extremes just mentioned, allowing thermodynamics to engage in a novel taxonomy of irreversible processes in open systems. A completely unexpected finding was the fact that the increase in N was always higher than that of work output or free energy. The ratio of change in W divided by the ratio of change in N, Ω(W)/Ω(N), was always smaller than 1 (see Table 1), except when the working of synergy could not be invoked (examples 6a and 6b in Table 1). This is consistent with our understanding of the second law of thermodynamics: if we assume that higher negentropy is due to greater complexity, which allows for increased potential work, then dissipation of energy must be associated with this relationship, explaining our results for Ω(W)/Ω(N).


This feature opens a route to quantify the amount of energy or information dissipated in open irreversible systems.

APPENDIX OR COMPLEMENTARY INFORMATION

1. Social Complexity and Minimal Cost of Maintenance of Society (data taken from Ref. 23). Living in society must have benefits for its inhabitants if societies are to be sustainable. That is, social synergy allows social organisms to be more efficient in resource use than nonsocial ones. Data measuring the efficiency of energy consumption in ant societies of different social complexity and size are available. The study used here finds that ant species with higher polymorphism (i.e., a larger number of morphologically distinct worker castes) have a lower per capita energy requirement for workers. Specifically, among ants in the family Formicinae, species with 1 worker caste consume on average about 1.1 times more energy than ant species with 2 worker castes. Among Myrmicinae the relationship is 1.7 times, and among Attini (leaf-cutter ants), with Atta species having >2 castes, the proportion is at least 2. This gradation corresponds well with the difference in social complexity of monomorphic and polymorphic ant societies among these three groups of ants. If Ω(H) is calculated just using the number of castes (1 and 2), a great underestimation of the real complexity of these ant societies, we have that the Shannon entropy of the society decreases at least by half. That is, Ω(N) = 2, whereas the Omega Ω(W) for work output, as estimated from the Myrmicinae's increase in energy efficiency with increased number of castes, was 1.7.

2. Size of Social Aggregates and Social Synergy (data taken from Refs. 21, 22). Larger cities are more efficient in their use of energy than smaller ones. The exponent by which per capita consumption of electricity diminishes according to the number of inhabitants a city hosts varies from −0.47 in Brazil to −0.18 in Denmark. If we accept that heterogeneity in the quality of services in Brazil is much higher than in Denmark, and that smaller rural towns have fewer services than large metropolises in Brazil, we can assume the proportionality exponent is related to the difference in complexity of large and small cities. That is, in Brazil cities are 2.6 times more heterogeneous regarding complexity of services than those in Denmark. Among ant societies, this difference is at least 4.6 times. Here, we assume that this difference is a proxy of 1/H.

3. Science, Economics, and Social Synergy (data taken from Refs. 24, 25). The complexity of an economy and of a scientific community can be calculated with great accuracy; however, both have to be calculated differently. Table A1 shows the relationship between different ways of calculating the complexity of the knowledge of a society, with Economic Complexity (ECI) as calculated in [31], and Economic Growth as calculated by the World Bank as Gross Domestic Product per capita (GDPc).


TABLE A1
Spearman Correlation Coefficients Between Scientific Publications/Capita, the Science Complexity Index, the Economic Complexity Index, and the Revealed Comparative Advantage in Neurosciences, with the Economic Complexity Index and Gross National Product per capita of Nations in 2010

                       | ECI 2008 | GDPc 2010
Publications/capita    | 0.79     | 0.90
SCI 2010               | −0.06    | 0.10
ECI 2008               | -        | 0.75
RCA 2010 Neuroscience  | 0.68     | 0.72

The knowledge of a society can be calculated as ECI, as academic publications per capita, or as a Science Complexity Index (SCI), calculated in the same way as ECI but using academic publications as output rather than exports of products as in ECI [24]. Another index shown to be related to the sophistication of a scientific community is the Revealed Comparative Advantage, or RCA [32]; specifically, the RCA in Neuroscience [25]. Table A1 shows that the index for knowledge complexity that best correlates with economic growth is publications/capita. The log of this value, as reported in the mentioned paper, ranged between −2.5 and 0.6 (Ω(N) = 3.1), whereas the log of GDPc ranged between 2.4 and 5.0 (Ω(W) = 2.6). That is, the ratio was >2 in both cases. We can thus relate Ω(W), estimated as the ratio of the extremes in GDP, and Ω(H), as the ratio of the extremes of scientific complexity or complexity of knowledge managed by the society.

4. Division of Labor and the Invisible Hand [9, 11]. In simulations of economic markets exploited by societies of agents using Sociodynamica, the amount of resources a society without division of labor could accumulate was 8 units, compared with 63 units accumulated by societies with division of labor into 3 discrete categories (miners, farmers, and traders). Thus, an Ω(N) = 3 achieved

economic gains of 7.8 times, which makes Ω(W) = log 63/log 8 = 2.96. This result was used as a clear demonstration that division of labor allows for synergistic interactions of agents, which allow economic markets to work efficiently, triggering economic growth. This example is considered a classical example of the working of synergy.

5. Brain Complexity and Social Synergy [26]. Brain complexity might be thought of as an adaptive feature allowing complex cognitive tasks in its bearers. If more complex societies require more complex tasks from their participating individuals, we would expect a relationship between brain complexity and the complexity of the society the individuals form; even a synergistic relationship might exist. Data from ant societies, however, reveal a different result (Table A2). In some cases, such as the overall surface of a brain structure called the calyx, this relationship is concave, with a maximum value in species with an intermediate level of social complexity such as Acromyrmex octospinosus. In the most socially complex society studied, Atta laevigata, individuals showed a lower index of overall surface of the calyx. This relationship might suggest that socially very complex societies require cognitively less developed individuals. The asymmetry between the two calyxes, however, was related positively and continuously with social complexity. This parameter is related to the complexity of the brain rather than to its total capacity. We chose this last index to make the Ω(N) estimates. The ratio Ω(N) between the minimum and maximum values of this index was 16, or in entropy scalars log2(16) = 4. Two estimates of work output Ω(W) were used: the degree of worker caste polymorphism, and the estimated size of the average colony on a log scale. Each of these two different estimates of social complexity is related to the complexity of the information content the society has to manage. The ratio Ω(W) between maximum and minimum values for each of them was 2.7 and 3, respectively.

6. Synergies and Language [27, 28]. Languages can be regarded as systems formed by a large number of symbols and rules.

TABLE A2
Various Indicators of Brain Complexity for Workers of Different Ant Species Showing a Variety of Social Complexity

Ant Species              | Asymmetry Between Internal and External Calyxes | Total Surface of Folding of the Calyx | Social Complexity Estimated with Polymorphism | Estimated Average Colony Size
Odontomachus bauri       | 2  | 80  | 1.1 | 500
Trachymyrmex urichi      | 7  | 88  | 1.2 | 1,000
Acromyrmex rugosus       | 13 | 100 | 1.5 | 5,000
Acromyrmex octospinosus  | 23 | 128 | 2   | 100,000
Atta laevigata           | 32 | 90  | 3   | 500,000


TABLE A3
Entropy H, Spanish Perspicuity Index IPSZ, and English Readability Index RES for Groups of Speeches Written by Nobel Laureates and Non-Nobel Writers

Group / Language                                   | H     | 1 − Hrel | IPSZ | RES
Literature Nobel Laureates, Spanish (19 speeches)  | 0.830 | 0.176    | 63.4 | -
Literature Nobel Laureates, English (37 speeches)  | 0.788 | 0.163    | -    | 56.9
Non-Nobel, Spanish (117 speeches)                  | 0.852 | 0.155    | 60.6 | -
Non-Nobel, English (101 speeches)                  | 0.838 | 0.154    | -    | 61.0

For natural languages, even though there are other options, words can play the role of symbols. As a message written in a natural language becomes more organized, its entropy decreases, making it easier to understand and therefore more effective as an information-transmission process. We measured the entropy based on the word frequency of speeches written in English and Spanish. Assuming that writers who won the Literature Nobel Prize are masters of the use of natural languages, we compared the entropy computed for their speeches with the entropy computed for the speeches of non-Nobel writers. We also considered the readability index RES [33] of English texts as a possible reflection of synergy. For Spanish texts we evaluated readability using the perspicuity index IPSZ [34]. These figures are shown in Table A3. There is a clear difference between the entropy of the Nobel laureate speeches and the non-Nobel speeches. Considering entropy as a function of speech length, there is a band where English and Spanish speeches tend to lie.

We take the middle line of the band as a reference entropy value, which allows us to compare the entropy of any speech in spite of the different lengths, measured in words. We refer to the difference between the entropy of a speech and the average entropy of the speeches as Href. We used the value 1 − Href as a proxy of negentropy, and thus included it in Table A3. These proxy negentropy values suggest that the more effective structure reached by master writers can be recognized by the ratio of negentropies. This behavior is consistent for Spanish and English: in English, Ω(N) is about 1.06 and in Spanish about 1.14. However, the change in readability index between Nobel and non-Nobel writers does not follow the same direction in Spanish and English. While in Spanish the presumably better use of the written language by Nobel laureates could be associated with a higher IPSZ (Ω(W) = 1.05), the same change does not show for the RES index in English, which gave an Ω(W) based on RES of 0.93.

7. Music and Entropy [29]. Why we are delighted by music remains a mystery and an old subject for discussion. Perhaps it is the sound patterns we find in it that make us like music. If so, then the degree of understanding (predictability) of those patterns should be a source of pleasure, and therefore the organization of the information perceived as music should be related to the degree of success of any musical piece. Obviously, there are factors such as personal experience and culture that have an important influence over the process of interpreting a sequence of sounds. We may or may not like some specific type of music, but this only supports our belief that pleasure, especially in the case of music, comes with a certain degree of predictability of the sounds. This is not to oversimplify the phenomenon of pleasure and its causes; in fact, sometimes pleasure comes with the surprise of hearing a different path of sounds and contrasting it with the path we expected before actually listening to the piece. But in any case, we think the popularity of a music piece results from the way the composer organized the sound patterns.

TABLE A4
Comparison of Entropy for MIDI Music of the Baroque and Classical Periods and Some of the Most Recognized Pieces in the Same Periods

Composer    | Period  | Piece                        | H
Vivaldi     | Baroque | Four Stations.Spring         | 0.6025
Bach        | Baroque | TOCC&FUG.Organ               | 0.5372
Handel      | Baroque | Hallelujah                   | 0.4897
Beethoven   | Classic | Symph.9_4                    | 0.4658
Mozart      | Classic | Eine Kleine Nachtmusik K525  | 0.5865
6 composers | Baroque | 55 pieces                    | Average H: 0.5714
5 composers | Classic | 45 pieces                    | Average H: 0.5553

Average H of the recognized pieces listed: 0.5431 (Baroque), 0.5262 (Classic), 0.5346 (both periods). Average H of the larger samples: 0.56335 (both periods).

TABLE A5
Comparison of Entropy for Two Pieces of MIDI Music Interpreted by Different Numbers of Instruments

Composer         | Period       | Piece                      | Instruments | H
Rimsky-Korsakov  | Romantic     | FlyOfBumblebee.Orchestra   | about 10    | 0.5530
Rimsky-Korsakov  | Romantic     | FlyOfBumblebee.Piano1      | piano solo  | 0.7154
Rimsky-Korsakov  | Romantic     | FlyOfBumblebee.Piano2      | piano solo  | 0.5972
Rimsky-Korsakov  | Romantic     | FlyOfBumblebee.Piano3      | piano solo  | 0.6724
Rachmaninov      | 20th Century | PianoConcerto.Nro2         | about 10    | 0.5166
Rachmaninov      | 20th Century | PianoConcerto.Nro2         | piano solo  | 0.5291

For a large number of symbols (sounds in this case), the more understandable patterns are those in which the sounds are arranged to lower the entropy of the whole set of symbols. The music most people like has a relatively low entropy and therefore a low-information structure that we can recall, and from which we can build a clearer map of expectations for the upcoming sounds. We have selected some of the most recognized music pieces from the Baroque and Classical periods and compare the entropy of these pieces with the average entropy of many music pieces from the same periods. Results are presented in Tables A4 and A5, showing that the value of Ω(N) in this case is 1.04. Vaclav Smil [35] showed how civilization and culture arose thanks to the control and more effective use of ever greater sources of available energy across the eras of humankind. In a similar way, though at a much smaller scale, we think music may represent the same phenomenon when multiple instruments have to be played simultaneously to produce a "better" sound by "filling" specific harmonics and, in spite of the increasing control effort needed, create a more enjoyable sequence of sounds, represented by a generally lower entropy. H changed according to the estimated number of instruments used for the recordings. When comparing the minimum (a single instrument) with the maximum (orchestra), we find a tendency for the larger number of instruments to cause an entropy reduction, suggesting an increase of structure for the musical pieces interpreted by a larger number of instruments. We think this might be the result of the proper use of the additional degrees of freedom introduced by the addition of instruments. We should highlight the fact that adding instruments increases the difficulty of keeping harmonies and synchronization in polyphonic music, approaching a point at which the risk of losing the structure and beauty of the sound becomes an important factor in establishing the limits of the size of an orchestra. Considering the two cases analyzed here, we estimate an Ω(N) ranging from 1.02 to 1.20 (average 1.1).
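As a rough check of these figures (our own reading of Table A5, assuming Ω(N) is taken here as the ratio of the solo-piano entropy to the orchestral entropy of the same piece): for Rachmaninov's Piano Concerto No. 2, 0.5291/0.5166 ≈ 1.02; for Rimsky-Korsakov's Flight of the Bumblebee, averaging the three piano versions gives (0.7154 + 0.5972 + 0.6724)/3 ≈ 0.662, and 0.662/0.5530 ≈ 1.20.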

REFERENCES
1. Pernu, T. K.; Annila, A. Natural emergence. Complexity 2012, 17, 44–47.
2. Camazine, S.; Deneubourg, J. L.; Franks, N. R.; Sneyd, J.; Theraulaz, G.; Bonabeau, E. Self-Organization in Biological Systems; Princeton University Press, 2003; pp 538.
3. Annila, A.; Kuismanen, E. Natural hierarchy emerges from energy dispersal. Biosystems 2009, 227–233.
4. Chatterjee, A. Thermodynamics of action and organization in a system. Complexity 2015. DOI 10.1002/cplx.21744.
5. Lucia, U. Stationary open systems: A brief review on contemporary theories on irreversibility. Physica A 2013, 392, 1051–1062.
6. Sherrington, C. S. The Integrative Action of the Nervous System; CUP Archive, 1916.
7. Haken, H. Synergetics, An Introduction; Springer: Berlin, 2004.
8. Corning, P. A. The Synergism Hypothesis: A Theory of Progressive Evolution; McGraw-Hill: New York, 1983.
9. Jaffe, K. Visualizing the Invisible Hand of Markets: Simulating complex dynamic economic interactions. Intell Syst Account Finance Manag 2015, 22, 132-115.
10. Jaffe, K. Extended Inclusive Fitness Theory bridges Economics and Biology through a common understanding of Social Synergy. ArXiv 2015.
11. Georgiev, G. Y.; Henry, K.; Bates, T.; Gombos, E.; Casey, A.; Daly, M.; Vinod, A.; Lee, H. Mechanism of Organization Increase in Complex Systems. Complexity 2015, 21, 18–28.
12. Friston, K. A Free Energy Principle for Biological Systems. Entropy 2012, 14, 2100–2121.
13. Corning, P. A.; Szathmáry, E. 'Synergistic selection': A Darwinian frame for the evolution of complexity. J Theor Biol 2015, 371, 45–58.
14. Gershenson, C.; Fernandez, N. Complexity and Information: Measuring Emergence, Self-organization, and Homeostasis at Multiple Scales. Complexity 2012, 18. DOI 10.1002/cplx.21424.


15. Schwarz, G. E. Estimating the dimension of a model. Ann Stat 1978, 6, 461–464.
16. Jaffe, K. Negentropy and the evolution of chemical recruitment in ants. J Theor Biol 1984, 106, 587–604.
17. Adami, C. What is information? Philos Trans R Soc A 2016, 374, 20150230.
18. Shannon, C. E. A Mathematical Theory of Communication. Bell Syst Tech J 1948, 27, 379–423, 623–656.
19. Kolmogorov, A. Three Approaches to the Quantitative Definition of Information. Problems Inform Transmission 1965, 1, 1–7.
20. Grünwald, P. D.; Vitányi, P. M. B. Kolmogorov Complexity and Information Theory: With an Interpretation in Terms of Questions and Answers. J Logic Lang Inform 2003, 12, 497. http://www.jstor.org/stable/40167357
21. Jaffe, K. What is Science? An Interdisciplinary View; Amazon Books: Caracas, 2010.
22. Jaffe, K. Quantifying social synergy in insect and human societies. Behav Ecol Sociobiol 2010, 64, 1721–1724.
23. Jaffe, K.; Hebling-Beraldo, M. J. Oxygen consumption and the evolution of order: Negentropy criteria applied to the evolution of ants. Experientia 1993, 49, 587–592.
24. Jaffe, K.; Rios, A.; Florez, A. Statistics shows that economic prosperity needs both high scientific productivity and complex technological knowledge, but in different ways. Interciencia 2013, 38, 150–156.
25. Jaffe, K.; Caicedo, M.; Manzanares, M.; Rios, A.; Florez, A.; Montoreano, C.; Davila, V. Productivity in physical and chemical science predicts the future economic growth of developing countries better than other indices for knowledge. PLoS One 2013, 0066239.
26. Jaffe, K.; Perez, E. A comparative study of brain morphology in ants. Brain Behav Evol 1989, 33, 25–33.
27. Febres, G.; Jaffe, K.; Gershenson, C. Complexity measures of natural and artificial languages. Complexity 2014, 20, 429–453. DOI 10.1002/cplx.21529.
28. Febres, G.; Jaffe, K. Quantifying structure differences in literature using symbolic diversity and entropy criteria. J Quant Linguistics 2016.
29. Febres, G.; Jaffe, K. Music viewed by its entropy content: A novel window for comparative analysis. ArXiv 2016.
30. Carnot, S. Reflections on the Motive Power of Heat; Thurston, R. H., Ed. and translator; Wiley: New York, 1824.
31. Hausmann, R.; Hidalgo, C. A.; Bustos, S.; Coscia, M.; Chung, S.; Jimenez, J.; Simoes, A.; Yıldırım, M. A. The Atlas of Economic Complexity: Mapping Paths to Prosperity; MIT Press, 2014.
32. Laursen, K. Revealed Comparative Advantage and the Alternatives as Measures of International Specialization. DRUID Working Papers 98-30, DRUID, Copenhagen Business School, Department of Industrial Economics and Strategy/Aalborg University, 1998.
33. Flesch, R. How to Test Readability; Harper & Brothers: New York, 1951.
34. Szigriszt-Pazos, F. Sistemas Predictivos de Legibilidad del Mensaje Escrito: Fórmula de Perspicuidad; Universidad Complutense: Madrid, Spain, 1993.
35. Smil, V. World History and Energy. In Encyclopedia of Energy; 2004; Vol. 6.
36. Corning, P. A.; Kline, S. J. Thermodynamics, Information and Life Revisited, Part II: Thermoeconomics and Control Information. Syst Res Behav Sci 1998, 15, 453–482.
37. Corning, P. A.; Kline, S. J. Thermodynamics, Information and Life Revisited, Part I: To Be or Entropy. Syst Res Behav Sci 1998, 15, 273–295.
38. Corning, P. A. Control Information Theory: The "Missing Link" in the Science of Cybernetics. Syst Res Behav Sci 2007, 24, 297–311.
39. Lopez-Ruiz, R.; Mancini, H. L.; Calbet, X. A statistical measure of complexity. Phys Lett A 1995, 209, 321–326.

