The role of entropy in biological systems

Dr.Vishnu
Dr.Vishnu
Joined: 22 Nov 05
Posts: 38
Credit: 337775
RAC: 0
Topic 192106

Would readers like to participate in a philosophical discussion about the above well-written topic, which explains all the possible actions of fringe medicine and complementary medicine on the human system with the present system of logic available for use?

tullio
tullio
Joined: 22 Jan 05
Posts: 2118
Credit: 61407735
RAC: 0

The role of entropy in biological systems

Quote:
Would readers like to participate in a philosophical discussion about the above well-written topic, which explains all the possible actions of fringe medicine and complementary medicine on the human system with the present system of logic available for use?


This has been debated by E. Schroedinger in his book "What is Life?" (1944), which gave rise to molecular biology, and, subsequently, by I. Prigogine and his school in Brussels. I don't think you can say much more. Both men earned a Nobel Prize.
That is if you mean thermodynamic entropy. There is, however, another entropy, related to information theory, which was introduced by C. E. Shannon in his seminal article "A Mathematical Theory of Communication" in 1948.
The use of the term "entropy" was suggested to him by John von Neumann. To which entropy do you refer?
Tullio

gravitonring
gravitonring
Joined: 19 Oct 06
Posts: 170
Credit: 8508
RAC: 0

are these questions or

Message 53237 in response to message 53236

Are these questions or subjects related to the topic:

Could there be a particle, an energy, or a completely non-physical aspect to life?
There is chemical folding in non-DNA molecules; is this, or particle entanglement, life?
http://www.math.uni-hamburg.de/home/gunesch/Entropy/bio.html
Entropy in the Biological Sciences
http://www.endeav.org/evolut/entropy/index.htm
On Thermodynamics, Entropy and Evolution of Biological Systems:
What is Life from a Physical Chemist's Viewpoint

http://en.wikipedia.org/wiki/Mimivirus

Quote:
Because its lineage is very old and could have emerged prior to cellular organisms, mimivirus has added to the debate over the origins of life. Some genes unique to mimivirus, including those coding for the capsid, have been conserved in a variety of viruses which infect organisms from all domains - Eukaryota, Archaea and Prokaryota. This has been used to suggest that mimivirus is related to a type of DNA virus that emerged before cellular organisms and played a key role in the development of all life on Earth[6]. An alternative hypothesis is that there were three distinct types of DNA viruses that were involved in generating the three known domains of life [7].

everything is true, the opposite of everything is also true

Dr.Vishnu
Dr.Vishnu
Joined: 22 Nov 05
Posts: 38
Credit: 337775
RAC: 0

RE: RE: Will readers like

Message 53240 in response to message 53236

Quote:
Quote:
Would readers like to participate in a philosophical discussion about the above well-written topic, which explains all the possible actions of fringe medicine and complementary medicine on the human system with the present system of logic available for use?

This has been debated by E. Schroedinger in his book "What is Life?" (1944), which gave rise to molecular biology, and, subsequently, by I. Prigogine and his school in Brussels. I don't think you can say much more. Both men earned a Nobel Prize.
That is if you mean thermodynamic entropy. There is, however, another entropy, related to information theory, which was introduced by C. E. Shannon in his seminal article "A Mathematical Theory of Communication" in 1948.
The use of the term "entropy" was suggested to him by John von Neumann. To which entropy do you refer?
Tullio


Dear Sir, thanks a lot; interest has been created. Please follow through as I place the whole article here for readers to analyze and start the discussion.

Dr.Vishnu
Dr.Vishnu
Joined: 22 Nov 05
Posts: 38
Credit: 337775
RAC: 0

RE: RE: RE: Will

Message 53241 in response to message 53240

Quote:
Quote:
Quote:
Would readers like to participate in a philosophical discussion about the above well-written topic, which explains all the possible actions of fringe medicine and complementary medicine on the human system with the present system of logic available for use?

This has been debated by E. Schroedinger in his book "What is Life?" (1944), which gave rise to molecular biology, and, subsequently, by I. Prigogine and his school in Brussels. I don't think you can say much more. Both men earned a Nobel Prize.
That is if you mean thermodynamic entropy. There is, however, another entropy, related to information theory, which was introduced by C. E. Shannon in his seminal article "A Mathematical Theory of Communication" in 1948.
The use of the term "entropy" was suggested to him by John von Neumann. To which entropy do you refer?
Tullio

Dear Sir, thanks a lot; interest has been created. Please follow through as I place the whole article here for readers to analyze and start the discussion.

Role of Entropy in Biological Systems

Eduardo Alfredo Zevallos-Giampietri, M. D.
Carlos Barrionuevo, M.D.
Departamento de Patología, Instituto de Enfermedades Neoplasicas, Lima, Perú

Vishnu S. Shukla, M. D.
Phoenix Cancer Help Group, Green Acres, ITC, Bangalore, India

Address correspondence:
Eduardo A. Zevallos Giampietri, M.D.
Clínica San Marcos & Instituto Diagnóstico Cayetano Heredia
Jr. A. B. Leguía 604
Tarapoto, San Martín
Perú
Telefax: + 51 42 523838
E-mail:

Running head: Entropy, Biology, Health, Disease, Life Evolution.
Abstract

In this article the authors use an extended concept of Entropy to understand biological systems. Entropy is considered a universal function that can be applied in different scenarios, from the Planck level to the cosmological level. Establishing the universal importance of Entropy is crucial to understanding the behavior of biological systems. Hence, concepts such as life, evolution, reproduction, health and disease acquire a profound significance. Interpretation of data depends on the relativistic position of the observer, either internal or external to the system. Entropy is highest (Smax) for an external observer and lowest (Smin) for an internal observer. This corresponds to the "outside view" and "inside view" that emerged in the context of quantum mechanics. If this essential fact is not taken into account, fallacies are inevitable, and subsequently weird interpretations and twisted beliefs take hold in all types of scenarios. In addition, the authors explain the concept of intrinsic or physiological time based on Smax/Smin, associated with cybernetic concepts.

Key words: biology, death, disease, dissipation, evolution, health, information, life, maximum entropy, second law, thermodynamics.

Introduction

Medicine is not an independent discipline; it can be placed into the context of well-established physical laws and scientific principles. In particular, the Second Law of Thermodynamics (1), Popper's falsification principle (2), and Cybernetics are very suitable for this purpose. Here, Entropy is treated as a broad physical principle. Entropy is one of the most controversial concepts ever described. Clausius, who discovered this property of nature in 1865, used this term (in German "Entropie", from the Greek entrope, change) to denote conversion of energy. All types of energy can potentially be interconverted among themselves. Clausius' discovery was based on Carnot's principle that no machine can be perfectly efficient. In the process of energy interconversion, part of the total energy cannot be reused. Despite the conservation of energy, this apparently unrecoverable share of energy is integrated into the rest of the universe in the form of heat. Meanwhile, and simultaneously, the temperature of the universe decreases as the universe expands. The Second Law is like a security device for the First Law, and it reassures that equilibrium (maximum entropy) will be reached (Third Law). In fact Clausius concisely summarized the laws of thermodynamics as "the energy of the universe is constant, while the entropy of the universe tends to be maximal". The Second Law of Thermodynamics is one of the most fundamental principles; it says that when one type of energy converts into another inside a system, some amount of energy must always be transformed into heat, which dissipates into the rest of the universe. This may reflect a basic mechanism that guarantees consistency and historical conservation in the universe. Therefore, as there is equivalence between energy and mass through the Special Theory of Relativity, all the mass in the universe is dissipating. Unrecoverable heat can cautiously be referred to as "useless" energy. On the other hand, the energy that produces effective work can be considered "useful" energy (at the very moment that such work is being done). Entropy is a measurement of the interrelations inside the system, which determine the outcome of heat and work. Entropy is a measurement of the uncertainty of the destination of energy. Heat represents the part of the information that is lost, while work represents the information that is preserved (3).
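
As a minimal numerical sketch of the Second Law discussed above (an assumed toy example, not part of the manuscript): when heat Q leaves a reservoir at temperature T its entropy changes by -Q/T, and the receiving reservoir gains Q/T, so heat flowing from hot to cold always yields a non-negative total entropy change.

# Toy illustration (assumed example): entropy bookkeeping for heat flowing
# between two ideal reservoirs, using dS = Q / T for each reservoir.

def entropy_change_of_heat_flow(q_joules, t_hot_k, t_cold_k):
    """Return (dS_hot, dS_cold, dS_total) when q_joules flows hot -> cold."""
    ds_hot = -q_joules / t_hot_k      # hot reservoir loses heat
    ds_cold = q_joules / t_cold_k     # cold reservoir gains heat
    return ds_hot, ds_cold, ds_hot + ds_cold

if __name__ == "__main__":
    # 1000 J flowing from a 310 K body to a 290 K environment
    ds_hot, ds_cold, ds_total = entropy_change_of_heat_flow(1000.0, 310.0, 290.0)
    print(f"dS_hot   = {ds_hot:+.4f} J/K")
    print(f"dS_cold  = {ds_cold:+.4f} J/K")
    print(f"dS_total = {ds_total:+.4f} J/K  (always >= 0 by the Second Law)")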

Entropy can be conceptualized as a broad principle in the development of systems. Clausius' (thermodynamic) Entropy, Boltzmann's (probabilistic) Entropy, and Shannon's (information) Entropy are all related, and they provide a basic framework for understanding Entropy at different levels, including the Planck, quantum, Newtonian, relativistic, and cosmological levels. Since macroscopic physical "reality" emerges from the quantum level, Heisenberg's Uncertainty Principle (4) can be applied to the study of systems, and ultimately it may be the basis of evolution. A crucial relationship between Shannon's Entropy and Cybernetics was established by Ashby's Law of Requisite Variety: "The law of Requisite Variety says that R's capacity as a regulator cannot exceed R's capacity as a channel of communication…the law of Requisite Variety can be shown in exact relation to Shannon's Theorem 10, which says that if noise appears in a message, the amount of information that can be removed by a correction channel is limited to the amount of information that can be carried by that channel…Thus the use of a regulator to achieve homeostasis and the use of a correction channel to suppress noise are homologous" (5). We think that all types of definitions in the context of complex systems must be avoided, because definitions inevitably fall into anthropomorphic or reductionist fallacies.
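
A minimal sketch (an assumed illustration, not from the original article) of Shannon's entropy H = -sum(p * log2 p), the quantity that the Law of Requisite Variety ties to a regulator's channel capacity:

import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

if __name__ == "__main__":
    fair_coin   = [0.5, 0.5]
    biased_coin = [0.9, 0.1]
    print(f"H(fair coin)   = {shannon_entropy(fair_coin):.3f} bits")   # 1.000
    print(f"H(biased coin) = {shannon_entropy(biased_coin):.3f} bits") # ~0.469
    # Requisite Variety (informally): a regulator that can transmit only
    # H_R bits per step cannot suppress more than H_R bits of disturbance.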

Medicine is not a science, as it lacks a theory and so far cannot be expressed in logical truth values. This weakness is justified among physicians by appealing to "inevitable" subjective factors. However, there is no intellectual effort in medical circles to explain what these subjective factors consist of, or how they arise. For the same reason, health activities, including facilities and providers, are closed circles that have "invented" for themselves a usually non-scientific terminology and doubtful formalities. Moreover, physicians' attempts at finding "hidden philosophies" have so far been worthless, because they are restricted to the rules imposed in such closed circles. The health system, in this sense, projects this inadequate approach onto conventional medical schools. Therefore, physicians are mostly unaware of this lack of "philosophy", or they consider with resignation that this deficiency is unsolvable. So-called basic sciences such as Biochemistry and Physiology are also unable to unveil such "hidden philosophies", since they too are influenced by the prevailing anthropomorphic orientation. Contrarily and paradoxically, non-conventional medicine has more of a scientific basis because it has a dynamical, non-linear scope, as it usually focuses on body and mind in a comprehensive (holistic) manner. Health and disease, to the best of our knowledge, have not been fully associated with physics or mathematics. The question of internal and external observers also comes into play with Yoga, as does the subjectivity and objectivity of modern medicine (6). Therefore, we go further and conjecture a model of health and disease based on Entropy.

In this article Entropy is placed into a biological perspective, and its importance is extended to health and disease. This is important for the perspective of Entropy as a probabilistic measurement in the diagnostic process (manuscript in preparation).

Biological systems

Thermodynamically, a system can be described by its state, which includes the overall properties of the system such as pressure, volume, and temperature, as well as composition. It is not a coincidence that Thermodynamics is also known as Statistical Mechanics, since its principles can be obtained from physics as well as in a probabilistic manner (3).

According to their interaction with the environment, systems can be separated as follows:
1. Systems in equilibrium with the medium
ƒ = 0
Si = Sext (> 0)
- "Energy" only flows
- Organic polymerization not significant
2. Systems far from equilibrium with the medium
ƒ > 0
Si < Sext (> 0)
- "Energy" produces a function (heat, mechanical and chemical work)
- Organic polymerization significant (proteins, nucleic acids, etc.)

Where,
ƒ is the function of the system,
Si is the internal Entropy, and
Sext is the external (environmental) Entropy.

Essentially, this is an arbitrary thermodynamical division of systems to highlight the behavior of Entropy at the two extremes: closed and open systems. Ultimately, in all systems inside the universe there must be some sort of matter and/or energy flux; thus, strictly and broadly speaking, all systems are truly open. Therefore classical thermodynamics, dealing with "isolated" systems, is unrealistic. Biological systems are at an extreme far-from-equilibrium status, which gives them a false sense of dislodgement from the medium. The anthropic principle may be rooted in this fallacy. In open systems, such as biological ones, the informational or working energy can be replenished, allowing the flow of "new" energy to produce more informational or working energy. In consequence, it is wrong to say that the Entropy of an open system must always increase.

We shall generalize the concept of Entropy as "the development of the elements of the systems". This is not a strict definition. Notice that terms such as energy and time are not used. Besides, the term element has to be taken in its broadest meaning, since the elements determine the function, and, vice versa, the function determines the elements. For the same reason we do not use a term such as "interaction" (of elements), precisely to avoid separating the elements from the function. At a quantum level, similarly, there is no difference between particles and waves. Moreover, it is likely that at the Planck-quantum interface this "smearing effect" is even blurrier. Entropy deals with the development towards equilibrium of the overall elements and sets of elements. Time is not primordial, since equilibrium inevitably has to be reached. This concept regards Entropy as "development" and therefore implies a dynamical and measurable process similar to Information. Because of this "development", Entropy also has the connotation of a measurement of freedom. One bit of information means the probability between two states, but the same bit also allows choosing one of the two states. This freedom is rooted in quantum mechanics as uncertainty and non-determinism, and is reflected in the macroscopic world as evolution.

"While some parts of the universe may operate like machines, these are closed systems, and closed systems, at best, form only a small part of the physical universe. Most phenomena of interest to us are, in fact, open systems, exchanging energy or matter (and one might add, information) with their environment. Surely biological and social systems are open, which means that the attempt to understand them in mechanistic terms is doomed to failure� (7). Biological systems function at their own level of “relatively Smax� (eigen, proper, intrinsic or physiological entropy). This is an extension of Jaynes’s Principle of Maximum Entropy (8), which for an external observer appears to be the most frequent state of a system, meaning more Entropy, and therefore more uncertainty. However different combinations can produce the same state or result (9 – 12). Jaynes’ principle of Maximum Entropy has the advantage that maximizes the uncertainty of the observer; therefore this observer cannot have more information than is actually available. Though, this intrinsic or “eigen� entropy is lower compared to that of the environment. An inherent condition of biological systems is that energy flows through their constituents (cells, organelles, molecules, etc.). The role of the metabolic work is, in fact, the reduction of the internal Entropy compared with that of the environment. This is essentially the principle of what is life.

Time, after all, is an interim term, and its measurement is relativistic (13). Each system in the universe carries its own intrinsic time (eigen time) (14), likely implicit in the system's own intrinsic Entropy (eigen Entropy). Systems are in a dynamical status between two extremes. On one side they tend towards their own Smax (that is, the maximum "allowed" eigen entropy). Meanwhile, on the other side, they try to avert the surrounding Entropy (Suniverse). To persist as such, the system has to stay away from both extremes. If it reaches the exact status of Smax, the system collapses. However, if the system cannot sustain the tendency towards Smax, then it starts to merge with the rest of the universe, a process known as dissipation (7). At both extremes systems cease to exist. All systems tend towards their Smax, and so do biological systems. Since biological systems are extremely complex, this dynamical "struggle" between Smax and Suniverse is not well understood. The more complex a system is, the more words and the more complex models are necessary to explain it; therefore it is easier to produce paradoxes and fallacies. In fact, a full explanation of a complex system can be accomplished only by reproducing exactly the same system. "Explicit definitions for central concepts concerning complexly organized systems are often not just impossible to provide, but can be quite misleading. The reason is simple: Explicit definitions place the defined term on only one side of the definition, so that all explicitly defined concepts are in principle eliminable" (15). Our reasoning is akin to Ashby's concept of regulation of very large systems; in fact this scientist says, "Then he [Sir Ronald Fisher] showed that any given extraction of information had a maximum, and the statistician's duty was simply to get near the maximum - beyond that no man could go. Similarly, before Shannon's work it was thought that any channel, with a little more skill, could be modified to carry a little more information. He showed that the engineer's duty is to get reasonably near the maximum, for beyond it no-one can go. The law of Requisite Variety enforces a similar strategy on the would-be regulator and controller: he should try to get near his maximum - beyond that he cannot go." (5). Moreover, complex systems naturally produce statements lacking supporting theorems inside them (16, 17). This fact may be related to the unavoidable increase of Entropy inside systems, which can also be considered as error accumulation.

The “struggle� Smax/Suniverse determines all the complex biological functions, which in turn maintain integrity and function. This is indeed the thermodynamic basis of life. Life frequently is referred as to something ethereous. Life, indeed, is the overall thermodynamical expression of high carbon-based complex systems that are called organisms, including the human species. Biological systems are governed by the same universal thermodynamical principles that apply equally to every system of this universe. Similarly to any other systems inside this universe, biological systems are constantly changing, as well as transforming themselves into non-informational energy (heat). This is essentially the core of the dissipation process (7). Not only biological systems have a dissipative structure, but also everything inside the universe, and possibly the universe itself. A universal “dissipation principle� can be conjectured, which perhaps is in correspondence with Thermodynamics and Information theory. Gravitation itself might be a macro-aspect of Smax, acting as a physical tendency to reduce the Entropy of the systems (18). Suniverse, on the other hand, may be driven by the dark energy (or by the boundary of the universe), as a physical tendency towards dissipation or “antigravitation�. Very speculatively speaking, in analogy to the elusive “graviton�, there may be “particles� associated to Information (to say the “informon�), and even associated to observation (to say the “observon�, or the “voyeuron�). Neutrinos and/or neutralinos may play an important role in the process of information.

In biological systems, an increase of Entropy is somewhat undesirable or unpleasant, but unavoidable. For instance, when we are hungry, an unpleasant sensation, it signifies that our body is tending to high Entropy and dissipating. Food is made up of elements that are either at lower, or not too much higher, Entropy compared to that of our own body. Indeed, metabolism has a functional specificity to decrease Entropy, since enzymes are highly efficient factories of low Entropy (19). Metabolism, potentially, can be translated into informatics. Enzymes are essentially polymers. Polymers (macromolecules) have a "self-assembly" property according to their sequence and/or interaction with the medium. This property is a stochastic process that can be traced back to the origin of planet Earth. A similar extension of this principle is that of the Information Gathering and Using System (IGUS); in the case of proteins, the amino-acid sequence makes measurements while it is growing to "attain a high unique stable native form that promotes the updating of the information content", which means that proteins tend to be at equilibrium, at their Smax. Through IGUS, proteins are comparable to a Maxwell Demon-like activity as they reduce Entropy, but unlike the Demon they cannot violate the Second Law (20). Melkikh uses a similar cybernetic approach regarding DNA and quantum behavior (21); however, we think he falls into a reductionist trap because DNA is not necessarily the primordial polymer that characterizes the function "life". It would have been more interesting if this author had performed a probabilistic analysis relating DNA, proteins and species. We conjecture that "packing" or "quantizing" is a basic universal principle, which counteracts variability. In fact, as proposed by Everett and further elaborated by Schmidt and Zeh (22), for the outcome of our "reality" there is "no collapse" of the wave of the universe, but instead branching, and consequently the development of multiple realities (multiverse). The more stable or robust configurations prevail (but we cannot ascertain which are the more stable universes). We see such "branching" of the wave as an averaging process that reduces variability. In fact this averaging can be in compliance with Cybernetics principles such as Ashby's Law of Requisite Variety (5), and the Principle of Selective Retention, or "More stable configurations are less easily eliminated than less stable ones" (23). A relation between Cybernetics and Chaos Theory can be traced back to Ashby's concept of the Markovian machine and his analysis of stability, when he says, "The stable region is a set of states such that once the representative point has entered a state in the set it can never leave this set…A state of equilibrium is simply the region shrunk to a single state. Just as, in the determinate system, all machines started in a basin will come to a state of equilibrium, if one exists, so too do the Markovian; and the state of equilibrium is sometimes called an absorbing state…As is now well known, a system around a state of equilibrium behaves as if 'goal-seeking', the state being the goal…Thus, the objective properties of getting success by trial and error are shown when a Markovian machine moves to a state of equilibrium" (5). Here we can see the analogy between Ashby's "stable region" concept and that of the "attractor" in Chaos Theory.
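
A toy sketch (with assumed transition probabilities, not taken from Ashby) of the Markovian machine and absorbing state quoted above: wherever the chain starts, it eventually becomes trapped in the absorbing state, the analogue of the "stable region" or attractor.

import random

# Toy Markovian machine: states 0-3, with state 3 absorbing (a "stable region"
# shrunk to a single state). The transition probabilities are made up.
TRANSITIONS = {
    0: [(0, 0.5), (1, 0.3), (2, 0.2)],
    1: [(0, 0.2), (1, 0.3), (2, 0.3), (3, 0.2)],
    2: [(1, 0.4), (2, 0.3), (3, 0.3)],
    3: [(3, 1.0)],                      # absorbing state: once in, never out
}

def step(state):
    r, cum = random.random(), 0.0
    for nxt, p in TRANSITIONS[state]:
        cum += p
        if r < cum:
            return nxt
    return state

def run(start, max_steps=1000):
    state = start
    for t in range(max_steps):
        if state == 3:
            return t          # steps needed to reach the absorbing state
        state = step(state)
    return max_steps

if __name__ == "__main__":
    random.seed(0)
    hits = [run(start=0) for _ in range(10000)]
    print("fraction absorbed:", sum(h < 1000 for h in hits) / len(hits))
    print("mean steps to absorption:", sum(hits) / len(hits))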

On the other hand, pleasant sensations are associated with the reduction of Entropy from a previously increased state of Entropy. Returning to the previous example, an unpleasant sensation such as hunger signifies increasing Entropy. Eating reduces the previously increased Entropy, and, as a pleasant sensation, the amount of pleasure is directly proportional to the degree of previously increased Entropy. However, if we are not hungry, eating is not pleasant, or it is even unpleasant. Perhaps in anorexia nervosa there is a corruption of this entropic mechanism at the hypothalamic level. The behavior of animals, including the human species, can be governed by the same principles. For instance, sexual intercourse renders great pleasure because it is intimately linked to reproduction, which is one of the most remarkable ways to reduce Entropy; therefore this mechanism of gratification ensures the perpetuation of the species. If Entropy is not already increased, the system is thermodynamically indifferent, as there is nothing to decrease and there are no disturbances to constrain (24). Therefore, Entropy may play an important role in internal homeostasis.

A fetus is a very pictorial example of a biological system. It cannot be at equilibrium because it is constantly growing (dividing cells, accommodating cells and making tissues). It acquires energy from the mother's blood through the placenta. This energy is carried by nutrients that cross the placenta into the baby's blood flow. Thermodynamically speaking, to increase its efficiency, the system "baby" has to be far from equilibrium, so that energy can flow more efficiently inside the system "baby". Energy is used to reach and maintain the condition "far from equilibrium". This condition means that the system "baby" has less Entropy than the system "mother". In this model it is relatively simple to elucidate how this "far from equilibrium" property initially arises, because the origin of the system is two half-cells (the oocyte and the spermatozoon) that readily join to form a complete cell. However, where is the real origin of the individual? What is the underlying law that conducts this process? The answer comes by itself: Entropy. The baby represents an intense entropic shortcut to maintain the species.

The Smax of the universe is the final equilibrium, and its Suniverse is at its own boundary. At Smax any system would achieve maximum equilibrium; therefore asymmetry vanishes and symmetry arises, but with an established symmetry information stops, and then the system ceases as such. In biology, maybe this corresponds to the aging process, up to death. After death a biological system tends to become chaotic again and ultimately dissipates into the entropy of the environment, subsequently merging with Suniverse (its "new" attractor). In cosmology, massive stars after dying are shrunk into black holes (25). In both cases, for dead systems and black-hole-converted systems, the fate is merging with the entropy of the universe, which means re-entering "the game", breaking the symmetry, so that asymmetry arises again. That is to say, the Smax of the universe is that for an internal observer, and the Suniverse of the universe is that for an external observer, and perhaps ultimately both are the same. This is the principle of "0" and "1": neither can exist without the other. Collier has hypothesized biological information based on hierarchical levels of information depending on molecular complexity; however, this author claims that there is ambiguity between syntactic information (understood as the information encoded by the parts) and functional information (15). Perhaps the answer to this "gap" is the asymmetric character of Entropy.

Observer’s position

The functionality of biological systems relies on Smax and Suniverse; therefore maximum efficiency is comparable to the actual Entropy level of the organism, and this Entropy level identifies the system itself at any present instant. As an ultimate argument, it may be possible that no system is equal to another, nor even to itself, at any moment. This conjecture is rooted in Heisenberg's Uncertainty Principle. Nonetheless, at least macroscopically, systems apparently maintain a certain integrity and identity through a dynamic interaction with the medium, which means that they are constantly constraining disturbances. At this point we can again infer a relation with Ashby's Law of Requisite Variety, in the sense that systems can overcome a disturbance of the medium only if they have sufficient flexibility or variety to counteract such a disturbance. Meanwhile, the internal tendency towards Smax keeps enough internal variety inside the systems, and therefore maximizes their capacity to beat the disturbances of the medium. However, the internal Smax is always less than the external Suniverse, and this status is achieved through controllers. The controllers must be replicas or "aliquots" of the system (26), which are able to detect variations of the actual internal entropy relative to the Smax of the system. Suniverse minus Smax may be a constant, perhaps similar within the same species or genera, and even possibly proportional among living organisms and non-biological systems. Dewar has proposed a similar theory using Jaynes' principle and probabilistic thermodynamics (27). Therefore, concepts such as "minimum entropy production" and "maximum efficiency" can be redefined through these principles. In fact, the separation of "physiological time" from "metabolic rate" (14), and the apparent impossibility of finding a link between the two, results from this preconceived division between structure and function. Ultimately, the elements and the function of a system are so intimately interrelated that they are the same. Similarly, matter and energy have a common identity through Special Relativity, as do particles and waves in Quantum Mechanics. The observer's constraints and choice of degrees of freedom may also hamper the understanding that non-separation is an inherent condition. Science has to be extended to a vast array of dimensional spacetime. Separation of structure from function is a flaw that arises from being local and reductionist. The powerful Principle of Least Action and its extensions can also be embedded into Entropy and Information (28-30). Therefore, understanding organisms through thermodynamics/informatics, i.e., Entropy, is the most "natural" way to do so.

In Entropy/Information, if the relative position of the observer (internal or external) is not taken into account, then misleading and twisted interpretations are produced in every scenario. Thermodynamically speaking, for an external observer the only important quantity is work, which in Information is analogous to data (bits). Contrarily, for an internal observer heat is the most important, because it increases the Entropy and the capacity (freedom) to produce such work. Of course, not all heat can be converted into work, and a fraction of the heat inevitably diffuses to the medium. This share of energy lost as heat can be seen as an erasing procedure that reassures that more energy (data input) can enter the system to generate more work (data output or information). Therefore Entropy is the overall function that brings possibilities and flexibility to systems. From an external observer's point of view, at the beginning of an observation, the Entropy of a system is highest (Smax) and the potential amount of data (information) is also highest. Conversely, for an internal observer (inside the system) the Entropy and the amount of data (information) are lowest (Smin), since the system itself is the "normal" status of such an observer, and, moreover, the observer is part of the system. Therefore, any observation made by humans is always contaminated by bias.

Particularly in the Many Worlds Interpretation (MWI) of Quantum Mechanics (Everett's theory), the argument about the position of the observer arises again. As a matter of fact, Tegmark says, "Everett's brilliant insight was that the MWI does explain why we perceive randomness even though the Schrödinger equation itself is completely causal. To avoid linguistic confusion, it is crucial that we distinguish between
• the outside view of the world (the way a mathematician thinks of it, i.e., as an evolving wavefunction), and
• the inside view, the way it is perceived from the subjective frog perspective of an observer in it." …
"It is in this sense that the MWI predicts apparent randomness from the inside view while maintaining strict causality from the outside viewpoint…" "The reader must choose between two tenable but diametrically opposite paradigms regarding physical reality and the status of mathematics:
• PARADIGM 1: The outside view (the mathematical structure) is physically real, and the inside view and all the human language we use to describe it is merely a useful approximation for describing our subjective perceptions.
• PARADIGM 2: The subjectively perceived inside view is physically real, and the outside view and all its mathematical language is merely a useful approximation." (31)

Bear in mind that Tegmark's inside and outside views are, so to speak, from a conceptual quantum perspective. Instead, and compounding the issue, our internal and external observers are from a broad thermodynamics/informatics perspective, not only conceptual but also active, as the observers are actively performing measurements. Of course, if such observers were placed at a quantum scale, then we would face a dissertation similar to Tegmark's. We prefer Tegmark's Paradigm 1, as it is more precise and likely not interfered with by semantics (or semiotics). Since Entropy/Information, according to Bekenstein, has a physical location, it is more likely that physical reality evolves from Entropy/Information and is explained by theories "living" at the locus of Entropy/Information. Bekenstein noticed this similarity and proposed the identity of the area of the event horizon with entropy in 1972. Bekenstein's insight is a milestone in theoretical physics. In addition to the interconversion of matter and energy through Special Relativity, both matter and energy can then be expressed in another state, such as Entropy (32).
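
As a small numerical sketch (an assumed example, not part of the manuscript) of Bekenstein's identification of horizon area with entropy, the Bekenstein-Hawking formula S = k_B * c^3 * A / (4 * G * hbar) can be evaluated for a Schwarzschild black hole of one solar mass:

import math

# Physical constants (SI units)
k_B   = 1.380649e-23      # Boltzmann constant, J/K
c     = 2.99792458e8      # speed of light, m/s
G     = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar  = 1.054571817e-34   # reduced Planck constant, J*s
M_sun = 1.98892e30        # solar mass, kg

def bekenstein_hawking_entropy(mass_kg):
    """Entropy (J/K) of a Schwarzschild black hole of the given mass."""
    r_s = 2 * G * mass_kg / c**2            # Schwarzschild radius
    area = 4 * math.pi * r_s**2             # horizon area
    return k_B * c**3 * area / (4 * G * hbar)

if __name__ == "__main__":
    S = bekenstein_hawking_entropy(M_sun)
    print(f"S(1 solar mass) ~ {S:.2e} J/K  (~{S / k_B:.2e} in units of k_B)")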

We think that well-known paradoxes (i.e., quantum suicide, quantum holocaust, quantum immortality, etc.), arising from the MWI as extensions of Schrödinger's cat, are a consequence of failing to consider the position of the observers, including the one (like us) who is reading or thinking about these paradoxes. Briefly, the overall setting of such paradoxes is that a "predominantly quantum system" (or mechanism) is confronted with "predominantly macroscopic systems", and, moreover, the outcome is realized at a macroscopic scale (by a human mind). This assertion does not dichotomize what is "quantum" from what is "macroscopic", because we are conscious that the macroscopic realm arises from an underlying quantum realm. The "abracadabra" of these paradoxes arises when measurements at different scales are mixed; this blurs (like some sort of smoke curtain) the awareness of the external observer (whoever is doing the experiment or even reading the paradox itself). Basically, the "macroscopic realm" has far fewer degrees of freedom than the "predominantly quantum realm". Conversely, once a quantum event is placed and observed within the "macroscopic realm", the degrees of freedom of such a quantum event are "reduced", as they are aligned with those of the "macroscopic realm". Therefore the uncertainty of the quantum event is apparently also reduced. Another "abracadabra" is that the "predominantly macroscopic" and the "predominantly quantum" systems are apparently treated as completely isolated; for instance, placing Schrödinger's cat inside a box gives the impression that they do not interact with the outside world (stressing that there is no interaction with the observer situated inside the experiment itself, but outside the box). In the Schrödinger's cat situation the poor animal is dead and alive at once, as the result of some sort of "propagation" of uncertainty coming from the "predominantly quantum system" (we do not see how this is possible!). In the case of "quantum suicide" there is not only such a "propagation" of uncertainty from the quantum system, but also, in the context of the MWI, the "predominantly macroscopic system" (i.e., the suicide-prospective scientist) acquires uncertainty, which keeps him forever alive (unable to die), while his assistant witnesses the scientist's death. In these thought experiments there is another catch: even if there is apparently no observation of the systems in question, this does not mean that the uncertainty is not already reduced to match the permissible degrees of freedom of the "macroscopic realm". "Even if the many worlds interpretation is correct, the measure (given in MWI by the squared norm of the wavefunction) of the surviving copies of the physicist will decrease by 50% with each run of the experiment. This is equivalent to a single-world situation in which one starts off many copies of the physicist, and the number of surviving copies is decreased by 50% with each run. Therefore, the quantum nature of the experiment provides no benefit to the physicist; in terms of his life expectancy or rational decision making, or even in terms of his trying to decide whether the many-worlds interpretation is correct, the many-worlds interpretation gives results that are the same as that of a single-world interpretation" (33). This means that the same results are obtained using the Copenhagen wave-collapse interpretation and the "non-collapse" MWI. In fact Tegmark says,
• “What Everett does NOT postulate:
At certain magic instances, the world undergoes some sort of metaphysical “split� into two branches that subsequently never interact.�
In essence, what Everett says is that there are superpositions of possible outcomes, and inside each one the observer sees only his respective outcome. The same author continues, “According to the MWI, there is, was and always will be only one wavefunction, and only decoherence calculations, not postulates, can tell us when it is a good approximation to treat two terms as non-interacting…When confronted with experimental questions, adherents of the first four [Copenhagen, Many worlds, Bohm, Consistent histories] will all agree on the following cookbook prescription for how to compute the right answer, which will term the ‘shut-up-and calculate’ recipe:
Use the Schrödinger equation in all your calculations. To compute the probability for what you personally will perceive in the end, simple convert to probabilities in the traditional way at the instant when you become mentally aware of the outcome. In practice you can convert to probabilities much earlier, as soon as the superposition becomes ‘macroscopic’, and you can determine when this occurs by a standard decoherence calculation� (34). The same author, in another publication, says, “Everett’s viewpoint become known as the ‘many worlds’ or, perhaps more appropriately, ‘many minds’ interpretation of quantum mechanics because each of one’s superposed mental states perceives its own world. This viewpoint simplifies the underlying theory by removing the collapse postulate, implying that there is no new undiscovered physics that makes these superposition go away…Even though the wave function technically never collapses in the Everett view, it is generally agreed that decoherence produces an effect that looks like a collapse and smells like a collapse.� (35). Decoherence was first demonstrated by Zeh in 1970. Briefly, it consist on the own “measurement� or “observation� or “scanning� that elements of an environment makes on the objects, as a result of such interaction objects are turn on reality. To say, for example a pen on a desk comes to reality or materializes because is scanned by the environment. The environment is technically the rest of the universe. In the Schrödinger’s cat experiment, therefore, because of decoherence we do not expect that the cat or the gun or the box would split into other worlds just because they are in proximity to a “predominant quantum system�. Despite Tegmark postulate the end result would be the same using the collapse-wave theory (Copenhagen) or the “non-collapse� MWI theory, according to Page there might be differences in different cosmological scenarios with different number of observers, which means different realities (36). This might be possible if our universe is one of multiple waves, and just one of them is what we call the Schrödinger’s equation. The question then is where and how these damned equations appears?
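
A trivially small sketch (assumed numbers, not from the sources quoted) of the "convert to probabilities by the squared norm" step described above, together with the 50%-per-run decrease of the surviving measure mentioned in the quantum-suicide discussion:

# Born-rule bookkeeping for a two-outcome superposition a|live> + b|dead>,
# and the surviving measure after n repetitions (toy numbers, equal amplitudes).
amplitudes = {"live": 2**-0.5, "dead": 2**-0.5}

probabilities = {k: abs(a)**2 for k, a in amplitudes.items()}
print("single-run probabilities:", probabilities)        # 0.5 and 0.5

n_runs = 10
surviving_measure = probabilities["live"] ** n_runs
print(f"measure of the always-surviving branch after {n_runs} runs:"
      f" {surviving_measure:.6f}")                        # 0.5**10 ~ 0.000977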

Tegmark's paradigms should not be arbitrarily mixed; each one has its own scope. Consequently, we consider that applying the so-called "Principal Principle" (37) to analyze "quantum suicide" (38) does not make much sense. We consider that this "principle" is subjective, semiotic, anthropomorphic, and a form of counterfactual internalism. Although Everett's theory has also been called the "Many Minds Interpretation", we think this designation can be misused with an anthropomorphic connotation. However, nothing prevents the possibility of worlds or universes without conscious minds. Indeed, decoherence happens without the presence of brains. Actually, the "minds" part of this theory may produce misinterpretations at the macroscopic level. Thus, according to Papineau, "Peter J. Lewis seems to have no good reason for ascribing the death-defying implication to the many minds theory" (39).

Health and disease

If a biological system at any instant matches its perfect Smax, it collapses. This presupposes that the biological system encounters a state of total equilibrium of all its elements, at all degrees of freedom, and, of course, instantaneously. Under this hypothetical status, at Smax, the biological system halts because no more events are possible. This means that the biological system is unable to produce useful energy; thus it cannot produce work, and so it cannot produce more information. In Chaos Theory terms, the biological system falls into its attractor (which is Smax), and in turn it falls to the "infinitesimal infinite" or zero (40, 41). We are not aware of any known pathological condition associated with perfect equilibrium at Smax. No research has been done on this particular issue. Smax-like conditions may occur at ultrastructural levels, or at "hyperfunctional" levels. Disseminated intravascular coagulation may follow such a thermodynamic mechanism; in fact, at some point most or all coagulation factors may be interacting at a Smax-like status. Atherosclerosis might also be some kind of Smax condition. Well, if a biological system reaches a deleterious Smax status, what is next? Since it cannot sustain its integrity and its function any more, it would be absorbed by Suniverse. However, pathological conditions may primarily follow the opposite thermodynamic route, which is the tendency to merge with Suniverse. We can say that diseases start with increasing Entropy at the ultrastructural, sub-cellular level of an organ. For instance, cancer and autoimmune diseases may follow this entropic pattern. The increasing Entropy disseminates into a cell, then from cell to cell, then to the whole organ, then to the system, and finally to the whole body. Clinical manifestations such as pain, fever, malaise, etc. may be associated with increasing Entropy. When the entire biological system reaches a certain high Entropy level, being unable to sustain function and integrity, it dies. Another possibility is that pathological conditions represent alternating phases of high/low Entropy, resembling different directions in relation to state-space attractors (different Lyapunov exponents) (40). The immune system can be better understood through Chaos Theory, because it is complex, self-assembled, interactive, fractal, evolving, and capable of memory. There is a balance between Th1 and Th2 immune responses; thus the immune system can be conceptualized as a self-referential network revolving around "strange attractors", which in turn are very important in tumorigenesis and cancer treatment (42). Brú et al. have found that malignant tumors show a clustering fractal dimensional growth at their contour, which renders this microenvironment more acidic. Competition with the host for space is apparently the main factor in the growth of tumors. Brú's results contradict classical oncological tumor kinetics and, moreover, challenge the foundations of chemotherapy and radiotherapy; besides, he conjectured that neutrophils, by resisting acidosis and competing for space, may play an important role in inhibiting tumoral expansion (43, 44). In fact, the same group has reported the cure of a terminal hepatocellular carcinoma by inducing the patient's neutrophilia with granulocyte colony-stimulating factor (45).
Understanding boundaries may be extremely important, as all systems follow the holographic principle as the expression of the Information (Entropy) "living" at boundaries (46-49). Goldberger, by analyzing heart rate plotted against time, has conceptualized a similar model of heart disease. Healthy hearts exhibit rate dynamics that are nonlinear and possibly chaotic, which reflects physiological flexibility and adaptability. Contrarily, in heart disease the rates become periodic or totally random (50). Possibly the periodic rate indicates an abnormal tendency towards Smax, in the sense that the functional pathways are being reduced by an equilibrium imposed by the generic attractor of the system.
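
A toy sketch of the periodic-versus-chaotic distinction drawn above, using the logistic map as an assumed stand-in for heart-rate dynamics (this is not Goldberger's data or method): the chaotic regime produces a visibly richer, higher-entropy series than the periodic one.

import math
from collections import Counter

def logistic_series(r, x0=0.5, n=2000, discard=500):
    """Iterate x_{t+1} = r * x_t * (1 - x_t) and return the tail of the orbit."""
    x, out = x0, []
    for t in range(n):
        x = r * x * (1 - x)
        if t >= discard:
            out.append(x)
    return out

def coarse_entropy(series, bins=16):
    """Shannon entropy (bits) of the series after binning into `bins` cells."""
    counts = Counter(min(int(v * bins), bins - 1) for v in series)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

if __name__ == "__main__":
    periodic = logistic_series(r=3.2)   # settles into a period-2 cycle
    chaotic  = logistic_series(r=3.9)   # chaotic regime
    print(f"entropy, periodic regime (r=3.2): {coarse_entropy(periodic):.2f} bits")
    print(f"entropy, chaotic  regime (r=3.9): {coarse_entropy(chaotic):.2f} bits")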

Life and evolution

Looking for “the cause of death� in an autopsy appears a bit odd as thermodynamically speaking death is a whole process, besides the entire body dies. Death is a dynamical process, therefore cannot be localized. The conjecture here is why doctors, then, do not look for “the cause of life�? Seriously, it will be more beneficial.

The basis of what life is lies at the core of the "struggle" between Smax and Suniverse inside the biological system. Entropy, as a universal principle with different aspects, renders integrity and functionality to all systems, including biological systems. Integrity and functionality are ultimately sides of the same coin. This approach has many points in common with the universal law "Tao" of Lao-Tzu (400 B.C.) (51). The Yang and the Yin are comparable to Smax and Suniverse, and, similarly to the Second Law of Thermodynamics and Entropy, this can be extended to the whole universe. "Yin exists within Yang; Yang exists within Yin" is an expression that implies binarity and is therefore related to the principle of 0 and 1. Lao-Tzu, in the opening of his book "Tao Te Jing", says "The Tao that can be told is not the eternal Tao; the name that can be named is not the eternal name", but we assume that this uncertainty of the Tao depends on its extreme complexity. Entropy might be dual with the "eternal" Tao. Similarly, at dimensional extremes Entropy also falls into uncertainty. According to Schrödinger's famous book "What is Life?" (52), life basically covers two aspects, "order to order" and "disorder to order". "Order to order" is akin to reproduction or replication of the organism. In biological systems this usually proceeds through DNA, which is essentially a macromolecule (polymer) with a high binary coding capacity. The immense number of possible base combinations makes DNA extremely uncertain, and therefore it likely has a high Shannon entropy. Knowing the sequences of DNA may have reduced such uncertainty only insignificantly, since the informational power of this polymer depends vastly more on its functional expression than on its sequence. "Disorder to order" means that biological systems can exchange "energy" and matter with the medium. In animals the source is basically matter exchange. We take up matter with relatively low Entropy, and subsequently the programmed enzymatic metabolism reduces the Entropy of this uptake even further. Metaphorically, animals bite the bits of vegetables, in the sense that animals nourish themselves with the information taken from the vegetables. Salthe has evoked a similar concept of transference-degradation of energy between "gradients and consumers", where the consumers are also "gradients" (53). As an extension of this hypothesis, he has proposed that users of energy do not increase, locally, the overall Entropy of the system; however, they do shape the distribution of Entropy. In this context, Evolution can be conceptualized as the introduction of new levels of consumers to optimize the management of Entropy, which ultimately restrains the tendency towards dissipation. Salthe, metaphorically, concludes, "Evolution, then, is the Universe's devious route to its own negation" (54).
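
As a small illustrative calculation (an assumed example, not from the manuscript) of the base-combination uncertainty mentioned above: the Shannon entropy per base of a DNA string is bounded by log2(4) = 2 bits, and a repetitive sequence carries far less.

import math
from collections import Counter

def entropy_per_base(seq):
    """Shannon entropy in bits per base of a DNA string (A, C, G, T)."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

if __name__ == "__main__":
    random_like = "ACGTTGCAATCGGATCCGTACGATTACGCGTA"   # roughly balanced base usage
    repetitive  = "AAAAAAAATAAAAAAAATAAAAAAAATAAAAT"   # strongly biased base usage
    print(f"H(random-like) = {entropy_per_base(random_like):.2f} bits/base (max 2.00)")
    print(f"H(repetitive)  = {entropy_per_base(repetitive):.2f} bits/base")
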
“The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production…The formulations from classical thermodynamics can be applied to non-equilibrium systems which are not isolated (e.g., Prigogine 1962)… For these systems, the Second law then takes the form of a continuity equation, in which the overall change of entropy of the system dS/dt is determined from the local increase in entropy within the system dSI/dt and the entropy flux convergence dSE/dt (i.e., the net flux of entropy across the system boundary): dS/dt = dSI/dt + dSE/dt…A non-equilibrium system can maintain a state of low entropy by “discarding� high entropy fluxes out of the system� (55). A complicated mathematical and statistical form of non-equilibrium Entropy has been proposed (56).

The production of vitamin D, the reduction of melanin in skin, and the reduction of retinoic acid in the retina are a few of the reactions in animals that involve direct use of solar radiation. Contrarily, in vegetables solar radiation is the main substrate source, through the process of photosynthesis. Another flaw of anthropomorphism is the idea that DNA is the starting source or origin of life. Instead, DNA is just another biological polymer, and like any other polymer in nature it also has Entropy and therefore carries information. Besides, Schrödinger's dichotomization of "what is life" is arbitrary, since DNA is a molecule produced by the biological metabolism. Therefore, ultimately life can be summarized as just "disorder to order", and reproduction is an entropic shortcut in this process to reassure preservation of the species. It should be clear that in thermodynamics and informatics terms such as "order" and "disorder" must be avoided, because they entail anthropomorphic biases (1). Therefore the question "what is life?" can be answered as maintaining a lower Entropy than that of the environment (Suniverse), and a higher entropy than that of Smax, by exchanging matter and/or "energy" with the environment. The aging process can be thermodynamically explained as progressively increasing Entropy ("error accumulation") and, therefore, dissipation of the system. Interestingly, Gladyshev has formulated a similar model using the concepts of the Gibbs and Helmholtz functions, which are placed in the context of the so-called "law of temporal hierarchies" (Gladyshev's law) and the so-called "principle of the stabilization of chemical substances" (Gladyshev's principle) (57). This model is essentially dual with the Smax/Suniverse model presented here, since quantities such as internal energy, enthalpy, Gibbs free energy and Helmholtz free energy ("thermodynamic potentials"), as well as Entropy, can be obtained based on statistical arguments. Briefly, in these terms Entropy can be expressed as
S = (U - F)/T, where U is the internal energy, F is the Helmholtz free energy, and T the absolute temperature, which means that when F is minimized, S is maximized. Entropy can also be expressed as S = (U + PV - G)/T, where U is the internal energy, P the pressure, V the volume, G the Gibbs free energy, and T the absolute temperature; additionally, PV is the pressure-volume work (W), and (U + PV) is the enthalpy. Similarly, this means that when G is minimized, S is correspondingly maximized. According to Gladyshev's model, for biological systems G and F are minimized through a hierarchic gradient sequence of environment/system, which apparently is intimately related to the "law of temporal hierarchies". Specifically, this law establishes that "a biological system consists of the given organism's cells, the organism itself, and the population formed by these organisms (i.e., a fragment of the hierarchic sequence of biological structures). Identifying the average life-span (life time) of structures makes it possible to assert that the average life-span (t) of a cell (cel) in the organism is much less than the average life-span of the organism (org), which, in its turn, is much less than the life-span of the population (pop):
t(cel) << t(org) << t(pop)". This assertion is a natural fact. For instance, in the intestine, of course, a cell lives less than a villus, the villus lives less than the intestine itself, the intestine less than the animal, the animal less than all the animals, all the animals less than the whole ecosystem, the whole ecosystem less than the planet, the planet less than the galaxy, the galaxy less than the universe, etc. Therefore, by itself this "law" apparently does not have many consequences if it is not placed into the context of a higher physical law or principle. This apparent time hierarchy is perhaps the consequence of not accounting for a relativistic time frame. Therefore, this may indirectly be an anthropomorphic bias, because the assessment is made through an apparently external observer only. Moreover, these "temporal hierarchies" may vanish if they are considered as eigen (physiological) times, which together with the metabolic rate can be unified using a broad concept of Entropy that combines the external and internal observations. The tendency towards minimization of G and F in biological systems can be part of the universal tendency of systems towards Smax. Thus, this so-called "law of temporal hierarchies" is very likely a consequence of Smax. Regarding the so-called "principle of the stabilization of chemical substances" (Gladyshev's principle), it can be deduced that the G of simple molecules such as H2, N2, O2, CO2 and H2O is much higher than the G of macromolecules. In other words, macromolecules (polymers) have more Entropy than simple molecules, but correspondingly less Entropy than the environment (the rest of the universe). This assertion is another consequence of the Smax/Suniverse model presented here. Subsequently, because macromolecules are more uncertain for an external observer, they carry more information than simple molecules. Igamberdiev's concept of the "internal quantum state" (IQS) is essentially similar to state-phase invariant(s). The IQS behaves as cellular automata, and it is "concatenated within the 3D space as a molecular computer (MC)." Enzymes are MC operators, while error corrections are effected by RNA (short term) and DNA (long term). The error corrections do not affect the IQS (58). In the context of the Smax/Suniverse model, the IQS can be the thermodynamic status that depends on the instantaneous interactions of all invariants. This interaction of invariants is in turn focused on fulfilling constraints imposed by the medium. Besides, such a medium can be internally related to Smax, and externally related to Suniverse.
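
A tiny consistency check (assumed toy values, not data from Gladyshev) of the two expressions given earlier in this section, S = (U - F)/T and S = (U + PV - G)/T, using the definitions F = U - TS and G = U + PV - TS:

# Toy state (arbitrary units) used only to verify the algebra above.
U, P, V, T, S_true = 500.0, 101325.0, 0.001, 310.0, 1.7

F = U - T * S_true               # Helmholtz free energy, F = U - TS
H = U + P * V                    # enthalpy
G = H - T * S_true               # Gibbs free energy, G = H - TS

S_from_F = (U - F) / T           # S = (U - F)/T
S_from_G = (U + P * V - G) / T   # S = (U + PV - G)/T = (H - G)/T

print(S_from_F, S_from_G)        # both recover 1.7: minimizing F (or G) at
                                 # fixed U, T (or H, T) maximizes S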

Living organisms are, through evolution, thermodynamically open (to matter and energy) systems far from equilibrium; otherwise life would not be possible. Here the key to explaining life is "far from equilibrium". How did living organisms acquire such a property? Is it exclusive to living organisms? Answering the second question also solves the first: chemical reactions can naturally occur "far from equilibrium". Three to four billion years ago, a crack in the clay at the beach of the primitive sea, which eventually sealed and concentrated raw chemical elements that in turn underwent innumerable chaotic (Brownian-motion) events, could explain this property of life. Therefore, when the odds of a natural phenomenon or event (e.g., a macromolecule or biological polymer such as a protein or nucleic acid) are analyzed statistically or probabilistically, a reductionist analysis is entirely naive. By this we mean simply breaking down the biological polymer into the number and types of its moieties, followed by a conventional probabilistic calculation of the odds of reassembling it exactly as the initial macromolecule. This is not only primitive but also misleading. It is like breaking down a cathedral into bricks, counting the bricks, and calculating statistically how the bricks could reassemble themselves into a cathedral again. Obviously, this approach is wrong. There is also another "little" difference: a cathedral is usually built in a few years, whereas the natural event had about four billion years to happen, and, moreover, with the advantage that energy constantly flowed through the system.

Besides, systems cannot be built from nothing, and nothing can be completely captured by a definition. After all, when can a pizza be "defined" as a pizza? By the recipe, by the ingredients alone, by the ingredients together, before it goes into the oven, during the cooking, when it comes out of the oven, or while it is being eaten, or later? As with the pizza, every natural phenomenon implies a continuous process. The term "life" does not have a definition; similarly to time and energy, it is a formality. Any definition of life unfortunately falls into anthropomorphic semantic rhetoric, as a "property" of living organisms... and living organisms live because they possess this "property" of life; yet as such this "property" remains obscure. In this regard we partially agree with Nasif. However, this author also falls involuntarily into the anthropomorphic trap, because he tries to push a definition of life, which can be considered a reductionist attempt. Dynamical complex systems are irreducible and impredicative; therefore defined terms cannot be clearly differentiated from defining terms. The term life is just provisional, a summarized conceptualization of many complex functions. Contrarily, I conceptualize but do not define life. Nasif appeals a great deal to energy while disregarding the concept of Entropy, and also dichotomizes energy and matter considerably. Besides, for this author the quality of life resides in the cytoplasm of cells and is related to chemical reductions. We consider life a whole process that cannot be tied to any particular locus, molecule or process; otherwise it falls again into anthropomorphic terms.
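To make concrete the kind of "reductionist" estimate criticised above, here is a minimal Python sketch; the numbers (a 100-residue chain built from 20 amino-acid types drawn uniformly at random) are illustrative assumptions, not figures from the article:

import math

# Hypothetical "reductionist" estimate: odds of drawing one exact protein
# sequence by picking residues uniformly at random (the calculation the
# text argues is misleading).
residue_types = 20     # standard amino acids (assumed uniform and independent)
chain_length = 100     # a modest protein; many are far longer

sequences = residue_types ** chain_length          # all possible chains
print("possible sequences ~ 10^%.0f" % math.log10(sequences))
print("probability of one exact sequence ~ 10^-%.0f" % math.log10(sequences))

The point of the paragraph is precisely that such a number (about 10^130 here), however large, says nothing about origins, because it ignores the roughly four billion years over which the process unfolded with energy constantly flowing through the system.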
In this context the same author says, "... complex self-replication pattern can occur in inert self-replicating templates, for example in prions, viruses, autocatalytic proteins and ribozymes, and all of these structures cannot be considered as living dissipative thermodynamic systems because the postmanipulation of energy for self-replication occurs spontaneously, not autonomously as it occurs in living dissipative thermodynamic systems" (59). What does he mean by "spontaneously" and "autonomously"? If we could ask a virus, a prion, or even a peptide whether it is dead or alive, we are sure that any of them would answer that it is alive, and, in fact, from their own perspective (as internal observers) we may be the inert ones. A cell, an organ, and even a whole plant or animal, including the human species, is not autonomous. Moreover, nothing can occur "spontaneously".

Olalde has proposed another complex model of life, extended to health status. According to this researcher's philosophy, life is a function of the triad "intelligence", "energy" and "organization", while health is the "survival potential" resulting from the interaction of this hypothetical triad (60). In this theory, however, the core of the triad remains elusive. In personal communication, Olalde also holds that Entropy plays a pivotal role in health and disease. Perhaps "intelligence" and "organization" in this model could be expressed in terms of Entropy/Information, and might be explained by coding/decoding, forwards/backwards algorithms (similar to artificial intelligence). On the other hand, energy can be carried by Entropy, and therefore it can be cancelled.

Igamberdiev's model of living systems has many similarities to our Smax/Suniverse model, as both models establish a relationship between thermodynamics, chaos theory, evolution and self-organization. Admittedly, Igamberdiev's model is mathematically much more elaborate. Moreover, this model elegantly explains evolution as memory kept inside a reflective mathematical loop and series. This author then emphasizes the intricate interrelation of DNA with structure and function: "The reflecting control in genome is realized by tools (molecular addresses) organizing combinatorial events. Thus, the molecular addresses establish the set of rules for language game corresponding to such hierarchical organization... During this strategy the error-correction is realized, and this takes place in the potential field. We can suppose that the whole organism possesses the ability to forecast the splicing result before is actualized, i.e. it can realize error-correction in the potential field by eliminating wrong potential possibilities, by implicating error-correcting codes. This means that living systems realize computation at quantum level, the process maintaining their dynamic stability at the macroscopic time level" (58). Huang et al. studied the expression of about 12,600 genes during the cellular differentiation of neutrophils, and by means of a dynamical approach they found that a subset of 2,773 genes converges in state space to a stable attractor associated with the phenotype. According to these researchers, "Thus formal network architecture considerations as experimental observations of cell fate behavior also support the idea that the genome-scale regulatory network can act as an integrated entity and give rise to coherent, higher-order dynamic patterns, such as stable high-dimensional attractors" (61).
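As a toy illustration of the attractor idea, here is a hypothetical Boolean network sketched in Python; it is not Huang et al.'s data or the authors' model, only a sketch of how a handful of "genes" updated by fixed rules falls onto a recurrent set of states:

import random

random.seed(1)
N_GENES = 8   # toy size; Huang et al. tracked thousands of genes

# Random wiring (hypothetical): each gene is driven by an AND or OR of two genes.
rules = [(random.randrange(N_GENES), random.randrange(N_GENES),
          random.choice(["and", "or"])) for _ in range(N_GENES)]

def step(state):
    # Synchronous update of all genes according to their rules.
    return tuple((state[a] and state[b]) if op == "and" else (state[a] or state[b])
                 for a, b, op in rules)

state = tuple(random.choice([0, 1]) for _ in range(N_GENES))
seen = {}
for t in range(2 ** N_GENES + 1):       # a repeat must occur within 2^N steps
    if state in seen:                   # revisited state: the attractor is reached
        print("attractor entered at step %d, period %d" % (seen[state], t - seen[state]))
        break
    seen[state] = t
    state = step(state)

Different random initial states typically end up on a small number of such attractors, which is the qualitative picture the quoted passage associates with cell phenotypes.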
Igamberdiev's conclusion is fascinating, since it can be related to quantum entanglement and "instant communication" (62-64). An EPR (Einstein-Podolsky-Rosen) pair and entanglement mechanism must occur inside biological systems, ensuring their stability, and is consequently involved in health status. Another possible aspect of disease could be an imbalance between the physical state of the biological system and the "macroscopic time level", to be discussed in a future article. Melkikh has proposed a quantum computational model of biological evolution (21) which, despite some criticisms noted before, is very similar in its core to Igamberdiev's model and to ours, as he uses quantum mechanics as well as reduction of Entropy through a "parametric or force control" operating as a "quantum demon". However, this author does not further elaborate the emergence of such a "parametric or force control", which according to our model is most likely equivalent to the intrinsic or eigen Smax of the system. In addition, we explain, in the context of Entropy/Information and Chaos, how this Smax may arise.

In summary, biological systems can be placed within the Universal Entropy/Information. In this context biological systems can be understood by means of well-established concepts such as Smax (Jaynes' Maximum Entropy Principle) and Smin (based on the Least Action Principle), which are relative to the observer's position (internal or external). Function and structure are dual, and eventually they merge. In the case of biological systems these principles can explain the overall role of metabolism. Terms such as life, death, evolution, health and disease then acquire a profound and dynamical significance when they are framed by Entropy/Information.
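As a concrete handle on the Smax referred to throughout, here is a minimal sketch of Jaynes' maximum-entropy principle in its textbook form (the classic "Brandeis dice" example; the constraint value is an arbitrary assumption chosen only for illustration): among all distributions on {1,...,6} with a prescribed mean, the entropy-maximising one has probabilities proportional to exp(-lambda*k), with lambda fixed by the constraint.

import math

OUTCOMES = range(1, 7)
TARGET_MEAN = 4.5          # assumed constraint, for illustration only

def mean_for(lam):
    # Mean of the distribution p_k proportional to exp(-lam * k).
    w = [math.exp(-lam * k) for k in OUTCOMES]
    return sum(k * wk for k, wk in zip(OUTCOMES, w)) / sum(w)

# mean_for(lam) decreases as lam grows, so a simple bisection finds lam.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2.0
    if mean_for(mid) > TARGET_MEAN:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2.0

w = [math.exp(-lam * k) for k in OUTCOMES]
z = sum(w)
p = [wk / z for wk in w]
entropy = -sum(pk * math.log(pk) for pk in p)
print("lambda = %.3f" % lam)
print("p =", [round(pk, 3) for pk in p])
print("S_max = %.3f nats" % entropy)

The same logic, with different constraints, is what maximizing entropy "subject to the constraints imposed by the medium" means operationally.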

Acknowledgements
The authors are indebted to Mrs. María Olinda Tello-Rodriguez for her invaluable secretarial support.

References

1. Brissaud J-B. The meanings of entropy. Entropy. 2005, 7, 68-96.
http://www.mdpi.org/entropy/papers/e7010068.pdf

2. Wikipedia. Karl Popper. 2006. http://es.wikipedia.org/wiki/Karl_Popper

3. Kay JJ. The relation between information theory and thermodynamics: the second law of thermodynamics revisited. Chapter 6 in: Self-Organization in Living Systems. Ph.D. Thesis, Systems Design Engineering, University of Waterloo, Ontario, 1984; 458 pp. http://www.nesh.ca/jameskay/www.fes.uwaterloo.ca/u/jjkay/pubs/thesis/6.pdf

4. Barone SR, Kunhardt EE, Bentson J, and Syljuasen A. Newtonian Chaos + Heisenberg Uncertainty = macroscopic indeterminacy. American Journal of Physics 1993; 61, No. 5.

5. Ashby WR. An Introduction to Cybernetics. Chapman & Hall, London, 1956. Internet edition, 1999: http://pcp.vub.ac.be/books/IntroCyb.pdf

6. Cohen L. Yoga may ease discomfort of radiation treatment. ASCO 2006, Atlanta, Abstract 8505 (randomized trial). MD Anderson Hospital, Texas.

7. Prigogine I, Stengers I. Order Out of Chaos, Man’s Dialogue with Nature, Bantam Books, New York, 1984

8. Jaynes ET. Foundations of Probability Theory and Statistical Mechanics. In: Bunge M (ed.), Delaware Seminar in the Foundations of Physics. Berlin: Springer-Verlag, 1967; pp 77-101.

9. MaxEnt thermodynamics. Wikipedia 2006. http://en.wikipedia.org/wiki/MaxEnt_thermodynamics

10. Van Campenhout JM, Cover TM. Maximum Entropy and Conditional Probability. IEEE Transactions on Information Theory 1981;4:IT-27. http://arxiv.org/PS_cache/math/pdf/0601/0601048.pdf

11. Penfield Jr. P. Chapter 9. Principle of Maximum Entropy: Simple Form. Version 1.0.2. 2003. MIT. http://www-mtl.mit.edu/Courses/6.050/notes/chapter9.pdf

12. Penfield Jr P. Chapter 8. Principle of Maximum Entropy: Simple F

tullio
tullio
Joined: 22 Jan 05
Posts: 2118
Credit: 61407735
RAC: 0

RE: Dear Sir thanks a lot,

Message 53242 in response to message 53240

Quote:

Dear Sir, thanks a lot; interest has been created. Please follow through so that I can place the whole article for readers to analyze and start the discussion.


Rather than putting such a long article on a message board, it would have been much better to post a link to that article. If everybody started doing as you do, the message board would overflow with long texts, with the danger of a system administrator canceling them!
Tullio

gravitonring
gravitonring
Joined: 19 Oct 06
Posts: 170
Credit: 8508
RAC: 0

never the less, the article

nevertheless, the article helps me, as a dummy, to clearly understand the suggested topic; it is most detailed and easy to understand even for my simple mind :) uncertainty is a key to every detailed analysis, and to any world view, and IMHO cannot be minimized nor sacrificed on the altar of any dogma without destroying credibility and individuality...

and so far i have only looked at the entropy description in the article :)

the first time i recall ever seeing a clear picture of the topic myself !!

OK halfway through Biological Systems and i am still very thankful for the article; as a dummy observer, it seems i have always had more of an effect by absorbing information rather than exuding it... a wonderful feeling, speaking as a biological system, and even as a somewhat transcendent nonphysical mind :)

done with the Biological Systems section of the message... since i totally gave up on a bottom-up approach after seeing Prigogine's UTEP three-billiard-ball contact illustration, representing a nearly infinite set of probable outcomes, i seriously appreciate the universal leap into Smax... where i apparently will be going within the next few decades :) unlike Houdini i will not forget younz guys who are still working at less than Smax, and i will send back some info :)

everything is true, the opposite of everything is also true

gravitonring
gravitonring
Joined: 19 Oct 06
Posts: 170
Credit: 8508
RAC: 0

ok i think i might have my

Message 53244 in response to message 53243

ok i think i might have my first question... whatever happened to the so-called 'chaperone effect' that supposedly superimposes some mysterious folding ability on DNA...

i have always wondered, ever since as a college freshman right out of high school i was nursed on Linus Pauling's text on atoms and molecules, WHY ARE WE NOT AT LEAST AS FASCINATED by the effect of a single atom, and why did no one answer, or maybe even ask, whether this intelligent, superimposing, mysterious ability might apply to entangled particles or even some ethereal boson; i mean, why anything happens at any level of information is a fundamental question, apparently without any answer :)

oh yes, as to why i am CERTAIN that i will send back information after my biological death: donate dead body here
or also donate dead body here
is why; i have pledged my entire body to it, after any harvestable organs, and/or possibly being melted down and placed into a billion test tubes for the sake of scientific research;

everything is true, the opposite of everything is also true

gravitonring
gravitonring
Joined: 19 Oct 06
Posts: 170
Credit: 8508
RAC: 0

OK i think that is about all

OK i think that is about all i can absorb from the article for the moment, as a dummy... i still have my 36-year-old question: HOW DID MY DAUGHTER happen to exist as an age-old spiritual mother to myself? even as a fetus, playing a game of touch this bulge in my mother's abdomen before i pull back my elbow or foot, and you win a prize :)

maybe there are some mysteries which should be simply enjoyed as mysteries :)

everything is true, the opposite of everything is also true
