The link between Consciousness and Entropy
<html> <p><img src="http://www.nature.com/article-assets/npg/srep/2013/131009/srep02853/images_hires/m685/srep02853-f4.jpg" width="384" height="685"/></p> <p><a href="http://journals.aps.org/pre/pdf/10.1103/PhysRevE.94.052402">French and Canadian researchers</a> have recently found that normal waking consciousness is accompanied by a maximisation of information content in the brain, as evidenced by different encephalography techniques. They say that wakeful states involve the greatest number of possible configurations of interactions between brain networks, representing the highest entropy values. (The higher the entropy, the less ordered a system is.) Interestingly, they couple this to Tononi's Theory of Integrated Information by stating that this maximisation of information leads to optimal segregation and integration of information.</p> <h2>Background</h2> <p>This is highly interesting to me, because as you might recall from my earlier post on "The God Problem", I wrote about this relation between consciousness and entropy:</p> <p>Quote:</p> <p>Consciousness has therefore a strong relationship with pattern, with structure, and it may well be that if we find the right formula to quantise consciousness quality as negentropy and weigh this against the heat dissipation created in a conscious creative process, the ultimate outcome is that the heat dissipation entropy is more than compensated for by the gain in consciousness quality. And perhaps Weber-Fechner’s law is the right metaphor for both consciousness quality and pattern emergence. Perhaps the quality of consciousness is indeed reflected in the constant k in the Weber-Fechner formula S = k ln(A/A<sub>0</sub>), which should be summed over all the processes and senses involved. Because stimuli corresponding to greater differences can be dealt with by a consciousness with more versatility, with more resistance. Perhaps it is no coincidence that the formula of entropy, Boltzmann’s ΔS = k ln(W/W<sub>0</sub>), is highly isomorphous to that of Weber-Fechner, as its symmetric counterpart in the world of matter. Perhaps the degree of randomness of an event is linked to the degree of order created in a process (without them necessarily being quantitatively equivalent). Because opposites are joined at the hip. </p> <p>And what do we see? The more structure, the more consciousness a species has, the more versatility it has, the more variety it can create, all aiding to maximise heat dissipation – and to build ever further structures until the system at a new meta-level becomes isomorphous again with a previous level far away: until it becomes an abstractor per se: a nodal network of mindedness, an interconnected dendrogram. And perhaps it is not entropy that causes the attraction resulting in levels of order, but its consciousness counterpart. Perhaps we mistake the rope for a snake. Perhaps the solution to the God problem is that there is no bearded God-as-we-know-it, but that there is a fractal of consciousness, of self-repetition in ever increasing variety, ever increasing potential. </p> <p>Unquote.</p> <p>In other words, the above-mentioned study confirms my hypothesis. The brain indeed screens a maximum of possibilities, a maximum of information, to arrive at a better integration of that information.</p>
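<p>To make the parallel concrete: both laws are a constant times the logarithm of a ratio to a reference value, so doubling the stimulus and doubling the number of microstates add the same fixed increment. Below is a minimal sketch of this isomorphism; it is my own illustration (the function names and the normalisation of the constants to 1 are assumptions for readability), not something taken from the cited paper.</p> <pre><code>import math

def weber_fechner(A, A0, k=1.0):
    """Perceived intensity S = k * ln(A / A0) for a stimulus A above threshold A0."""
    return k * math.log(A / A0)

def boltzmann_delta_s(W, W0, k_B=1.380649e-23):
    """Entropy change dS = k_B * ln(W / W0) between microstate counts W0 and W."""
    return k_B * math.log(W / W0)

# Doubling the stimulus or doubling the number of microstates adds the same
# increment, k * ln(2): the formal isomorphism discussed above.
print(weber_fechner(2.0, 1.0))                # ~0.693
print(boltzmann_delta_s(2.0, 1.0, k_B=1.0))   # ~0.693, with k_B normalised to 1
</code></pre>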
<h2>Logarithmic compression</h2> <p>But what the authors of the above-mentioned article missed is that the logarithmic component in the Boltzmann and Weber-Fechner formulas already implies that consciousness means ordering and abstraction of a maximum of information into an integrated oneness of cognition. This idea I derived from insights of the programmer Barry Kumnick (author of the highly interesting blog "Beyond Information"), which I will discuss below.</p> <p>I already discussed this issue in my book Technovedanta, and for the sake of this argument I will quote the relevant passage.</p> <p>Quote:</p> <p>This ordering is optimal if organised logarithmically, because the very nature of fractals is their logarithmic repetition. Hence, because the universe is organised in fractal structures, we have evolved to be able to maximally profit from the availability of information of the universe, by becoming isomorphic to it. </p> <p>Therefore the optimal reactivity to stimuli from the environment is logarithmic, as expressed by the Weber-Fechner law. In order to avoid an informational overload it is essential that information from nodes further away cannot reach local functionalities, so as to avoid an overload of associations. The higher the intelligence of a system, the more nodality it can support while still giving a meaningful output. At a certain moment the processing speed of the system becomes the limiting factor, which demands that the system aggregate with similar systems, allowing for a parallel function distribution in order to maximise the overall utility of the total, which due to specialisation is significantly increased when compared to the non-aggregated level. </p> <p>Barry Kumnick also said (comments in square brackets by me): (Meta-quote)</p> <p>“To maximise the reuse of shared representation [entropic variegation] and thus minimise storage space [entropic attraction], we should factor out the shared parts of each abstraction's representation and only represent the shared parts one time. The computational structure best suited for intensional factoring is a set of trees.</p> <p>Hypothesis: The branching topology of dendritic trees is morphologically identical to the branching topology referred to in the previous section. </p> <p>1) Neuron dendritic trees are a direct biological implementation / instance / concretum of the upper ontological representation of concept intension.</p> <p>2) Neuron dendritic trees factor the representation of similarities and differences [syndiffeonesis] in the representation of concept intensions. This maximises metabolic energy consumption...</p> <p>Over the neural network as a whole it results in logarithmic combinatorial compression of representation and computation.</p> <p>...</p> <p>7) Concept intensions form our abeyant (i.e., static) representation of thought and knowledge.</p> <p>7.1) From hypothesis 1, a neuron’s dendritic trees represent the concept intension.</p> <p>7.2) A concept’s intension represents and defines the meaning of the concept. [In linear algebra Ker(phi).]</p> <p>7.3) Therefore, a neuron’s dendritic trees represent and define the meaning of a concept.</p> <p>7.4) Therefore, the meaning of a concept is stored in a neuron’s dendritic trees.</p> <p>7.5) A neuron’s dendritic trees exist whether or not they happen to be receiving or processing synaptic inputs.</p> <p>7.6) Therefore, neurons’ dendritic trees (and their synaptic weights) represent and store memory.</p> <p>7.7) Therefore, neuron dendritic trees and concept intensions form our abeyant (i.e., static) representation of thought and knowledge.</p> <p>7.8) Neuron dendritic trees represent, define, and store the meaning of concepts.</p> <p>Definition: The concept extension represents the existence of all instances of the concept.” (Meta-unquote). [In linear algebra Im(phi).]</p> <p>And what does Kumnick say furthermore? (Meta-quote)</p> <p>“I have reduced the dendritic integration process to a couple of recursively coupled linear algebra equations.” (Meta-unquote).</p> <p>This may well be possible.</p>
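<p>To give a concrete flavour of the "factor out the shared parts and represent them one time" idea quoted above, here is a minimal sketch using a prefix tree; this is my own toy illustration, not Kumnick's actual equations nor a model of dendrites.</p> <pre><code># Toy illustration: a prefix tree (trie) stores the shared prefixes of many
# concept labels only once, instead of repeating them for every label.
class TrieNode:
    def __init__(self):
        self.children = {}     # one branch per distinct next symbol
        self.is_concept = False

def insert(root, label):
    """Insert a concept label, reusing any branch that already exists."""
    node = root
    for ch in label:
        node = node.children.setdefault(ch, TrieNode())
    node.is_concept = True

def count_nodes(root):
    """Total number of nodes in the tree, including the root."""
    return 1 + sum(count_nodes(child) for child in root.children.values())

root = TrieNode()
labels = ["integration", "information", "intension", "intention"]
for lab in labels:
    insert(root, lab)

print(sum(len(lab) for lab in labels))   # 40 symbols if each label is stored separately
print(count_nodes(root) - 1)             # 29 nodes: the shared prefixes are stored once
</code></pre> <p>The gain grows with the amount of overlap between representations, which is the sense in which factoring shared structure compresses storage.</p>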
<p>I previously said: “My hypothesis is that the very essence of consciousness lies in its abstractive functionality: the reduction to essentials as the feedback upon a stimulus. And as this process appears to be present on any level of existence, existence is at least proto-conscious, if not first-order conscious, throughout all the levels of the universal fractal. The very notions of “incompleteness”, “undecidability” etc. are vital to the process of proto-conscious abstraction. If you abstract and REDUCE to essentials, you limit possibilities, you render the entity less complete; you have taken a decision for a given limitation.”</p> <p>And now I state that this reduction to essentials, this minimisation of storage space, is a form of entropic attraction maximising meta-entropic variegation, resulting in the necessarily “logarithmic ordering” of the Chaos.</p> <p>Because the very essence of consciousness is abstraction and integration, which is also the very essence of all phenomenological natural processes, and because the outer world's logarithmic compression is similar to the logarithmic compression performed by the neuronal network, it is a matter of semantics to conclude that all natural processes are a form of proto-conscious processing. Hence I consider that this leads to a strong presumption that all is consciousness. </p> <p>A great <a href="http://arxiv.org/abs/0811.0139">article</a> putting entropy in a wider context defines forces and counterforces as informational confidence intervals in an informational entropy context, from which it can be shown that a sustainable autopoietic solution emerges at the value of the golden ratio, giving the shape of a Yin-Yang symbol! Here logarithmicity and the golden ratio, both known for their inherent fractality, are joined in a cosmic symphony resulting in ever newer forms of replicas contributing to the maximisation of meta-entropic informational variation and utility. Phi and e are the lesser and greater keys of King Solomon; they are the Goetia and Theurgia opening the gates of Heaven and Hell. The sublimation of abstraction and variation, of isotelesis and polytelesis, of adaptation and diversification.</p> <p>Network optimisation, performance and flexibility are achieved when meta-entropy and entropy are maximised. Ergo, in networks, (meta-)entropy maximisation results in an ordering that warrants maximisation of variation/flexibility.</p>
<p>Meta-quote from a scientific <a href="http://arxiv.org/ftp/nlin/papers/0408/0408007.pdf">article</a>: “...we have shown that for large networks in the asymptotic limit of local performance saturation, the design requirement of reliable performance under maximum uncertainty leads to the emergence of power laws as a consequence of the maximum entropy principle. That is, under these general conditions, a power law-based organization gives a network the maximum flexibility to perform well overall in a wide variety of operating environments. Note that for a specific operating environment, there may exist some other distributions that can outperform the maximum entropy distribution with respect to the global performance target; however, such a biased network may fail when the underlying environment changes, whereas the maximum entropy distribution-based network will continue to survive and perform. Thus, under entropy maximization, the network’s performance is optimized to accommodate a wide variety of future environments whose nature is unknown, unknowable and hence uncertain.” Meta-unquote.</p> <p>Shannon (<a href="http://www.bearcave.com/misl/misl_tech/wavelets/compression/shannon.html">Shannon</a>, C.E. The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948) showed in 1948 that the best way to compress information is logarithmic. Nature follows similar patterns. In order to maximise the space filling of a circle (e.g. in the generation of sunflower seeds), it turns out that the "most irrational" number, which corresponds to the golden mean and is furthest away from simple rational fractions, yields an angle of 137.5°, resulting in seed counts following the Fibonacci numbers and an approximation of the golden spiral. In other words, nature's way to maximise meta-entropy (here in the form of generating as many seeds as possible), and thus potentially to maximise evolutionary variegation, results in a higher-order arrangement. Galaxies form logarithmic spirals driven by the same entropic attraction. Unquote.</p>
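<p>The 137.5° claim is easy to see at work: placing successive seeds at the golden angle on a slowly growing radius fills the disc evenly, without the seeds lining up into rays. A minimal sketch follows; it is my own illustration using the standard Vogel phyllotaxis model (r = c·√n, θ = n·137.5°), not something taken from the sources quoted here.</p> <pre><code>import math

GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))   # ≈ 2.39996 rad ≈ 137.5°

def sunflower_seeds(n, c=1.0):
    """Vogel model: seed k sits at radius c*sqrt(k) and angle k*GOLDEN_ANGLE."""
    seeds = []
    for k in range(1, n + 1):
        theta = k * GOLDEN_ANGLE
        r = c * math.sqrt(k)
        seeds.append((r * math.cos(theta), r * math.sin(theta)))
    return seeds

print(math.degrees(GOLDEN_ANGLE))   # ~137.5077..., the "most irrational" rotation
print(sunflower_seeds(5))           # positions of the first five seeds
</code></pre> <p>Replacing the golden angle with a simple fraction of a turn (say one half or one quarter) makes the seeds stack into a few spokes, which is exactly the loss of space filling described above.</p>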
<p>Our French and Canadian friends come to the same conclusions: </p> <p>“It has been proposed that aspects of awareness emerge when certain levels of complexity are reached.” </p> <p>“It is then possible that the organisation (complexity) needed for consciousness to arise requires the maximum number of configurations that allow for a greater variety of interactions between cell assemblies because this structure leads to optimal segregation and integration of information.”</p> <h2>Pruning</h2> <p>However, what they call maximisation must have a certain limit; it must not be an unbridled random generation of all possible permutations: there must be a kind of pruning away of useless and redundant alternatives. In other words, I presume maximisation means maximised within limits, within reasonable boundaries. This is my reasoning (which you can find in chapter 6 of Transcendental Metaphysics):</p> <p>Quote: The ordering potential also comes to expression in the form of adaptability, which entails its capability to generate meta-vari(eg)ation, but it also entails Perceptibility. Perceptibility must be a measure of the variety and intensity of signals which an autopoietic system can process: Perceptibility is therefore a product of the perceptional variation potential and Weber-Fechner's law of Perception. Adaptability arises from the capability to change between strategies and from the capability to generate strategies (creativity). There must be a kind of optimal adaptability, since a system that tries too many changes or screens too elaborately is not effectively adaptive but loses itself in the game of providing alternatives. A highly intelligent system has effective pruning strategies to rapidly discard potentially ineffective strategies.</p> <p>The more alternative pathways are available in a network (such as a neural network), the more possibilities there are of resonating with some signal from the environment.</p> <p>A hyperconnected network in which all the subconstituents, the nodes, are linked to all other subconstituents only yields meaningful and reasonable strategies if the links are not all the same. However, by attributing different weights to the links between the nodes, different patterns of resonance can occur. The more patterns can be perceived (i.e. the higher its perceptional variation potential is) and effectively pruned, the more efficient the system. Adaptability is possibly strongly connected with the ability to prune effectively, which ideally is an optimised process. Overzealous pruning may discard potentially helpful strategies; of course this is a looped process, where if the most promising strategies fail, screening of less promising strategies will be attempted.</p> <p>The pruning, resulting in a been-there-done-that attitude towards non-effective strategies, will optimally favour cooperative, harmonising systems. Firstly because in the long term cooperation is the most advantageous solution to the repeated form of e.g. Axelrod's "Prisoner's Dilemma" (two prisoners can choose to cooperate, not betraying each other and both serving a short period of incarceration, or to cut a deal with the prosecutor, resulting in a shorter incarceration for the betrayer but a longer incarceration for the betrayed prisoner), and a Nash equilibrium of that repeated game. Secondly because the harmonic resonance resulting from that strategy is the best spreading meme resonance. Axelrod's optimal "Tit for Tat" strategy automatically results in a natural "friendliness" and a forgiving nature of the system. The cooperation is ultimately in servitude to the higher-level entity. Unquote.</p>
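<p>The iterated dilemma mentioned in the quote is easy to simulate. Below is a minimal sketch, my own illustration with the payoff values commonly used in Axelrod-style tournaments (3 for mutual cooperation, 1 for mutual defection, 5 and 0 for betrayal); the strategy names are mine.</p> <pre><code># Iterated Prisoner's Dilemma: Tit for Tat (cooperate first, then copy the
# opponent's previous move) versus an unconditional defector.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=20):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # (60, 60): sustained cooperation
print(play(tit_for_tat, always_defect))   # (19, 24): both sides end up far worse
</code></pre> <p>Over repeated rounds the cooperative pairing collects the highest scores, which is the sense in which the "friendly" and forgiving Tit for Tat strategy pays off in the long term.</p>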
<h2>Diminished brain activity during enhanced states of perception</h2> <p>Paradoxically, <a href="http://www.pnas.org/content/109/6/2138.abstract">science</a> has also shown that during intense meditational experiences involving enhanced perception, or during intense hallucinations on psilocybin, brain activity decreases rather than increases! Yet the degree of consciousness in such states can most certainly not be called diminished. If anything, it is rather enhanced. This is also the argument of the philosopher Bernardo Kastrup, who associates consciousness with decreased entropy.</p> <p>My hypothesis to resolve this conundrum is the following: the brain functions as a filter. To engage in daily activities it is important that consciousness is directed outward. For this waking type of consciousness, the filtering ability of the brain is maximised by allowing a maximum of abstracting structures in the neurons to reduce the information to what is essential for survival. This results in consciousness focussed on the outside world, but with a near absence of awareness of what is happening in the inner world.</p> <p>On the other hand, during meditative and hallucinatory experiences consciousness is turned inward. Here the intense experiences derive from the fact that the filters are maximally switched off!</p> <h2>Conclusion</h2> <p>In other words, consciousness is perhaps always on, but it is recognised as our daily outward-directed phenomenal consciousness when the brain's filtering mechanisms function optimally, resulting in the observed maximisation of informational possibilities within boundaries (entropic maximisation). In the inward-directed experience, consciousness may be directed toward itself, leading to a minimisation of informational possibilities and resulting in the feeling of oneness with everything.</p> <p>The idea that consciousness can be completely switched off, as in coma or deep sleep, is a hypothesis which is more and more disputable. This <a href="http://www.sciencealert.com/your-consciousness-does-not-switch-off-during-a-dreamless-sleep-say-scientists">article</a> argues that consciousness does not switch off during deep sleep.</p> <p>I am curious to see if these notions will one day give us complete control over, and merger with, reality, which I also hypothesise to be a neural network at its more foundational quantum levels.</p> <p>Image from http://www.nature.com/articles/srep02853</p> <p>By Technovedanta a.k.a. Antonin Tuynman. If you liked this post, please upvote or resteem. You can also read my other books: Transcendental Metaphysics <a href="http://www.lulu.com/shop/antonin-tuynman/transcendental-metaphysics/ebook/product-22990796.html">e-book</a> and <a href="http://www.lulu.com/shop/antonin-tuynman/transcendental-metaphysics/paperback/product-22990789.html">paperback</a>, and Technovedanta <a href="http://www.lulu.com/shop/antonin-tuynman/technovedanta/ebook/product-22857147.html">e-book</a> and <a href="http://www.lulu.com/shop/antonin-tuynman/technovedanta/paperback/product-22857134.html">paperback</a>.</p> </html>