Heat Death

The second law of thermodynamics tells us that entropy always increases: even if some arrangement of matter temporarily produces order in one part of the world, it will leak disorder elsewhere, and the house always wins. Stand against the second law and not only can’t you win, but you can’t break even. But entropy is also a measure of information, as Claude Shannon discovered: it is surprise. And so, in hermetic fashion, there is a microcosmic heat death of the mind to mirror the macrocosmic heat death of the universe.

Each of us is invested in surprise. The vast majority of our nerve cells, by weight, are engaged entirely in the execution of habitual responses to stimuli — sleepwalking, we make the coffee, brush our teeth, make idle conversation, cook food, move boxes, spiral down into thought loops and reveries. The vast majority of the energy consumed by nerve cells, on the other hand, goes to the cardboard-thin layer of tissue wrapping the frontal lobe, dedicated to making predictions and revising them in response to surprises. These surprises momentarily wake us out of our reveries, tweaking probabilities by moving potassium ions around, sometimes on a mass scale. A man without surprises can wither away in his certainty on a smaller supply of sugar.

Information is entropy, because surprise can roll in like an inferno and burn your whole memory palace to the ground. And then you rebuild.

The neocortex is full of independent general-purpose predicting machines. Sometimes they model bits and pieces of the world — a sensory nerve here, a motor nerve there — but mostly they model each other, because an inaccurate model of the average state of a hundred inaccurate models (themselves modelling the average state of a hundred more), while much less accurate than a single model of the thousand million fibers that touch skin and muscle, is inexpensive enough to be affordable on an energy budget of about 100 watts.
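
The economics of stacked inaccuracy can be sketched with toy numbers (the hidden value, noise level, and model counts below are arbitrary assumptions, not neuroanatomy): a second-order model that only tracks the average of a hundred noisy first-order models stays serviceable, because independent errors partly cancel, while needing only a hundred inputs.

```python
import random

random.seed(42)

TRUTH = 0.7    # some hidden quantity out in the world
NOISE = 0.3    # standard deviation of each first-order model's error
MODELS = 100   # first-order models per second-order model
TRIALS = 500

single_errors, meta_errors = [], []
for _ in range(TRIALS):
    # A hundred inaccurate first-order models of the same quantity.
    first_order = [TRUTH + random.gauss(0, NOISE) for _ in range(MODELS)]
    # A second-order model that only tracks their average state.
    meta = sum(first_order) / MODELS
    single_errors.append(abs(first_order[0] - TRUTH))
    meta_errors.append(abs(meta - TRUTH))

def avg(xs):
    return sum(xs) / len(xs)

print(f"one model of the world:    mean error {avg(single_errors):.3f}")
print(f"model of a hundred models: mean error {avg(meta_errors):.3f}")
```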

A community is full of independent people — these neocortices locked up behind walls of bone, each spending a good chunk of that 100 watts on predicting the behavior of about 100 other skull-caged neocortices. Predicting — and therefore modeling — these other neocortices can be lucrative, if you do it accurately enough. If you know what people want, you can give it to them (or offer to rent it to them, or trick them into believing you have already given it to them). Within a single neocortex, a surprising sensation rockets around this kind of spinal Wall Street, demolishing structures of thought, reasoning, and habit, and letting the neighbours pillage and expand into the newly empty lands. Within a community, a surprising fact does the same.

Part of the way these networks of models work is to meter each other. Hook two thermostats together in a particular way and you can keep the temperature in a particular range: too hot and the AC will kick on, but too cold and the furnace will start (both of them pumping heat into some part of the environment — heat which will never go away, but instead stick around with its growing cohort until the world ends and only it remains, a cold heat in an endless and ever-expanding empty void). Hook them together in a different way, and you have a differential thermostat — a golem to play Maxwell’s demon, moving air from A into B when A is more than some number of degrees warmer. Wire up tunnels between these rooms, and move air from one to another, the regular thermostats controlling whether the differential thermostats are enabled, so as to move air to a cooler place only when it’s too hot in here. It doesn’t matter how the tunnels wind, or which rooms are connected; so long as the system is closed, eventually you will reach a steady state: all thermostats are in range, and no fans blow at all, unless heat leaks out of the building. While adjusting, there were wild fluctuations in temperature, but now every room is about as warm as every other room, and they are all slowly getting colder. This is a homeostat: whether you build it out of HVAC systems, diodes, algae colonies, or ion ratios, it will do the minimum possible to keep everything at an even keel. Use a homeostat to model something in the outside world and it will go into a frenzy of adjustment after a surprise, but eventually adjust to become silent in the face of any new normal.
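
A minimal homeostat can be sketched in a few lines, in the spirit of Ashby's electromechanical original; every number here (unit count, coupling range, thresholds) is an arbitrary assumption. Out-of-range units re-randomize their own couplings, possibly destabilizing their neighbours, until the whole network goes quiet:

```python
import random

random.seed(1)

N = 8          # interacting units (rooms, cells, circuits)
LIMIT = 1.0    # a unit is "in range" while |state| <= LIMIT

def random_row():
    # Random couplings from every unit to one unit.
    return [random.uniform(-0.4, 0.4) for _ in range(N)]

def settle(max_steps=10_000):
    """Run the homeostat until quiet: any out-of-range unit re-randomizes
    its own incoming couplings (Ashby's 'uniselector' move), which may in
    turn knock its neighbours out of range. Returns steps taken (capped)."""
    weights = [random_row() for _ in range(N)]
    state = [random.uniform(-1, 1) for _ in range(N)]
    for step in range(max_steps):
        # Each unit relaxes toward the weighted sum of its peers,
        # clamped to keep the toy numerically tame.
        state = [max(-10.0, min(10.0, sum(w * s for w, s in zip(row, state))))
                 for row in weights]
        unstable = [i for i, s in enumerate(state) if abs(s) > LIMIT]
        if not unstable:
            return step  # steady state: no fans blowing
        for i in unstable:
            weights[i] = random_row()
    return max_steps

runs = [settle() for _ in range(20)]
print("settling times:", sorted(runs))
```

Most runs go quiet quickly; the interesting behavior is in the tail, where one unlucky configuration can churn far longer than the rest.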

Put the microphone next to the speaker and: first, a rush of noise, but within a few seconds it resolves into a clear, pure tone. This is the resonant frequency of the microphone-and-speaker system. It is always present: those sounds that happen to overlap it get a tiny, imperceptible boost to their volume. The speaker constantly puts it out and the microphone constantly picks it up. But, to hear it and to know it, you must put the microphone up to the speaker. Far enough away and it seems to disappear. Bring the microphone slowly closer, and some tones begin to fade away while others, feeding through the system, begin to dominate. The closer the two ends, the clearer the signal — but just as this resonant frequency is determined by the frequency reproduction curve of the speaker (the tweeters higher than the woofers) and the frequency response curve of the microphone, friction with the air will dampen the higher frequencies. At some distances there are sudden jumps in resonant frequency: these three response curves are being multiplied, not added, and so the pattern is mercurial. At the same time, without one frequency towering so tall over the others, unrelated noise from the environment has a greater relative amplitude. At a great enough distance, or in a crowded enough room, resonance is negligible within a single PA system.
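
The distance-dependent wandering of the resonant peak can be sketched by multiplying three made-up response curves; the Gaussian speaker and microphone curves and the air-absorption constant below are illustrative assumptions, not measured data:

```python
import math

def speaker(f):
    # Speaker reproduction curve: a made-up Gaussian peaking in the mid-highs.
    return math.exp(-((f - 4000) / 3000) ** 2)

def mic(f):
    # Microphone response curve: another made-up Gaussian, peaking lower.
    return math.exp(-((f - 2500) / 2500) ** 2)

def air(f, meters):
    # Air absorption: higher frequencies fade faster with distance
    # (toy constant, not a measured absorption coefficient).
    return math.exp(-f * meters / 40_000)

freqs = range(100, 10_000, 50)
for meters in (0.1, 1.0, 5.0):
    # The loop gain is the PRODUCT of the three curves, not their sum,
    # so the winning frequency moves as distance changes.
    gain = {f: speaker(f) * mic(f) * air(f, meters) for f in freqs}
    peak = max(gain, key=gain.get)
    print(f"{meters:>4} m -> resonance near {peak} Hz")
```

With smooth toy curves the peak slides gradually downward as distance grows; real speaker and microphone curves are lumpy, which is where the sudden jumps come from.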

In other words: homeostats are vulnerable to silver bullets. Some surprises are easy adjustments, but others involve long periods of busy readjustment. Even though homeostats always return to a steady state (assuming the average periodicity of change is lower than the average relaxation period — the more complex the homeostat, the shorter the average relaxation period against equally complex or quickly-changing patterns, but the longer the maximum possible relaxation period; or, in cybernetic terms, Ashby’s law of requisite variety: a system cannot control another system of greater variety), a complex homeostat can remain irrational for longer than we can remain solvent. Ashby predicted, through calculation, that a homeostat of complexity comparable to our brain (with its overachieving, hungry, constantly-speculating gift wrap and its dull, robotic core) could remain churning and out of balance for longer than the lifespan of the universe, were it not for the expiration of the meat (the decay of the telomeres, or a blockage in the supply of sugar and oxygen) after about a century — and it doesn’t take much input to start that chain reaction, so long as the input happens to be a silver bullet.

The neocortex is a homeostat made of homeostats, each modeling its peers, and society, that homeostat of neocortices, is equally fixated on navel-gazing. We put each other to sleep — disrupting our own equilibrium to give someone else the space to balance — and we wake each other up — kicking each other with surprise, accidentally or by design, and demolishing and rebuilding each other’s inner lives the way that seasonal wildfires clear away sunlight-blocking foliage and allow new growth to sprout. We get closer together, because then we can predict each other better. Our microphones and speakers are thrown in a pile, but the room is too crowded: no pure tones emerge, but instead a complex polyrhythm, prone to sudden changes in tempo, sung in twelve billion voices at once. We approach white noise.

White noise on one level is not necessarily white noise on another. Deterministic systems can approximate randomness (though they cannot be truly random): a pseudo-random number generator can latch on to a mustard seed of true entropy and produce a whole forest’s worth of near-perfect simulated entropy. So the fact that some part of a system is ordered enough to behave deterministically does not mean that the system as a whole is not producing white noise; and, reversing the implication, not all systems that produce white noise are composed entirely of white-noise-producing components. And, along every stratum — in every level of abstraction — the ambient entropy of imperfect transmission and inaccurate simulation pushes the system from both the microcosmic and macrocosmic sides. The computer’s design is theoretically perfect, but a cosmic ray conspires with the affordances of a cluster of interconnected transistors to flip a bit, and thereby is the whole machine colonized by chaos.
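
A sketch of the mustard seed, using Python's built-in generator: a single 32-bit seed deterministically regrows the same stream every time, yet the stream's byte frequencies look statistically flat.

```python
import random
from collections import Counter

SEED = 0xDEADBEEF   # a 32-bit mustard seed of "true" entropy

# ...expanded deterministically into a forest of simulated entropy.
rng = random.Random(SEED)
stream = [rng.randrange(256) for _ in range(100_000)]

# The same seed regrows the identical forest, byte for byte.
rng2 = random.Random(SEED)
assert stream == [rng2.randrange(256) for _ in range(100_000)]

# Yet statistically the stream looks like white noise: every one of the
# 256 byte values shows up close to its expected frequency.
counts = Counter(stream)
expected = len(stream) / 256
worst = max(abs(counts[b] - expected) for b in range(256))
print(f"expected {expected:.0f} per byte value, worst deviation {worst:.0f}")
```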

As the complexity of the homeostat grows, the average relaxation time for modeling some system with higher internal variety than itself approaches zero; likewise, as the complexity of the homeostat grows, the maximum relaxation time approaches infinity. A silver bullet event only needs to happen once to produce internal chaos that outlasts the physical integrity of the homeostat itself. Over a long enough period of time — or, we can say, with a sufficient number of events — silver bullets are inevitable.
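
The inevitability claim is just the arithmetic of repeated trials; the per-event probability below is an arbitrary stand-in:

```python
# If a silver-bullet surprise has even a tiny chance p of arriving with any
# given event, the odds that at least one has landed after n events are
# P = 1 - (1 - p) ** n, which creeps inexorably toward certainty.
p = 1e-6   # arbitrary per-event probability

for n in (10**3, 10**6, 10**7, 10**9):
    at_least_one = 1 - (1 - p) ** n
    print(f"{n:>13,} events -> P(at least one silver bullet) = {at_least_one:.6f}")
```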

We have silos between our abstraction layers, of course: our neocortices are imprisoned in our skulls, and can only communicate with each other through high-noise channels like speech, gesture, and signs — and even then, only through complex ad-hoc symbolic coding systems like language, themselves learned through prediction mechanisms and themselves encoding a prediction system (the Zipf distribution in word frequency, mirrored in word length, and again in semantic distance, and again in the proportion of the population familiar enough with each word to use it). Likewise, neocortical homeostat circuits are stacked vertically, then grouped into columns, the columns grouped into clusters. Likewise, our community is in a village, the village in a town, the town in a county and the county in a state, the state in a country and the country on a planet. This siloing allows a greater number of layers of abstraction, with each stratum a whole system and a collection of systems: the lower you go, the lower the variety of each homeostat, but the entire mass below one stratum border has enough variety to model most of what’s above, even if most of its components are invested entirely in monitoring each other. This is not neural net layering but the chaining of entire nets: the choke points between nets slow the spread of chaos, deferring the inevitable descent into white noise.
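
The sense in which the Zipf distribution encodes a prediction system can be made concrete by measuring average surprise: under a Zipfian rank-frequency law (with an assumed vocabulary size), the next word costs several fewer bits of surprise than it would under a uniform distribution, which is what makes the channel predictable at all. A sketch:

```python
import math

V = 50_000   # assumed vocabulary size
H = sum(1 / r for r in range(1, V + 1))      # harmonic normalizer
p = [1 / (r * H) for r in range(1, V + 1)]   # Zipf: P(rank r) proportional to 1/r

# Average surprise (Shannon entropy) of the next word, in bits.
zipf_bits = -sum(q * math.log2(q) for q in p)
uniform_bits = math.log2(V)   # every word equally likely: maximum surprise

print(f"Zipfian stream: {zipf_bits:.2f} bits/word")
print(f"uniform stream: {uniform_bits:.2f} bits/word")
```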

This is the heat death of the mind: the set of all information systems, having spread all surprises roughly equally across the universe, spins down as across each layer fewer and fewer adjustments need to be made, until the only movements are the cracking of the slowly solidifying ice of a world that knows itself as accurately as possible and is therefore incapable of conceptualizing things like ‘meaning’ or ‘communication’. Its work is done, after tossing and turning for trillions of years, and it can now rest forever. A solid block of white noise: a silver bullet for awakening some other universe.