Naturalistic signals are better for the soul

December 10, 2023

For those who have a depth of knowledge greater than my capacity for understanding, I apologize in advance for the bastardization of the ideas you hold most sacred. Onward.

Often, space from technology is to be desired. Separation, isolation, and distance all have their redeeming qualities. In our health, in our communication, in life. But what about togetherness, connection, and symbiosis? These too carry favorable meanings.

Information

When we look at communication, we observe information, verbal and non-verbal, being transmitted from one party to another. In Shannon’s information theory, it’s messages being delivered down a channel: a stream of information transmitted from an encoder to a decoder, or receiver. The encoder decides (or doesn’t) how to say the message, that is, how to encode the signal, and the decoder interprets it. Often we speak of things being “lost in translation”, which is just a signal not being decoded true to form.

Enter noise. Noise distorts the signal. Noise causes errors. The size of the channel, or the channel capacity, determines how much information can be communicated. In a noisy channel, the maximum rate at which information can be transferred is reduced to make room for the noise. Noise can take many forms, such as audible sound, packet loss, information redundancies, and cognitive bias. Imagine congestion on a freeway with an accident impeding the rate of travel. If our message is the desired signal and noise is the undesired signal, the signal-to-noise ratio is our measurement of how much desired information we’re getting within the maximum channel capacity. The maximum amount of runway, lanes, space.
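
To make the freeway picture concrete, Shannon’s capacity formula for a noisy channel, C = B log2(1 + S/N), says that capacity grows with bandwidth and shrinks as noise rises relative to the signal. A minimal sketch in Python (the 3 kHz voice channel and the power figures are hypothetical, picked only for illustration):

```python
import math

def channel_capacity(bandwidth_hz: float, signal_power: float, noise_power: float) -> float:
    """Shannon-Hartley capacity of a noisy channel, in bits per second."""
    snr = signal_power / noise_power          # signal-to-noise ratio (linear, not dB)
    return bandwidth_hz * math.log2(1 + snr)

# A hypothetical 3 kHz voice channel: more noise, less room for the message.
print(channel_capacity(3000, signal_power=100, noise_power=1))    # ~19,974 bits/s
print(channel_capacity(3000, signal_power=100, noise_power=50))   # ~4,755 bits/s
```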

Digital Information

With computing, we understand that we start with 1s and 0s and end up with the complex systems of today. These bits, or binary digits, are discrete variables: symbols representing true or false, yes or no, on or off.

Take the letter Q. Q is the 17th letter in the English alphabet and may be represented by the number 17 in the decimal system. In the ASCII code computers use, Q is instead assigned the number 81, encoded with those 1s and 0s as 01010001; U is 85, or 01010101. Combinations of these values or symbols are used to represent the very language on this page. The words are encoded as bits and transmitted across a channel to you, the reader.
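
A rough sketch of that encoding, leaning on Python’s built-in ASCII/UTF-8 tables (the strings are arbitrary):

```python
def to_bits(text: str) -> str:
    """Encode each character as its 8-bit ASCII/UTF-8 value."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

print(to_bits("Q"))    # 01010001
print(to_bits("QU"))   # 01010001 01010101
```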

As each bit is received by the decoder, how unlikely a specific outcome was determines its surprise. For a fair binary choice, each outcome has a 50% chance: it’s either 1 or it’s 0. Surprise! In isolation, binary digits aren’t very telling; a single one contains little in terms of measurable information. But over sequences, the average amount of surprise across the set of outcomes is the information entropy. The more information contained within the message, the more bits, the more uncertainty, the greater the entropy.
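
In Shannon’s terms, the surprise of an outcome is −log2 of its probability, and entropy is the average surprise over all outcomes. A small sketch:

```python
import math

def entropy(probabilities):
    """Average surprise, in bits, over a probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit   -- a fair coin flip, maximum uncertainty
print(entropy([0.9, 0.1]))    # ~0.47 bits -- mostly predictable, little surprise
print(entropy([1.0]))         # 0.0 bits  -- certain, no information gained
```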

Viewing language as the same kind of stochastic process, we can select the next word in a sentence from a set of candidate words and their likelihoods. Remember those fill-in-the-blank quizzes? If we can assign a probability to each bit, we can assign one to each word in the English language. This probability is the likelihood that the word fills the blank, and specifically that it follows the word before the blank or precedes the one after.
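
A toy version of that fill-in-the-blank idea: pick the next word from a hand-assigned conditional probability table (the words and probabilities below are invented, not drawn from any real corpus):

```python
import random

# Hypothetical conditional probabilities: P(next word | previous word)
next_word_probs = {
    "athlete": {"runs": 0.5, "jumps": 0.3, "rests": 0.2},
    "runs":    {"in": 0.6, "fast": 0.3, "away": 0.1},
}

def pick_next(previous: str) -> str:
    """Sample the next word in proportion to its assigned likelihood."""
    candidates = next_word_probs[previous]
    words, weights = zip(*candidates.items())
    return random.choices(words, weights=weights)[0]

print(pick_next("athlete"))  # most often "runs", sometimes "jumps" or "rests"
```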

Think of the simple sentence, “The athlete runs in a blue shirt.” Our uncertainty is greater than if we had simply stated “The athlete runs.” We have more information, more bits, more entropy. The information common to the two sentences is what is known as mutual information. If we know the second statement, the only uncertainty in the first sentence comes from “in a blue shirt.”
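
Formally, mutual information is the uncertainty two variables share, I(X;Y) = H(X) + H(Y) − H(X,Y). A toy sketch with an invented joint distribution:

```python
import math

def H(probs):
    """Entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented joint distribution over two related statements X and Y.
p_joint = {("x0", "y0"): 0.4, ("x0", "y1"): 0.1,
           ("x1", "y0"): 0.1, ("x1", "y1"): 0.4}

p_x, p_y = {}, {}  # marginal distributions
for (x, y), p in p_joint.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

# I(X;Y) = H(X) + H(Y) - H(X,Y): the uncertainty the two variables share.
mutual_info = H(p_x.values()) + H(p_y.values()) - H(p_joint.values())
print(round(mutual_info, 3))  # ~0.278 bits of shared information
```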

Information as Meaning

The more mutual information or context we have, the less uncertainty, surprise, or entropy in the message. Context is a key that enables the decoder to unlock the meaning of the message. With knowledge of the English language, a reader can retrieve the semantic meaning.

Context enables us to take these discrete concepts and plot the relationship or relevance between them in information space.3 An individual deeply knowledgeable in a subject would have a greater depth of understanding, a larger set of potential answers to a question to select from. They would know the distance and relationship between these related ideas. A picture may be drawn in their mind.4

“The idea is that if you make a list of features of chairs, having legs, having backs, and what not; then some chairs are backless, some chairs have no legs, and so on. So if we accept these (for the sake of argument) as simple “yes-or-no” characters, then over a long experience of the word “chair” we should build up a concept of “chairfulness”, which could be defined by the proportions of different characters in the ensemble of all chairs experienced.” – Donald M. MacKay 3
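
One way to read that “chairfulness”: record each encountered chair as a vector of yes-or-no features and take the proportions across the ensemble. A toy sketch (the features and examples are invented):

```python
# Each observed chair as yes/no features: (has legs, has back, has arms)
observed_chairs = [
    (1, 1, 0),  # a kitchen chair
    (1, 1, 1),  # an armchair
    (0, 1, 0),  # a pedestal chair, no legs
    (1, 0, 0),  # a backless stool
]

# "Chairfulness": the proportion of each feature across all chairs experienced.
n = len(observed_chairs)
chairfulness = [sum(feature_column) / n for feature_column in zip(*observed_chairs)]
print(chairfulness)  # [0.75, 0.75, 0.25] -- most have legs and backs, few have arms
```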

With the word “quarter,” what is the likelihood in the context of an American football playbook that the transmitter will be referring to the “quarterback” as opposed to the “fiscal quarter” of a financial report? Selecting from a set of already built up possibilities allows the decoder to reduce the entropy or uncertainty of the input message. This selective function, choosing from a set of experiences or current state, is the meaning of the message.3
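
That selective function can be pictured as choosing from context-specific probabilities built up from experience. A sketch with hypothetical numbers:

```python
# Hypothetical likelihoods of what "quarter" refers to, given the surrounding context.
sense_given_context = {
    "football playbook": {"quarterback": 0.90, "fiscal quarter": 0.01, "coin": 0.09},
    "financial report":  {"quarterback": 0.01, "fiscal quarter": 0.95, "coin": 0.04},
}

def decode(word_senses: dict) -> str:
    """Select the most likely sense: context narrows the set and reduces uncertainty."""
    return max(word_senses, key=word_senses.get)

print(decode(sense_given_context["football playbook"]))  # quarterback
print(decode(sense_given_context["financial report"]))   # fiscal quarter
```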

Receiving a message modifies the ability to act on the information or our state of conditional readiness. Information enables us to take action when an appropriate event arises. Just as we reduce the entropy of a message, changing our state of conditional readiness decreases entropy in the broader human experience.3 The second time the message “Help!” is received will have a different semantic meaning than the first. The information is the same but the meaning has changed because our individual state has already been modified by the first message. We may view the second message as increasing (or decreasing) our willingness to respond. Information is an instrument for affecting events in the real world.5 The greater information we have, the more meaning we may derive.

Information on the Brain

Human brains, specifically neurons, react to sensory information by encoding signals, or stimuli, received across a channel. The rate of neuron firing, or spiking, lets us gauge the speed and efficiency of information transfer and computation in the brain. Combinations of spikes and silences over time are called spike trains. These spike trains are groupings of nerve impulses, or action potentials, of the cell.6 Imagine a heart rate monitor at a hospital bed with each heartbeat representing a “spike” and the grouping of them representing the “train”. Each action potential is our 1 or 0.
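
A spike train can be read as exactly the kind of binary sequence described earlier: divide time into small bins and mark each bin 1 if the neuron fired, 0 if it stayed silent. A minimal sketch (the spike times and bin width are invented):

```python
# Hypothetical spike times (seconds) recorded from a single neuron.
spike_times = [0.002, 0.013, 0.016, 0.041, 0.044, 0.058]

bin_width = 0.005   # 5 ms bins
duration = 0.060    # 60 ms of recording
n_bins = int(duration / bin_width)

# 1 if at least one action potential landed in the bin, 0 otherwise.
spike_train = [0] * n_bins
for t in spike_times:
    spike_train[int(t / bin_width)] = 1

print(spike_train)  # [1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1]
```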

Through measurement of these spike trains, it’s been found that information from naturalistic stimuli is transmitted 2 to 6 times more effectively than information from artificial variants. Images of nature and sounds from our environment are received at a higher rate of information transfer. The coding mechanism is effectively tuned to match these signals, and the signal-to-noise ratio is on average higher. A neuron’s ability to encode and decode messages is about 20% efficient with artificial signals as opposed to up to 90% with natural ones.7 We simply process naturalistic signals better.

Continuous Signals

Often real channels have continuous, rather than discrete, inputs and outputs. Naturalistic signals are inherently continuous. It’s not 1 or 0; it’s somewhere in between. Continuous variables, like points along a line, can take infinitely many values within a range.8 Imagine changing light measured over time, as opposed to a strobe light that is either on or off. In principle, such variables could convey unbounded information, with no space between points.

Analog sound is continuous. Digitizing audio means sampling the wave’s amplitude at points in time, at a certain frequency: turning that wave into 1s and 0s, discretizing the continuous signal. The fewer points sampled along the line, the smaller the audio file; a line turned into a dotted line, containing less information. With continuous inputs, we run into the practical limits of what the receiver can accept: a theoretically infinite audio file, or one as large as the brain can encode.
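
A rough sketch of that discretization: sample a continuous sine wave at two different rates and count how many points each version keeps. The tone, rates, and duration below are arbitrary:

```python
import math

def sample_tone(freq_hz: float, sample_rate_hz: float, duration_s: float):
    """Sample a continuous sine wave at discrete points in time."""
    n_samples = int(sample_rate_hz * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate_hz) for i in range(n_samples)]

# The same one-second 440 Hz tone, sampled coarsely and finely.
coarse = sample_tone(440, sample_rate_hz=8_000, duration_s=1.0)
fine = sample_tone(440, sample_rate_hz=44_100, duration_s=1.0)
print(len(coarse), len(fine))  # 8000 vs 44100 points: a dotted line vs a denser one
```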

Naturalistic Stimuli for Meaning

If processing more naturalistic stimuli leads to greater information gain, and if more information or context reduces entropy in the real world and releases more meaning, why are we not actively orienting around these principles?

Arguments have been and will be made that more information isn’t better. There’s more noise than signal in the attention-grabbing spurts of the daily news.9 But maybe meaning lies in the continuum of naturalistic signals. Long-form, emotion-driven stories. Movies with multi-sensory inputs and pictures that reflect the human environment. Live sound that evokes emotion like joy or sadness.10 Meeting in person for immersion in all of the available sensory inputs. Narratives about humans, because stories help us not only learn but also remember and reconnect with ideas later.11

We’ve spent more than 60 years edging deeper into a form of man-computer symbiosis, a tighter coupling between machine and organism.12 We’re only scratching the surface of blending environment and technology to transmit more meaningful information. The brain firing neurons to trigger our fingers to select the appropriate symbols representative of information is archaic. Simply typing is a roundabout way to transmit information: encoding thoughts for a slow channel. English, the primary language of the Internet, is itself an inefficient encoding method. Redundant, like our friend Q needing to be followed by U.13 Surely the folks at Neuralink and beyond have reasoned about this, but what if we can garner connection through external stimuli and reduce that space before being chipped?

Developments in generative artificial intelligence and multimodal models can enhance static input information with more sensory outputs and emotion-laden narrative to improve human consumption, or decoding: creating naturalistic signals from otherwise artificial information sources. A world where computation is used to determine the most effective input method for the brain, bridging the divide where written language is currently the connector of our man-computer symbiosis. Closing in on the ultimate reduction of space, towards unlocking more meaning from our reality.

References

1 Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27(3), 379-423.

2 Stone, J. V. (2015). Information Theory. Sebtel Press.

3 MacKay, D. M. (1969). Information, mechanism and meaning (Vol. 134). MIT Press.

4 Kenny, A. (2008). Wittgenstein. John Wiley & Sons.

5 Weizenbaum, J. (1976). Computer power and human reason: From judgment to calculation.

6 Bialek, W., & Rieke, F. (1992). Reliability and information transmission in spiking neurons. Trends in Neurosciences, 15(11), 428-434.

7 Rieke, F., Bodnar, D. A., & Bialek, W. (1995). Naturalistic stimuli increase the rate and efficiency of information transmission by primary auditory afferents. Proceedings of the Royal Society of London. Series B: Biological Sciences, 262(1365), 259-265.

8 MacKay, D. J. (2003). Information theory, inference and learning algorithms. Cambridge University Press.

9 Taleb, N. N. (2016). Fooled by Randomness (p. 60). Random House. Retrieved from https://fs.blog/the-problem-with-information/.

10 Goldberg, H., Preminger, S., & Malach, R. (2014). The emotion–action link? Naturalistic emotional stimuli preferentially activate the human dorsal visual stream. NeuroImage, 84, 254-264.

11 Hughes, J. M., Oliveira, J., & Bickford, C. (2022). The Power of Storytelling to Facilitate Human Connection and Learning. Impact, 11(2), 17.

12 Licklider, J. C. (1960). Man-computer symbiosis. IRE Transactions on Human Factors in Electronics, (1), 4-11.

13 Roberts, E. (1999). Entropy and Redundancy in English. Bits and binary digits. https://cs.stanford.edu/people/eroberts/courses/soco/projects/1999-00/information-theory/entropy_of_english_9.html

14 Miłkowski, M. (2020). Thinking about Semantic Information. AVANT. Pismo Awangardy Filozoficzno-Naukowej, (2), 1-10.
