Consciousness as memory encoder

The "other" ERE. Societal aspects of the ERE philosophy. Emergent change-making, scale-effects,...
boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Consciousness as memory encoder

Post by boomly »

The consciousness experiences all patterns.
It registers the difference between a blue triangle and a yellow circle, the taste of macaroni and cheese, the sound of a sonata, the abstract concept of democracy, the smell of burning leaves, the process of turning over a physical object.

Only one at a time, though.
If we try to split our attention equally between two different things, our attention alternates. If we try to focus on the background noise of a fan or traffic, and some small object in our field of view, our attention cannot hold both equally. It can only hold one pattern at a time. All other things in our environment are held in working memory. If we look at an optical illusion of "2 things in one", like the old lady and young woman, the candlestick and two faces, the stack of cube corners, we can only "see" one way at a time. Because our consciousness only holds one pattern at a time.

The brain needs a universal "language"
The brain easily combines diverse patterns in the same mental space - the same processor. For this to happen so easily, those patterns need to be in the same format, have the same encoding. What mechanism does this for the brain? What ensures that all information in the brain is encoded to the same standard so that patterns can easily be associated, correlated, combined?

All information remembered in the brain (cerebrum, at least) has gone through consciousness.
This may not be entirely true, as the brain is a slobby organic thing, but it seems true for the most part. Take hearing, for example. Large amounts of information hit our eardrums constantly, but hardly any of it is remembered, especially if we're focused on other things. When an unusual sound is heard, the auditory center needs it to be remembered - and runs it through consciousness to encode it for memory. Before this encoding occurs, the raw sound data is not available to other parts of the brain for processing. After encoding, the pattern of the strange sound can be compared and associated with all the things that might have made it. The sound doesn't reach our attention because we need to focus on it; it reaches our attention because it's being sent to the universal encoder of the brain, so it ends up in a format ready for processing.

The brain acts on prepatterned information before it reaches consciousness.
If the visual cortex sees a danger that is already an encoded pattern, it will initiate avoidance processes with the rest of the brain before the danger runs through consciousness. Also, if we walk or drive somewhere, we may go some distance without forming any memories of it, because no new patterns needed to be encoded by consciousness. The visual cortex was seeing completely familiar landmarks which needed no updating, and the walking/driving processes were such refined patterns that they needed no new updating. No new patterns - no need for the encoder that makes them.

Unconscious competence happens similarly
The patterns are so refined they need not be updated with new ones, so they operate without clogging up the memory-encoding device. With conscious competence, we are still updating and refining the patterns, so we still need to run them through consciousness for re-encoding. Each new instance of "Aha, that's a better way" or "No, not quite that" needs to be run through consciousness to be encoded into memory as a pattern.

Grasping at an insight
The processing centers of the brain, working with far more information than can be run through the consciousness, will come up with a solution. At first, it will be too unwieldy to turn into a single pattern. We get this funny feeling that we know the answer, but can't quite grasp it. Eventually, the processors figure out a way to work it through the consciousness, first as a series of patterns, then fewer patterns organizing the previous ones, then a single pattern organizing those (or simpler or more complex, depending on the problem). Once the consciousness has turned it into neat patterns with a "handle" pattern, it is easily brought up to be used in processing, or further refined, or to simply operate in the subconscious unchanged by updates.

Need for sleep
Everything with a brain needs to sleep. Why would a mechanism like consciousness absolutely need to be turned off from time to time? For recalibration? Possibly, if the consciousness doesn't get recalibrated, the quality of the information it encodes gets more and more degraded until it's not useful. Sleep deprivation experiments show that brain functions decline from lack of sleep. Is it because memory encoding degrades? Possibly.

daylen
Posts: 2538
Joined: Wed Dec 16, 2015 4:17 am
Location: Lawrence, KS

Re: Consciousness as memory encoder

Post by daylen »

I am curious to see where this train of thought leads! Much of this aligns with the paradigm of consciousness I was using a few years back when piecing together the cognitive functions, which could be called "modules" of the encoder in your system. One question that may arise is: what exactly is a pattern? When does a pattern begin and end? I eventually approached this by bringing the patterns to the foreground and moving "activities" to the background. Though, the cool thing about consciousness or psychology is that it is centered on a super-complex piece of machinery with many modelling pathways.

On one hand, I can see how we could take the parallel-processing analogy to its end in the computing world, where each core is ultimately executing one thread at a time but switches so quickly that it feels parallel. On the other hand, classic transistors are designed to avoid quantum effects, whereas neurons and their sub-scale "components" may not be entirely classical (i.e. may be somewhat wave-like). Based upon my limited current understanding of research on the "quantum mind", it seems as though these wave-like properties could be quite major/fundamental to whatever consciousness is.
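The single-core analogy above can be made concrete with a toy sketch (the function name, streams, and tick count are all invented for illustration): one "processor" services several input streams but only ever holds one per tick, and the rapid alternation is what produces the appearance of parallelism at a coarser timescale.

```python
import itertools

def attend(streams, ticks=6):
    """Toy model of attention as time-slicing: a single processor
    holds exactly one stream per tick, cycling between them so that
    coarse-grained coverage looks parallel."""
    focus_log = []
    rotation = itertools.cycle(streams)
    for _ in range(ticks):
        focus_log.append(next(rotation))  # only one pattern held at a time
    return focus_log

print(attend(["fan noise", "small object"]))
```

Run it and the log shows strict alternation between the two targets; neither is ever held "simultaneously" with the other, matching the attention-alternation observation in the opening post.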

This leads me to think the mind is more similar to a hologram in that it is fundamentally non-local (i.e. spread out). One assumption that may seem counter to this is that it takes time for the inputs to be processed, thus our experience of reality lags in time. This would seem to align more with an instrumentalist position that trusts our use of instruments to segregate and measure these waves as they propagate within the speed of light. Though, it does seem like you run into similar problems with that approach as you would with assuming that consciousness is infinite and immediate.(*)

In terms of a universal language, as far as I am aware, Chomsky is the go-to for attempting to map out a universal grammar of humans. Some conflict arises with people who gravitate more towards the Lakoff camp, which takes metaphor as fundamental. This leads to some interesting intersections of "original metaphor" and the "collective unconscious", where the human species may have pre-programmed ways of functioning (e.g. breathing, digestion, etc.) that act metaphorically to build new ways of functioning. Though, there are clearly some cross-cultural similarities in grammar that seem to transcend cultural differences. These two sides feel like one coin, but I am not sure how yet. Perhaps no mechanism as we currently know them maps nicely onto the dialectical solution.

Memory is also a curious construct, as there are many ways to remember that appear to be vaguely localized in the brain. The hippocampus is like a hub of particulars, and the cerebellum helps with fine-tuned motor control. Hence, many of the areas associated with "memory" are localized more toward the center, where they act as routing systems for the surrounding "hologram" or simulated projection of experience. This also relates to another attempt at modelling consciousness, namely IIT or Integrated Information Theory, in which consciousness is a property of the non-local connections of matter. From this vantage point, you start to run into interesting thought experiments concerning the degree of consciousness of a chair relative to a squirrel or human.

Another link that may lead to questions of the mind being a "blank slate" is the snake or spider pattern. Young kids seem to be pre-programmed to recognize and be fearful of objects that move like "creepy crawlies" without ever being taught. Hence, there may be some proto-patterns that serve as metaphors to make sense of actual patterns.

With respect to grasping an insight, it may be so that feelings are not required to be clear or insightful so long as they move us away from the snake or bad marriage. In my system, I have feelings and intuitions mapping into the "connotative" or "between the gaps". Once you can make the pattern explicit and tangible it becomes "denotative" or "of the gaps", indicating a sensation, thought, or "neat pattern".

A holographic or wave theory of consciousness might suggest that deep sleep is a way to let reverbs run their course so as to reduce the probability of interference when it matters. So, an agent may get rattled by a snake encounter during the day, and the situation may linger in their mind as a reverb pattern. Sleep would then help conjure up dreams or loose simulations detached from typical waking function to help put the situation in perspective; the mind would slowly wind down as the reverb loses momentum in the vast silencing that eventually sends up a jolt of energy to start again, with a brain less likely to be clouded by the mistakes of the day prior.

Anyway, I am just ranting a bit as this stuff is fun to engage with and gets my wheels turning.

(*) This is what I tend to lean towards as the "ground" in such conversations, but this was by no means where I started. It has taken a while for me to see the elegance and simplicity in this approach. Some of the intuition came from ideas in simulation theory and from the anthropic principle.

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

In terms of a universal language, as far as I am aware, Chomsky is the go to for attempting to map out a universal grammar of humans.

I was thinking more on the lines of an internal universal language for a single brain.

The kinds of data the brain interprets are very diverse, yet it appears to be able to process them all with the same architecture.
For example, the data coming from the eyes is quite different from the data coming from the ears. It would be like the difference between a jpeg and an mp3.

But the simulation creator of the brain can easily manipulate them as if they're the same kind of file. If we're listening to music, we can convert it into imagined sights, and vice versa. People who have psychedelic experiences "blend" different kinds of patterns, experiencing things like synesthesia.

This indicates that the brain operates with a single "file format" for diverse kinds of data. The brain pattern. This brain pattern is indeed a physical thing, as neuroscientists, in experiments, can identify the specific neuronal connections associated with specific experiences of a pattern.

The question I asked myself was: "What ensures these brain patterns are in the correct format?" It seems unlikely that the auditory center, operating separately from the visual cortex, would encode a pattern in exactly the same file format for use in the simulation engine of the brain. So, the brain would need a "Universal Brain Pattern Encoder" to make sure a sound file, a visual file, and an abstract-concept file would all be easily processed in the simulation engine.
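The "one file format for every modality" idea can be caricatured in code. This is purely a toy analogy of my own (the hashing, the 8-byte "format", and every name here are invented for illustration, not a claim about how brains encode anything): wildly different inputs are forced into one fixed-size representation, after which downstream machinery can treat them interchangeably.

```python
import hashlib

DIM = 8  # the one shared "file format": every pattern becomes an 8-byte vector

def encode(modality, payload):
    """Toy 'Universal Brain Pattern Encoder': regardless of the source
    modality, the output always has the same fixed-size format."""
    digest = hashlib.sha256(f"{modality}:{payload}".encode()).digest()
    return list(digest[:DIM])

sound = encode("audio", "strange thump")
sight = encode("visual", "yellow circle")
idea = encode("concept", "democracy")

# One shared format means a downstream "simulation engine" can mix
# patterns without caring where they came from.
assert len(sound) == len(sight) == len(idea) == DIM
```

The point of the sketch is only structural: once everything shares a format, comparing or combining a sound pattern with a visual pattern becomes trivial, which is the property the post argues the brain needs.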

When a sensory center of the brain senses a new pattern that needs to be remembered, it would ask the Universal Brain Pattern Encoder (UBPE) to create the correct file for the pattern. This would be the experience of our "attention" shifting to anything strange or new in our environment.

It would also explain the difficulties of meditation. Various processes of the brain would constantly want to use the UBPE to encode new patterns, which is why beginning meditators experience the disconcerting feeling that "they" do not control their own attention - other parts of the brain are constantly hijacking it (to encode new patterns).

daylen
Posts: 2538
Joined: Wed Dec 16, 2015 4:17 am
Location: Lawrence, KS

Re: Consciousness as memory encoder

Post by daylen »

It seems to me as though the file format is a distributed network modification. This has at least three components:

1. Information as a difference that makes a difference to a network. So, vibrations in air pressure make a difference to the oscillation of the eardrum, leading to a difference in the network.

2. Neurons that fire together wire together. This is Hebb's rule, and it suggests that there are prior pathways in which the difference induced in the network can lead to a cascade of neuron firings that will tend to wire up for next time. The ringing sound gets associated with a bell.

3. The default mode network filters out what isn't desired. Large-scale brain networks can be correlated together, indicating they act like a state (i.e. either on or off). Such networks will tend to filter information differently. Thus, if you are in a more alert state expecting danger, then sounds you would normally brush off become salient.
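Point 2 has a standard textbook toy form. The sketch below is the simplest rate-style Hebbian update (the trial data and learning rate are made up for the bell example): a connection weight grows only on trials where the "bell seen" and "ring heard" units are active together.

```python
def hebbian_step(w, pre, post, eta=0.1):
    """Hebb's rule in its simplest form: the weight between two units
    grows only when both fire in the same step ('fire together,
    wire together')."""
    return w + eta * pre * post

w = 0.0
trials = [(1, 1), (1, 1), (0, 1), (1, 1)]  # (bell seen, ring heard)
for pre, post in trials:
    w = hebbian_step(w, pre, post)

print(round(w, 2))  # 0.3: three co-activations; the lone miss adds nothing
```

Nothing here requires a central encoder, which is the crux of the disagreement in the posts that follow: in a plain Hebbian net, associations form wherever co-activation happens.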

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

daylen wrote:
Sat Sep 11, 2021 7:48 am
It seems to me as though the file format is a distributed network modification.
Is the brain definitely not that? If it behaved like one, even somewhat, wouldn't it need a file format?

Because the brain is constantly running simulations of reality. How could it do that if it couldn't combine different kinds of data in the same space?
The ringing sound gets associated with a bell
In our simulation space, we can easily imagine a bell ringing, not ringing, making a buzzing sound, associated with the smell of hotdogs grilling. We can convert the sound of the bell into a visual representation of waves. We can turn the bell purple, or make it invisible. etc.
All of those things take place in the same space. In the absence of external stimuli. Wouldn't those patterns need to be similar in order to easily do that?

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

Neurons that fire together wire together.
I am somewhat aware of how neural nets work. My assumption was that an organic neural net, operating alone, would evolve so far away from other neural nets interpreting completely different kinds of data that it would need some outside modifier to make the patterns it creates fit into the larger network.


We can create patterns completely out of thin air, without external stimuli. For instance, "A purple rhinoceros whistling Dixie while thinking about fascism". Those are now patterns encoded into my (and your) neurons. How did that happen without some kind of "standard" way of encoding?

daylen
Posts: 2538
Joined: Wed Dec 16, 2015 4:17 am
Location: Lawrence, KS

Re: Consciousness as memory encoder

Post by daylen »

I suppose I do not quite understand what you are getting at. The analogy with a classical computer is just that: an analogy, not a true representation. In some sense, everything can be modeled as a network, and such networks can have vastly different properties.

A file on a computer has a very particular location in memory, but the brain doesn't really store "files" in this manner. There isn't really a "central processor" that everything must first go through. The processing is highly distributed in spacetime, though the "data" is in some sense always a way of connecting [in spacetime] so that reconnection is possible given similar conditions. Such conditions can be described in any number of ways, but our human striving for linear, this -> that explanations will tend to fall short of what is actually going on.

So, yes, the patterns are indeed similar as they are quite simply connections between regions of spacetime experienced as cyclic framings between points of fixation.

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

Maybe I'll just describe how I came to the conclusion.

I started out with the prevailing conception of consciousness as "awareness". The more aware and understanding of things I was, the more conscious I was.

I read an article on consciousness which described how consciousness must discern between two differently colored shapes. Like a blue circle and a yellow triangle. "It doesn't need to just discern between those kinds of things," I thought, "It discerns between sights, sounds, tastes, proprioception information, abstract concepts."

I realized that consciousness must be some universal "holder of information". Most other parts of the brain only process more limited kinds of information. I even drew out some complicated diagrams with feedback loops speculating where this "universal experiencer" might fit into the information flow in the brain, where most other parts were more specialized.

I also realized, partly from experimentation, that the consciousness only holds one pattern at a time. Once you pay close attention, it's fairly inescapable. The point of consciousness holds a single "unit of information" at a time. The rest of our situational awareness resides in working memory and short term memory.

That was a weird realization. Consciousness holds only one piece of information (a brain pattern, I presumed) at a time, and that information can be any kind of information in the brain.

This was different from the "situational awareness" paradigm I had always assumed to be true.

"What would the brain need such a mechanism for?" I wondered.

I called it "single-pattern chokepoint consciousness", because it appeared that the consciousness was a chokepoint that restricted data flow down to a single brain pattern.

I spent several months with the idea rolling around in my mind.

At some point I started wondering how all the information in the brain was organized into neat little packages that fit through the "chokepoint", or "gate" of consciousness. What mechanism ensured that brain patterns were the precise "size" and "format" to efficiently be "experienced" by consciousness? Brain patterns of different information stored in all the various parts of the brain seemed to fit precisely, one at a time, through consciousness.

And, why would the brain even have such a thing? The whole thing seemed slightly useless, as so much processing is done without the aid of this weird single "chokepoint".

I turned it around. A pattern fits exactly through consciousness because consciousness is the mechanism that creates the pattern! Consciousness was not the "read device" of the brain, it was the "write device"!

This was also a weird realization, as neural nets do not normally require such a mechanism. Patterns are created on their own in neural nets, and need no "write device". Why did the brain need such a thing?

The brain routinely (near constantly, really) combines various patterns in innumerable scenarios in reality simulations. It then stores information back in the neural nets. (Yes, I realize that the "simulation engine" is distributed throughout the brain). It would simply be more efficient to have some mechanism that reaches into a neural net and "writes" in a pattern, without it ever having come through the normal sensory channels.

Very speculative, but it seems that doing this quickly and efficiently would be better aided by having a standard format for any new pattern that develops in the neural net. Some mechanism that can reach into the net and make it conform to a standard for efficient processing in the brain's simulation engine.

Contemplating this, I found that the brain processes requiring new information to be stored are associated with being conscious and actually experiencing the pattern in consciousness. Brain processes that run without needing new patterns "standardized" run in the subconscious, and/or we have no memory of them.

The theory explains things like short term memory creation, situational awareness, the various effects of meditation, the difference between systems 1 and 2 thinking, why sleep walkers behave the way they do, why neuronal patterns "wander" around in brains, why we need to sleep, why we dream, what we dream about.

Basically, it comes from the observation that new or modified patterns in the brain require consciousness, and nothing else in the brain does. The whole "file format" idea was just speculation about why consciousness was present at the creation or modification of a pattern.

jacob
Site Admin
Posts: 15980
Joined: Fri Jun 28, 2013 8:38 pm
Location: USA, Zone 5b, Koppen Dfa, Elev. 620ft, Walkscore 77
Contact:

Re: Consciousness as memory encoder

Post by jacob »

You guys might find https://www.amazon.com/Gestalt-Psycholo ... 000NPYUH6/ interesting. It's an old book (early 20th century) but covers some ideas in early brain/visual-perception research that match up with what you're discussing here, and perhaps also daylen's frame-system. It's mostly focused on visual perception and introduces the concepts of the "visual field" (<- now seems like an obvious thing, but it isn't and wasn't) and "perceptual organization". For example, what makes the mind capable of seeing a square made out of four lines? Why do all humans of all ages, and indeed all mammals, see XX XXXX as two groups of 2 and 4 (not 3 groups of 2)? Is grouping an innate pattern mechanism? Seemingly so.

daylen
Posts: 2538
Joined: Wed Dec 16, 2015 4:17 am
Location: Lawrence, KS

Re: Consciousness as memory encoder

Post by daylen »

Interesting. I would be curious to hear your thoughts on how these conclusions can explain the stuff you mentioned. I can feel/intuit much of what you are saying as the structure of your thoughts seems familiar, but I think there may be some interesting differences in word-choice. Here is how I might put it in attempted alignment.

In terms of the figure-ground distinction in Gestalt psychology, I would say that consciousness is the background of all other "ground". That is, if nothing seems particularly salient then that approaches base consciousness. Attention being a mechanism that brings figures into the foreground relative to some background. A fit being a point/frame combo that encapsulates this relativity (using an intuitive notion of information or complexity). This would align with your idea of a choke-point only processing one fit at a time. An actuator being a dual-fit that switches quick enough to feel like a flow. All the while this is happening, consciousness is still the ultimate back-drop, being forever out of grasp. Like trying to grasp air, the harder you try the less you "contain".

This also "fits" in with the concept of senses being one derivative removed from attention. We attend to coldness in relation to hotness when the rate of temperature change is "quick", as the transition between "patterns" feels punctual in a situation like exiting your warm home to make snow angels outside. Though, if you resist the temptation to attend closely, then it diffuses into a more generally conscious state in which the hot->cold transition is peripheral and thus less "painful". A more extreme example being humans that sacrifice themselves for their ideology by burning themselves alive without flinching. Pain arises from the seemingly punctual transitions of attention and not from consciousness itself.

So, then perhaps attention writes to the device when patterns are bridged initially (i.e. a punctuation or change in brain topology) and consciousness in a more diffuse sense writes gradual changes to the device (i.e. a strengthening or weakening of connection between prior bridges).

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

I'm thinking that what I'm calling consciousness, you might be calling awareness or attention.

On dreaming

The supposition is, the brain has a memory encoding mechanism that ensures new memories formed meet a standard, so they can be efficiently processed (by such processes as the simulation engine). The activity of this mechanism is what we are aware of in (or on, or against?) consciousness.

But the mechanism, which makes sure new memories meet an exact standard, is part of an organic neural net. As it works, encoding memories, it changes itself. The encoding standards drift. If the standards continue drifting, the simulation engine wouldn't be comparing apples and oranges; it would be comparing apples and gr%ffl!ftg.

What to do?

Turn off the mechanism. With this, the activity in consciousness ceases, and the organism sleeps.

Find a good, strong reference memory. One that the brain has thought about a lot, maybe has multiple copies of, and that has also been around for a while. Produce a copy of the reference memory with the encoder, and see how it compares to the original. Tinker with the mechanism. Produce another copy, and see what the simulation engine does with it. Tinker with the mechanism. Keep producing copies (into the simulation engine) and tinkering with the mechanism until the copies match the original reference.

The memory encoding mechanism is now ready for another day of standardizing memories!
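Taken literally, the copy-compare-tinker cycle above is a feedback-correction loop. Purely as a toy of this speculation (drift is modeled as a single scalar bias, and the 0.5 correction factor is arbitrary): re-encode a trusted reference, compare the copy to the original, nudge the encoder, and repeat until copies match.

```python
def recalibrate(drift, reference, tol=1e-3):
    """Toy sleep-recalibration loop: re-encode a reference memory,
    compare the (drifted) copy against the original, and tinker with
    the encoder until copies match within tolerance."""
    steps = 0
    while abs(drift) > tol:
        copy = reference + drift   # produce a copy with the drifted encoder
        error = copy - reference   # compare the copy to the original
        drift -= 0.5 * error       # tinker with the mechanism
        steps += 1
    return drift, steps

drift, steps = recalibrate(drift=1.0, reference=42.0)
print(steps)  # drift halves each pass: 10 passes to fall below 1e-3
```

The value of the reference memory in this scheme is exactly what the post says: the loop only works if the original being compared against is stable and well-known.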

A common dream that people have is of losing their teeth ( I've had it a few times). Most people, as kids, got a loose tooth, thought about losing it, then it fell out. This process repeated itself 20(?) times. So, they thought a lot about losing their teeth.

A very good reference memory for the memory encoding mechanism to copy into the simulation engine.

daylen
Posts: 2538
Joined: Wed Dec 16, 2015 4:17 am
Location: Lawrence, KS

Re: Consciousness as memory encoder

Post by daylen »

So then you have a naturally forming hierarchy of copies=fits, with the most reliable/original being near the top. Perhaps then there are multiple hierarchies (forming a holarchy) for various themes or frames across scale. Where something like the body is the frame for the experience/point of losing a tooth (i.e. being of higher entropy relative to the body during the event). Another common theme/frame could be a social setting where you are inappropriately nude, the experience/point being the embarrassment. As agents mature, they will tend to form many more of these fits/copies, with the more probable rising locally in the holarchy. Other fits will be faced and thus seem less punctual. Though, distant locations with vastly different frames/themes may rarely conflict with each other to a pathological degree. For instance, a severe form of anorexia could be caused by an inability to see the body as a holon, instead cross-correlating sections of the body with beauty culture at large. The sections of the body each being associated with points of imperfection (i.e. zits or moles), and the culture at large being associated with points of perfection (i.e. magazine models or flamboyant lifestyles). Such fits are attempting to actuate together while excluding the middle, and thus failing to account for bodily health as a whole.

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

I'm wondering if holarchy isn't more map than territory.

The mind seems to form a web of associations, and we tend to think about that messy web as a holarchy, when it may not be.

For instance, our mental concept of "foot" might be more closely associated in our minds with "shoe", "running", "kicking ass", but when we think about how we organize things, we automatically (unconscious competence) sort the messy web into holons. "Foot is a holon of the body, a toe is a holon of the foot", would be a temporary arrangement, caused by meta-thinking.

daylen
Posts: 2538
Joined: Wed Dec 16, 2015 4:17 am
Location: Lawrence, KS

Re: Consciousness as memory encoder

Post by daylen »

Ah, yes, so it would seem to me that the roots and fungal connections penetrating into unconscious competence are quite tangled, in a way that links shoes with running. The branches above, emerging from initialized fits/copies (attempts at containment), form the superficial forest of conscious competence reaching for/into conscious incompetence. Trees are dense with association and tracked via bifurcation, and the holarchy is an overlay of suspected/expected containers that attempt to keep these organized. Though, the mind tends towards breach of containment, whereby the messes fuel the underground plot to overthrow surface-level association.

jacob
Site Admin
Posts: 15980
Joined: Fri Jun 28, 2013 8:38 pm
Location: USA, Zone 5b, Koppen Dfa, Elev. 620ft, Walkscore 77
Contact:

Re: Consciousness as memory encoder

Post by jacob »

boomly wrote:
Mon Sep 13, 2021 11:51 am
I'm wondering if holarchy isn't more map than territory.
Given the focus of this thread, is there still a difference between the two? A holarchy is an organizing principle that defines a level according to the impedance with its neighbors at the floor above and the floor below. It's an object-oriented is-a organization. The internals of the previous floor don't matter---the previous floor becomes particles to the next, always.

It's thus a relative way of understanding things: Everything is defined locally. The sense-making is always local. E.g. atoms are defined in terms of nuclei and electrons. Molecules are defined in terms of electron clouds. Proteins are defined in terms of strings of molecules. Enzymes in terms of interactions between locations on those strings. Enzymes do not need to know atomic physics.

Consider another organizing principle which is that of quantum field theory. Here's a popsci version.
https://getpocket.com/explore/item/why- ... inevitable
The driving idea is that one (the human subject) asserts symmetry as a property of the universe. Very simple properties, such as the way one rotates observation in space (note this also asserts properties about spacetime), mean that particles can ONLY interact in certain ways. E.g. maybe there are fewer than 4 different ways to do this. It so happens that, in reality, particles really DO only interact in exactly these ways. This is NOT a holarchy-map. This is fundamental to the territory. The symmetry principle provides the key.

The mammalian brain seems to have evolved to recognize certain aspects of reality such as grouping, brightness, and size/distance relations. It's also good at reading expressions on other mammals (even babies can do this). Indeed, optical illusions screw with this pattern matching. It is particularly hard for the brain to see, e.g., both readings of the young woman/old woman illusion simultaneously. Not impossible, but hard. A brain that wasn't constantly trying to "get a lock" while being incapable of locking two targets would not be so confused.

Compare e.g. the ideas behind TWS (track-while-scan) and RWS (range-while-search) radar modes.

This does suggest that thinking beyond "naive realism" is pretty hard; that spontaneous human pattern-matching is limited. IQ g-factor seems to measure this ability specifically and directly.

Now a question to contemplate. How many patterns are there really [in the universe]? About 4 forces of nature? 100 mental models in the lattice work? To the "external collective" it looks like there's an s-curve effect? Is there really though?

jacob
Site Admin
Posts: 15980
Joined: Fri Jun 28, 2013 8:38 pm
Location: USA, Zone 5b, Koppen Dfa, Elev. 620ft, Walkscore 77
Contact:

Re: Consciousness as memory encoder

Post by jacob »

BTW one of the SD assertions is that the human brain innately possesses the ability to manifest all the colors. It's just that a particular human (internal individual) or culture (internal collective) has not been in an environment conducive to manifesting [the next color] (yet).

As such individual and collective evolution happens slowly and typically in a daisy-chained fashion (requiring transcending the previous level of complexity).

Another assertion I found fascinating was that Tier 2 repeats Tier 1, e.g. yellow is complex beige, turquoise is complex purple, coral would then be complex red, ...

This suggests "where to look" to see where coral is. The search for an aesthetic (which I consider blue) would happen after coral.

Maybe this should be its own thread.

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

jacob wrote:
Mon Sep 13, 2021 3:32 pm
The mammalian brain seems to have evolved to recognize certain aspects of reality such as grouping, brightness, and size/distance relations. It's also good at reading expressions on other mammals (even babies can do this). Indeed, optical illusions screw with this pattern matching.
There seems to be a bias in the psychological and cognitive sciences toward studying just visual information. Maybe that's because human intelligence is linked to a highly evolved visual center, and the people studying these things are quite intelligent and used to turning things over visually in their minds.

Some patterns that do not feel like part of some innate holarchy (to my mind, when I think of them):

Verbs, like the concept "to orbit" (I had to think a little bit to realize it was a verb).
Smells, like the "smell of movie popcorn".
Sounds, like the "sound of tires squealing".
Something like "magic". (I'm not sure how to classify the concept. If I thought about it a while, I could fit it into some organizational scheme, indicating that it's not innately in one now.)
Amorphous physical things, like "coffee".
Prepositional relationships, like "on top of".

But most visual patterns do seem to be organized into a holarchy in my mind. If I think of a car engine, its associations with both the constituent parts and the entire car are the closest associations that naturally come to mind.

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

jacob wrote:
Mon Sep 13, 2021 3:32 pm
To the "external collective" it looks like there's an s-curve effect? Is there really though?
Could you elaborate on this question? I'm not sure what you're asking. (actually, I don't have the foggiest clue.)

boomly
Posts: 36
Joined: Thu Mar 11, 2021 9:53 am

Re: Consciousness as memory encoder

Post by boomly »

I just realized we're talking about two slightly different things.

There's a difference between pattern creation and pattern organization. Pattern creation must necessarily be holarchical: the mind either has to integrate or differentiate previous patterns to create a new one.

But

It can then go on to forget the old constituent patterns, or the larger cloth the new pattern was cut from. And it can be organized in any number of ways.

An analogy: you cut a cookie out of a sheet of cookie dough. Or you take several smaller cookies and press them together into a cute figure. The creation has to be holarchical. But you can then throw away the sheet of cookie dough the cookie was cut from and associate the single cookie with a screwdriver. You can also completely forget the original constituent pieces of the figure cookie and consider it as only one thing.
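The cookie analogy can be put in a toy sketch (all names are mine, purely illustrative): creation necessarily records what a pattern was built from, but that provenance can later be discarded and the pattern re-associated with anything at all.

```python
# Toy sketch of the creation/organization distinction.
# Creation is holarchical: a new pattern is built from existing ones.
# Organization is free: the provenance can be dropped and the pattern
# re-associated arbitrarily. Names are illustrative only.

class Pattern:
    def __init__(self, name, parts=None):
        self.name = name
        self.parts = list(parts or [])   # holarchic provenance
        self.associations = set()        # free-form organization

def integrate(name, *parts):
    """Creation: the new pattern necessarily references its parts."""
    return Pattern(name, parts)

dough = Pattern("cookie dough")
cookie = integrate("cookie", dough)     # created holarchically

# Later, the mind may discard the provenance...
cookie.parts.clear()
# ...and organize the pattern in any arbitrary way.
cookie.associations.add("screwdriver")

print(cookie.parts)          # the holarchy of creation is forgotten
print(cookie.associations)   # the organization is now arbitrary
```

The asymmetry is the point: `integrate` cannot run without parts, but nothing forces the parts to be retained afterward.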

On some technical level, you may be able to argue that any organizational system is holarchic, even if the predominant way it's used is not obviously holarchic. Basically, what I'm saying is that it might be a mistake to use only this paradigm to look at the organization of the mind.

jacob
Site Admin
Posts: 15980
Joined: Fri Jun 28, 2013 8:38 pm
Location: USA, Zone 5b, Koppen Dfa, Elev. 620ft, Walkscore 77
Contact:

Re: Consciousness as memory encoder

Post by jacob »

boomly wrote:
Tue Sep 14, 2021 4:31 am
Could you elaborate on this question? I'm not sure what you're asking. (actually, I don't have the foggiest clue.)
The "collective external" constitutes the infosphere, and the infosphere seems to be limited in terms of the insights it can produce or has produced. For example, there are about 100 different "mental models" in the latticework which are replicated/discovered (often independently) in different fields of study. It's been like that for a while. Mentally we're supposedly not all that different from humans thousands of years ago. Most research/knowledge-work is filling out the details of very old ideas. We just do it faster/more efficiently.

If humanity has maxed out its ability to comprehend, does this mean that the world is simple enough for [at least some] humans to understand, or does this [failure to advance] imply there's more but it can never be understood?

Post Reply