BRUTE wrote:
in Dragline's esteemed opinion, if one were to realize that a system is complex and complicated and non-linear and fractal and all that, how would one go about inducing non-random change in that system? is there a methodology? if the consequences are unforeseeable, maybe a sort of metaphorical "wading in slowly", seeing if the resulting waves lead to what was hoped?
/Grammar nazi mode on/
These mathematical terms are not interchangeable and they actually mean something precise.
Complicated means that something consists of many connected parts. Complex means that the whole is greater than the sum of its parts and that it's not possible to understand the whole merely by understanding the parts in isolation.
Chili con carne or con beans is an example of something that's complicated but not very complex. You could take the ingredients individually, put them in your mouth, and chew them around (thus connecting the parts) and it would taste more or less the same. The whole ~ the sum of the parts.
Bread is an example of something that's complex but not very complicated. You can't take a spoonful of flour and a spoonful of water, put them in your mouth, chew them, and have it taste like bread.
One could define the field of engineering, in its most abstract form, as the practice of reducing complexities to complications.
Fractal simply means that the same pattern repeats at all scales of the system. That's probably not the case with charity. The pattern that determines the organization of soup kitchens is not the same as the pattern that guides the operation of an individual soup kitchen. On the other hand, prices from the stock market often display fractal patterns. This is because e.g. small day traders and large month-by-month institutional traders display many of the same trading habits but at different levels of scale.
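To make the scale idea concrete, here's a toy sketch in Python. A plain Gaussian random walk stands in for a price series (a deliberate simplification; real prices are only approximately like this): the spread of its increments grows like the square root of the stride, so zooming out by 100x and rescaling by 10x reproduces the same statistical pattern.

```python
import random

random.seed(0)

# Build a random walk: each step adds Gaussian noise to the running total.
walk = [0.0]
for _ in range(100_000):
    walk.append(walk[-1] + random.gauss(0, 1))

def spread(stride):
    """Standard deviation of the walk's increments measured over `stride` steps."""
    incs = [walk[i + stride] - walk[i] for i in range(0, len(walk) - stride, stride)]
    m = sum(incs) / len(incs)
    return (sum((x - m) ** 2 for x in incs) / len(incs)) ** 0.5

# Self-similarity: increments over 100 steps, shrunk by sqrt(100) = 10,
# have roughly the same spread as increments over 1 step.
print(spread(1), spread(100) / 10)  # roughly equal: same pattern at 100x scale
```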
Non-linear simply means that the system has a non-proportional response to the input: e.g. if I double the input, the output won't necessarily double.
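A one-line toy example (the quadratic response here is just an arbitrary stand-in for any non-linear system):

```python
# Toy non-linear system: the response is the square of the input.
def response(x):
    return x ** 2

# Doubling the input from 3 to 6 quadruples the output (9 -> 36),
# so the response is not proportional to the input.
print(response(3), response(6))  # → 9 36
```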
Random can be defined in many ways, but one of the more useful ones is that there's no pattern. I think that's better than saying that random=unpredictable because unpredictable might just be a personal failure to see the pattern. For example, the stock market is not random. If you look at charts for more than a couple of months, it will be pretty clear that markets tend to rise slowly and crash fast (so an upside down chart would look weird). That's a pattern.
Chaotic doesn't mean disordered as in the vernacular; it refers to systems that are highly sensitive to starting conditions. Chaotic systems can be completely deterministic and non-random, e.g. the three-body problem.
A complex adaptive system is a system of parts where the interaction of the parts creates a pattern that feeds back and interacts with the parts. It is the feedback of the pattern that makes the system complex (because the pattern arises from the interaction of the parts) and adaptive (because the parts respond to the holistic pattern). If the same/similar patterns feed back at several levels, the system is complex and adaptive as well as fractal. So the stock market as a whole is a fractal complex adaptive system because the patterns are similar at all sizes of traders (from bedroom traders to pension funds) and because individual traders respond to the patterns they see on their trading stations.
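Here's a minimal sketch of that feedback loop in Python. The herding rule and all the numbers are invented for illustration, not a model of any real market: each agent's action contributes to an aggregate pattern, and the aggregate feeds back into each agent's next action.

```python
import random

random.seed(1)

# Each agent holds an opinion: +1 (buy) or -1 (sell).
def step(opinions, herding=0.3):
    trend = sum(opinions) / len(opinions)        # the holistic pattern
    new = []
    for o in opinions:
        if random.random() < herding:
            new.append(1 if trend >= 0 else -1)  # respond to the pattern
        else:
            new.append(o)                        # keep the private opinion
    return new

agents = [random.choice([1, -1]) for _ in range(100)]
start_trend = sum(agents)
for _ in range(50):
    agents = step(agents)

# The pattern feeds back on the parts: whichever side was ahead gets
# amplified until the crowd converges on it.
print(start_trend, sum(agents))
```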
One interesting feature of complex adaptive systems is that you can often cut out a part of one, study that, and gain insight into how the larger system actually works. E.g. you can learn things about how a country functions by studying how a city functions.
/Grammar nazi mode off/
So down to it ...
Complex adaptive systems often display a series of intermediate equilibria along with a slow trend towards a final equilibrium. In ecology, the intermediate stages are called seres and the final stage is called the climax. Evolution has generated a system that's more or less self-managing. Finance and other modern (post-paleo) human inventions have not. This is probably because these systems are too complicated for human brains.
And here's the important point ..
The way to change or deal with a complex adaptive system is through micromanagement. That's pretty much the only methodology. You literally need to be aware of every single part of the system and its connections at all times. This allows you to make small changes to steer it (think cybernetics) along to where you want it to go. This way the system will not be unpredictable. A garden is a fine example of that (see above). You need to pull weeds, water, add fertilizer, etc. in order to keep the garden away from the ecology's natural state. If you do all that, controlling your entire system (e.g. you're in a grow house, so you know the boundaries, e.g. pests, temp, ...), your garden will be mostly predictable despite being complex and adaptive. (It's important to note that just because a system is complex and adaptive, it doesn't necessarily mean that it's random and/or chaotic.)
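The steering idea can be sketched as a trivial control loop (the garden numbers are made up and a real garden has many more coupled variables, so this is only the skeleton of the cybernetic point): drift constantly pulls the system away from where you want it, and small continual corrections keep it predictable anyway.

```python
# A watered garden as a control loop: evaporation constantly pulls soil
# moisture away from the target, and a small proportional correction
# ("observe, compare, adjust") steers it back every step.
def run(steps=100, target=50.0, gain=0.5, evaporation=2.0):
    moisture = 30.0
    for _ in range(steps):
        moisture -= evaporation                 # the system drifts
        moisture += gain * (target - moisture)  # small corrective nudge
    return moisture

print(run())  # settles at 48.0: close to the target, with a small steady error
```

With constant drift, a purely proportional correction always settles slightly below the target; the point is that constant small interventions make the outcome predictable, not that the controller is perfect.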
The more complicated the system, the greater the effort required. This is why you see ever more management the larger a system grows. At some point, the cost of management becomes too high and the artificial system collapses. One might wonder why humans alone seem to have such overly large brains, and also why human brains aren't even larger. The human brain alone consumes about 20% of a human's daily energy usage. It might be about the maximum of what the body can support without starving.
Another way to deal with a complex adaptive system is to "engineer" it. That's essentially the attempt to limit the degrees of freedom to those you're interested in in order to achieve your goal. That's also hard, and if you don't get it right, the system might blow up in your face, literally; see e.g. Chernobyl.
A smarter, but rather limited, way is simply to use what's already there and proven to work. In other words, you accept the natural equilibrium levels (seres) and try to design for the ones you like. Permaculture is a good example of this.
The general problem with complex adaptive systems is that they're a lot easier to create than they are to understand. The frequency of system failures is a good indicator that we're running into limits.