[There is a threshold of size, complexity, and dysfunction for societal systems beyond which incremental improvement of components (systems/holons) cannot transform the whole system into something significantly more functional (serving all components of the system well).]
Beyond a certain level of system dysfunction, no component can maintain knowledge of the functioning, and the plans for change, of all the other components related to it. A highly functional system with many components, all changing in harmony, requires seafing systems far more elaborate than any we have today.
Long-term evolution has tuned the different holons in our bodies (molecules, organelles, cells, tissues, organs, subsystems) to respond automatically to signals from elsewhere; these holons don't have to LEARN what to do. This is not the case for complex social/societal systems. Even if we had a powerful surveillance system covering every relevant variable in humankind, access to it for everyone, and the ability to learn to comprehend and act on it, sustainable and resilient social/societal systems would still be a challenge to create and maintain. Not impossible, but don't expect them simply to emerge without explicit design and experimentation, and participation by most of humankind.
Design is required if we are to act in accordance with Ashby's law of Requisite Variety, a fundamental law of cybernetics. These "laws" are as binding as the conservation laws of physics, yet human actors violate them every day.
Any successful improvement of a system component will disrupt the whole system. Today we hear about "positive disruption" leading to a cascade of improvements, with every improvement automatically tuning itself to the others. This is wishful, magical thinking. In a highly dysfunctional system, the disruption would most likely provoke efforts by other components to stop the improvement.
Cascades need to be part of an explicit strategy and designed/prepared for – including consideration of the viable-Seeds / fertile-Soils / nurturing-Scaffolding conceptual scheme.
Engineered disruption can be an effective technique in special circumstances. Disaster Capitalism is a negative exemplar. Disruption is a significant factor in the evolution/emergence of technology.
Dysfunctional systems cause components and subsystems to become more competitive and predatory toward each other; they also overstress the human members of these systems, indoctrinating them into a warped perspective of the whole system.
We must distinguish between how societal systems respond to disruptive events and how they may proactively strategize systemic change. Humans cooperating in recovery from disasters before authorities intervene and muddy the waters (A Paradise Built in Hell, Rebecca Solnit) are exemplars of powerful, native human competencies.
Designs might be created to orchestrate a slow and balanced transformation; but a dysfunctional system would not permit its population to learn to comprehend and act on such a plan.
Computer simulations of attempted system transformations could demonstrate this impossibility. Even if done, such "evidence" would not be sufficient to move the ruling elite. Yet such demonstrations would be useful for moving change agents to seek alternatives to transformational endgames.
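A purely illustrative toy model (not from the text; every name and parameter here is hypothetical) can sketch the pattern such simulations might reveal: when components are tightly coupled and system-wide resistance is high, incremental local improvements are cancelled and the whole degrades, whereas the same improvements accumulate in a healthy system.

```python
# Hypothetical toy model: n components ("holons"), each with a function
# level in [0, 1]. The dysfunction parameter sets how strongly the rest
# of the system resists any one component's incremental improvement.

def simulate(n_components=10, dysfunction=0.0, steps=200):
    f = [0.3] * n_components              # initial function level of each component
    for step in range(steps):
        i = step % n_components           # round-robin improvement attempts
        gain = 0.05                       # incremental local improvement
        pushback = dysfunction * gain     # resistance from the rest of the system
        f[i] = min(1.0, f[i] + gain - pushback)
        # resistance also drags the coupled components down slightly
        for j in range(n_components):
            if j != i:
                f[j] = max(0.0, f[j] - dysfunction * gain / n_components)
    return sum(f) / n_components          # whole-system function, 0..1

print(simulate(dysfunction=0.0))  # healthy system: local gains accumulate
print(simulate(dysfunction=1.0))  # dysfunctional system: gains cancelled, whole degrades
```

The model is deliberately crude; its point is only that a single coupling parameter can flip the outcome of identical improvement efforts, which is the kind of demonstration the paragraph imagines.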