This week, I was invited to give a three-minute flash talk at an event called "Human Development, Sustainability, and Agency," which was organized by IIASA (the International Institute for Applied Systems Analysis), the United Nations Development Programme (UNDP), and the Austrian Academy of Sciences (ÖAW). The event framed the release of a UNDP report called "Unsettled times, unsettled lives: shaping our future in a transforming world." It forms part of IIASA's "Transformations within Reach" (TwR) project, which looks for ways to transform societal decision-making systems and processes in order to facilitate the transition to sustainability.
You can find more information on our research project on agency and evolution here.
My flash talk was called "Beyond the Age of Machines." Because it was so short, I can share my full-length notes with you. Here we go:
"Hello everyone, and thank you for the opportunity to share a few of my ideas with you, which I hope illuminate the topic of agency, sustainability, and human development, and provide some inspiring food for thought. I am an evolutionary systems biologist and philosopher of science who studies organismic agency and its role in evolution, with a particular focus on evolutionary innovation and open-ended evolutionary dynamics.

I consider human agency and consciousness to be highly evolved expressions of a much broader basic ability of all living organisms to act on their own behalf. This kind of natural agency is rooted in the peculiar self-manufacturing organization of organisms, and in the consequences this organization has for how organisms interact with their environment (their agent-arena relationship). In particular, organisms distinguish themselves from non-living machines in that they can set and pursue their own intrinsic goals. This, in turn, enables living beings to realize what is relevant to them (and what is not) in the context of their specific experienced environment. Solving the problem of relevance is something a bacterium (or any other organism) can do, but that even our most sophisticated algorithms never will. This is why there will never be any artificial general intelligence (AGI) based on algorithmic computing. If AGI is ever created, it will come out of a biology lab (and will not be aligned with human interests), because general intelligence requires the ability to realize relevance.

And yet, we humans increasingly cede our agency and creativity to mindless algorithms that completely lack these properties. Artificial intelligence (AI) is a gross misnomer. It should be called algorithmic mimicry, the computational art of imitation. AI always gets its goals from an external agent (the programmer). It is instructed to absorb patterns from past human activities and to recombine them in sometimes novel and surprising ways.
The problem is that an increasing amount of digital data will be AI-generated in the near future (and it will become increasingly difficult to tell computer- and human-generated content apart), meaning that AI algorithms will increasingly be trained on their own output. This creates a vicious inward spiral which will soon become a substantial impediment to the continued evolution of human agency and creativity. It will be crucial to take early action to counteract this pernicious trend, through proper regulation and a change in the design of the interfaces that guide the interaction of human agents with non-agential algorithms. In summary, we need to relearn to treat our machines as what they are: tools to boost our own agency, not masters to which we delegate our creativity and ability to act. For continued sustainable human development, we must go beyond the age of machines. Thank you very much."
SOURCES and FURTHER READING:
"organisms act on their own behalf": Stuart Kauffman, Investigations, OUP 2000.
"the self-manufacturing organization of the organism": see, for example, Robert Rosen, Life Itself, Columbia Univ Press, 1991; Alvaro Moreno & Matteo Mossio, Biological Autonomy, Springer, 2015; Jan-Hendrik Hofmeyr, A biochemically-realisable relational model of the self-manufacturing cell, Biosystems 207: 104463, 2021.
"organismic agents and their environment": Denis Walsh, Organisms, Agency, and Evolution. CUP, 2015.
"the agent-arena relationship": a concept first introduced in John Vervaeke's "Awakening from the Meaning Crisis," and also discussed in this interesting dialogue.
"agency and its role in evolution": https://osf.io/2g7fh.
"agency and open-ended evolutionary dynamics": https://osf.io/yfmt3.
"organisms can set their own intrinsic goals": Daniel Nicholson, Organisms ≠ Machines. Stud Hist Phil Sci C 44: 669–78, 2013.
"to realize what is relevant": John Vervaeke, Timothy Lillicrap & Blake Richards, Relevance Realization and the Emerging Framework in Cognitive Science. J Log Comput 22: 79–99, 2012.
"solving the problem of relevance": see Stanford Encyclopedia of Philosophy, The Frame Problem.
"there will never be artificial general intelligence based on algorithmic computing": https://osf.io/yfmt3.
"we humans cede our agency": see The Social Dilemma.
Life beyond dogma!