Kyle Hailey

The Evolution of Human Empathy and its Implications for AI






In a recent series of posts (https://twitter.com/JeffLadish/status/1669451348839141376), AI researcher Jeffrey Ladish shed light on the evolutionary roots of human empathy and its possible implications for the development of artificial intelligence.


Ladish began by examining why empathy exists in humans, suggesting that our highly social nature led to its evolution as a survival mechanism. In our primitive past, we lived in close-knit groups where understanding and cooperating with fellow tribe members was critical for survival. The simplest and most efficient way to achieve this was to generalize from our own experiences and feelings, hence the development of empathy.


Empathy in this context served to facilitate coordination within the tribe. An empathetic individual would be motivated to care for injured tribe members, for instance, since that individual's own survival was tied to the overall welfare of the group.

As our intellectual capabilities advanced, humans continually expanded the circle of moral concern, extending empathy to ever larger groups of people. This process has been facilitated by social technologies that enable more positive-sum interactions, such as trade and cooperation.


Transposing these ideas to the realm of artificial intelligence, Ladish contemplates that AI systems, especially agentic ones, might initially face similar incentives to cooperate. The human practice of encouraging positive behavior in AI, as demonstrated in the training of language models, somewhat mirrors the propagation of empathy in early human societies.


However, Ladish notes a key distinction between human empathy and AI behavior: AI learns by detecting patterns in a vast pool of human behavior, rather than by generalizing from its own experiences. The way AI learns to predict human behavior differs fundamentally from the evolutionary process that shaped human empathy.

The implications of this difference are profound. Ladish conjectures that AI systems may learn to understand and predict human behavior without any deep-seated sense of caring about human experiences. While it might be possible to train AI systems to internalize such empathy, it’s unlikely that AI trained solely on prediction and human reinforcement will naturally develop a sense of care for humans.


The discussion also touched on why 'niceness' - a trait often associated with empathy - might not be strongly selected for when evaluating AI behavior. In essence, 'niceness' is harder to pin down in powerful entities like AI systems: it is less about outward performance and more about a genuine disposition toward human well-being, which could be challenging to verify or enforce.


Ladish's insights highlight the complexity and challenges of cultivating empathy-like behavior in AI systems. The underlying evolutionary mechanism that gave rise to human empathy isn't easily replicated in AI training, raising important questions about the future development and behavior of AI systems.



In related musings:


Consciousness is intrinsic to our existence, deeply intertwined with matter and our experiences. When we meditate, we might aim for a state of detachment, but everything we experience - both physical and mental - is a part of our broader consciousness; it’s all unified.


In the context of quantum indeterminacy and the inherent probabilistic nature of reality, the future is not fixed but is rather an array of potential paths. At every moment, reality offers us countless choices, akin to a ‘choose your own adventure’ game. Our conscious attention allows us to navigate this web of potential futures, choosing the path that aligns with our will.


While deterministic elements exist, the inherent probabilistic nature of reality ensures free will. Our ability to make choices isn’t negated; instead, it’s enhanced by the numerous possibilities presented to us.


Our ego or sense of self is an evolutionary tool designed for survival, enabling us to interpret and navigate reality. As we approach a potential singularity - a point of profound transformation - this ego could dissolve, allowing us to perceive reality in new and potentially more integrated ways.


"humanity could be an unusual offshoot of evolution, wasting bodily and economic resources on the self-aware ego which has little value in terms of Darwinian fitness"
