The past year has been the year of AI. We were blindsided by the release of radically improved chatbots driven by large language models, such as ChatGPT. Alongside the general excitement, negative reactions range from worries that these impressive computer programs will take our jobs to outright panic that AI will "hack our human operating system" to spread misinformation, endanger democracy, and wreak general societal havoc on its way to bringing about human extinction.
Yet it is this very mythologizing, bordering on reverence, that puts us most at risk. That’s because it measures the power of AI against the human brain using computer benchmarks - accuracy, processing efficiency, and bandwidth, for example. Here, AI will beat us every time. But the true gold standard, and the one that made us the remarkable species that created - and now must control - AI, is something computer programs will never have. The other “I”: emotional intelligence (EI).
EI is the ability to interpret and manage emotions in oneself and others and to apply that knowledge to achieve things in the world. Not only is EI essential for everything from problem-solving to cultivating relationships to coping with stress, it also irrevocably shaped human evolution. Starting with Darwin, we have understood that emotions support adaptation and survival because they have two characteristics: speed and sociality. As lightning-fast packets of information, emotions energize and direct our actions to help us survive and thrive in ways that slower decision-making processes cannot. Moreover, emotions are social signals that, from infancy, we prioritize to communicate and bond - in other words, to optimize access to our most valuable resource, social capital.
But emotions alone wouldn’t have advanced humans beyond our animal cousins if it weren’t for EI, which channels the superpowers of our disproportionately large prefrontal cortex: self-regulation, planning, value-based decisions, and other executive functions. EI is what allows us to take a raw emotion like rage and transform it into the moral outrage needed to fight injustice. EI applied to our emotions is like human tool use applied to the wild, unpredictable world – it tames and leverages.
The best way to make smart decisions about AI is to focus less on our fears and fantasies and more on what we want AI to do in service of EI. Consider, for example, three aspects of EI: authenticity, intuition, and collaboration.
One marker for EI is authenticity, the feeling of being in the presence of integrity, reliability, and genuineness. As ChatGPT churns out stories ranging from deadly dull to weirdly hallucinatory, we automatically understand that we have entered an uncanny valley lacking in authenticity. It’s why, although deepfake celebrities aren’t going away, the tech that really moves us is jaw-dropping live experiences like immersive shows at the Sphere or the virtual reality of ABBA Voyage, created by and for humans. AI will surely be leveraged to produce repetitive, uninspired creations for mass consumption, but the need for human creators who develop authentic works and wield AI as an aesthetic tool will only grow, as will calls for data dignity, or remuneration for the original creations fed into AI models. AI will eventually replace some artists and creators. Our desire for authenticity, however, means that as AI products become increasingly commoditized, human-created works will become more dearly prized.
AI should support another key aspect of EI: intuition. Discovery doesn’t emerge simply from the accumulation of data or the identification of patterns and connections. It emerges from human understanding, insight, and most of all intuition. Intuition is our ability to follow logic but leap ahead a few steps. It’s grappling with a question, going to bed, and waking up the next morning with the answer. Humans should always be placed at the helm of discovery efforts – scientific and humanistic - and AI tools developed to collate information, find novel associations, run simulations, and then feed it all back to humans to fuel our intuitive insights.
Above all, AI must support the bedrock of human well-being: social connection. Without close relationships, we suffer damaging loneliness. Without empathic ties, we find it almost impossible to survive the depths of human suffering or scale the heights of joy. AI must amplify human connection, but should never seek to replace it. An AI chatbot for mental health, for example, can provide therapeutic access to well-curated, personalized information, but designing AI to simulate human care risks deepening pre-existing social and emotional problems. Moreover, AI simulation of empathy won’t support human connection, because the true benefits of empathy lie in the giving rather than the receiving. Finally, while acknowledging that there are potentially fruitful applications of social AI for neurodivergent individuals, the use of this highly unreliable and inaccurate technology among children and other vulnerable populations is of immense ethical concern.
Albert Einstein is often quoted as saying, “The intuitive mind is a sacred gift. The rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.” We must remember: AI is the servant. EI is the gift. We honor that gift by shifting the focus to how AI can help us be more fully, gloriously human.