In a recent conversation on The Daily Show, Jon Stewart sat down with Tristan Harris, co-founder of the Center for Humane Technology, to discuss the profound implications of artificial intelligence for human society. Their discussion was not merely about the capabilities of AI but about its ethical ramifications and the choices we face as a human community.

Harris highlighted a critical concern: the rapid advancement of AI technologies is outpacing our ability to regulate and understand them. He pointed out that current iterations of AI have already disrupted the workforce, with entry-level jobs being replaced at an alarming rate. This isn’t just a technological shift; it’s a societal one. The very fabric of our communities is being altered as people find themselves displaced, not by their own actions, but by systems they have little control over… and AI is concentrating wealth in the hands of fewer and fewer people.
Maybe what struck me most was Harris’s assertion that the tech industry’s rush to be first to market often comes at the expense of safety and ethical considerations. In our pursuit of innovation, we may be sacrificing the well-being of individuals and even human survival. This isn’t just about technology; it’s about the kind of society we want to build… and it transcends geopolitical borders. AI may be like “alien” invaders… except we created it. Are we prioritizing progress over people? Are we allowing profit motives to dictate the direction of our collective future? We seem to need a clearer imagination of a better world for everyone and everything.
As an educator, theologian, and one who cares about G-d’s Shalom, these questions resonate deeply with me. The narratives we construct, the stories we tell, and the values we impart to the next generation are all influenced by the tools we create and the systems we put in place. If we are not intentional about the ethical frameworks guiding our technological advancements, we risk perpetuating inequalities and reinforcing systems that harm rather than heal.
The conversation between Stewart and Harris serves as a poignant reminder that technology is not neutral. It is shaped by human hands, guided by human values, and it impacts human lives and livelihoods. Whether we are standing at a crossroads or in a crisis, we must ask ourselves: What kind of world are we co-creating? And who gets to decide? The 1%? A handful of tech moguls? The market? The government (which one)? Will technology serve humanity, or will humanity serve technology?
Peace, dwight
