Open Source vs Commercial: How "Winning Culture" Has Made Us More Vulnerable | Greg Epstein

July 8, 2025

Silicon Valley's shift from collaborative open-source principles to winner-take-all commercial dominance hasn't just changed business models—it's made us fundamentally more vulnerable. When companies prioritize winning everything over building secure, collaborative ecosystems, we all pay the price. But there's a profound irony: the more desperately these leaders chase absolute victory, the more they reveal themselves as losers of the most important game—building meaningful human communities.

In this episode, Harvard and MIT Humanist Chaplain Greg Epstein explores how tech's false prophets have led us astray and, more importantly, how we might find our way back to building human-centered security that actually works. The strongest security has never come from building higher walls—it comes from creating ecosystems where everyone's success strengthens the whole. When we understand how to work together better, we all create better security.

What You'll Learn:
• How winner-take-all thinking creates systemic vulnerabilities
• Why collaborative open-source principles build more resilient systems
• The hidden security costs of commercial dominance
• Practical strategies for building multi-stakeholder security
• How to shift from competition to collaboration in your organization

Watch this episode to discover how changing your approach to teamwork and partnerships can dramatically improve your security posture.

About Greg M. Epstein: Greg serves as Humanist Chaplain at Harvard University and MIT, and spent 18 months at TechCrunch exploring the ethics of companies shifting our definition of humanity. He's the author of "Tech Agnostic: How Technology Became the World's Most Powerful Religion, and Why it Desperately Needs a Reformation."

Timestamps: 

00:00 The Corruption of Winning Culture
02:39 The Role of Community in Security
05:44 Navigating the Media Landscape
08:20 The Algorithmic Influence on Information
11:01 The Cult of Personality in Tech
13:44 The Messianic Figures in Technology
16:24 The Fall of Tech Prophets
19:15 The Importance of Losing
21:44 The Future of Technology and Humanity
24:29 The Need for Ethical Technology
26:56 The Role of Men in Modern Society
29:39 The Impact of AI on Society
32:15 The Cult-like Nature of Tech Culture
34:54 The Importance of Human Connection
37:43 The Future of Humanism in Tech
40:11 The Path Forward for Technology and Humanity
 
 

#TechEthics #CommunityBuilding #DigitalSecurity #TechCulture #HumanistChaplain #SiliconValley #TechReformation

Full Transcript

 

Justin Beals: Hello, everyone, and welcome to SecureTalk. I'm your host, Justin Beals. 

We live in a culture that has become dangerously obsessed with winning. Not the healthy competition that drives innovation and lifts entire communities, but a toxic zero-sum mentality where success means dominating everything and everyone around you. 

This corruption of winning culture has infected our most powerful institutions, turning Silicon Valley leaders from community builders into digital conquistadors who view human impact as inconsequential compared to the scale of their economic kingdoms. Listen to today's tech titans and you'll hear language that would make a Roman emperor blush. 

They speak of winning at everything, of accumulating trillions, not just in dollars, but in digital beings they'll create to serve their vision of the future.

They've abandoned even the pretense of making the world better, instead embracing a masculine bravado that confuses dominance with strength and conquest with achievement. But there's a profound irony here. The more desperately these leaders chase absolute victory, the more they reveal themselves as losers of the most important game: the one where we build meaningful human communities together. 

Real security, the kind that has sustained civilizations for millennia, has never come from individual dominance. It comes from creating conditions where your neighbor's prosperity is tied to your own, where shared success builds the kind of trust that makes external defenses largely unnecessary.

This isn't naive idealism, it's practical wisdom. The locksmith, the police officer, and the cybersecurity professional all exist because we sometimes need external protection. But the communities that thrive over generations are those where people know their fates are interconnected, where helping others succeed creates a web of mutual support far more resilient than any wall or firewall.

Today's conversation explores this fundamental tension between the corruption of winning culture and the path to true security through community. We'll examine how tech's false prophets have led us astray and, more importantly, how we might find our way back to building the kind of human-centered security that actually works. 

Our guest brings a unique perspective to this challenge. Someone who has spent decades thinking about what gives life meaning while also working at the heart of Silicon Valley's most influential institutions. Greg Epstein serves as the humanist chaplain at Harvard University and also serves the Massachusetts Institute of Technology as humanist chaplain.

For nearly two decades, he's built a unique career as one of the world's most prominent humanist chaplains, professionally trained members of the clergy who support ethical and communal lives of non-religious people. More recently, Greg's 2018 move to join MIT, in addition to his work at Harvard, inspired an 18-month residency at the leading Silicon Valley publication, TechCrunch, in which he published nearly 40 in-depth pieces exploring the ethics of technologies and companies that are shifting our definition of what it means to be human, often in troubling ways.

Greg's book, “Tech Agnostic: How Technology Became the World's Most Powerful Religion, and Why It Desperately Needs a Reformation,” was one of my personal favorite reads of last year. In 2005, Greg received ordination as a humanist rabbi from the International Institute for Secular Humanistic Judaism. He holds a BA in religion and Chinese and an MA in Judaic studies from the University of Michigan, Ann Arbor, and a Master of Theological Studies from Harvard Divinity School. And he completed a year-long graduate fellowship at the Hebrew University of Jerusalem. Please join me in welcoming back to SecureTalk, Greg Epstein.

—----

Justin Beals:  Greg, welcome back to SecureTalk. We're really grateful to have you join us again.

Greg M. Epstein:  Great to be here. Thanks for having me back. Yes, this is one of the first times I'm doing a follow-up talk from interviews I conducted when my book was released. So glad to be here.

Justin Beals: Yeah, it's been nine months to a year since we last got a chance to chat. You've been on the road doing a lot of touring for Tech Agnostic, one of our favorite books. Have you survived the road tour okay?

Greg M. Epstein: It's interesting, the media has been basically destroyed in the last few years. And so talking about a book in general now felt very different to me than with my previous book, which came out a little over a decade ago now in a very, very different media landscape. And so there's just a lot of different conversations that you have to have now when you're talking about a book. 

And I wonder if that doesn't speak to some of our own, some of the confusion that all of us feel, like what does it even look like to take in the world to try to be knowledgeable about it, to understand the world that we're living in. 

When the media is so diffuse, it's either like a death by a thousand cuts, like you can try to subscribe to a hundred newspapers, you know, follow a billion podcasts, or what you can then end up doing if not that, you know, unless you're like, let's say you can also be like a New York Times devotee or whatever, right? Where you're just like, I'll read everything there. And that's pretty much all I'll read, which is like my mom. 

But not too many people that I know these days are doing that, right? My mom will still send me clippings from the New York Times physical paper. But then if not one of those two options and you actually care and you wanna be informed, you're basically letting the algorithm guide you, right? 

You just sort of go online, and you allow the spirit of the algorithm to move you and to send you its blessing in the form of whatever news story, you know, it thinks that the world has cooked up just for you. 

And that's a pretty scary place to be. You know, I was driving with my family the other day and we're listening to some music. We put on something that one of my kids requested. My kids love rock music. And then, my wife and I decide we wanna skip forward a couple of songs. And my son says, No, no, let the algorithm do its thing. The algorithm is good, it's good. And it's like, that's a scary thing to hear. But that's the position that we're in. The big takeaway that I have from getting out there and promoting a book that's about algorithms is the extent to which we're all deep, deep, deep in their waters right now.

Justin Beals:  It does feel flipped a little bit, like I'm constantly reading how to impact an algorithm or to, you know, just from a marketing perspective, be seen essentially through the lens of the internet. And I think you're right, like it's become so easy to put information on the internet that we can't consume it all, you know, all the barriers to distribution have kind of fallen away, and now the landscape is wide open. A little terrifying because we're hiring algorithms to filter that for us at the end of the day.


Greg M. Epstein: Right, you know, one of the things that I get the sense of is that any kind of virality at this point is highly manipulable, right? So, you know, one of the things that I've been told as an author, as somebody who will occasionally go out there and try to put my own voice out there... like I used to be the kind of person, again, before algorithms, where I was just wanting my ideas to be heard and seen and consumed. Whereas now I find myself to be much more of an introvert, where it's like, yeah, I'll have a meaningful deep conversation with you, but after that, can I like log off and not have to think about whether or not people have heard of me? 


And there's this sense that, sure, something will go viral, but how do you know that those people didn't just pay massively for that, right? Even if it's a mainstream media story, you know, how do you know that really expensive publicists weren't involved deeply in that? If it's something that has a billion clicks or a ton of followers, like, how do I really know that those weren't paid for by a click farm? You don't.


You know, it's like you can kind of trust that something is good, but it's very difficult now. And again, the thing that I'm talking about there, about, like, trusting the algorithm, part of it is like, you know, learning to understand the algorithms or whatever, but it's also part of the sense that people are starting, from my perspective, to adopt more and more, which is just like, I don't know, I don't know what I should be consuming. I don't know what I should be following. There's so much out there. So I'm just going to trust that the algorithm is going to show me the way. 


And if I'm sounding like I'm talking about God and spirituality, that's very much intentional here. It's like people have in the past reduced their stress, reduced their cognitive load, if you will, by trusting that, say, Jesus would show them the way, right? That the world would be... you know, this big stressful world with, you know, crime, injustice, scarce resources, whatever, right? And it's like, Jesus, show me the way. And now, you know, we have some of the same kinds of things, but it's like: just, algorithm, show me the way, show me what I need to know, show me where I need to be, show me who I need to talk to, show me who I need to listen to, because I can't figure it out myself and I need to surrender to it. And surrender, I'm drawing that term from Islam, of course.


Justin Beals: Yeah, it feels like a shift from a cult of personality scenario, where you're going to follow a prophet or follow an icon, to literally, you know, allowing the filtering of your worldview through some form of archetype. And we have a little bit of brand affinity, like Twitter is more my brand, Facebook is more my brand. You know, but fundamentally, you're subscribing to this algorithmic filter as opposed to an editorial board, which feels like what we used to have at something like the New York Times.

Greg M. Epstein: Yeah, I mean, even with Facebook, right? So you would kind of settle into that. This is 10, 15 plus years ago now. I mean, it's a generation ago now. For those of us who are kind of old like me, I'm late Gen X, just before the millennials, but it's enough to feel old these days, anything above being in college yourself.

You kind of, you're kind of looking at this stuff bewildered, but like Facebook, you know, you would make that your digital home. And they at least had to purport to sort of care about you as the user, right? This is something that I talked about a lot in the early publicity for Tech Agnostic, the idea that when I was a congregational organizer, I wanted to bring the world closer together for non-religious people and allies, but that wasn't my phrase, that was Mark Zuckerberg's phrase, right? 

That they really, they had to act like congregational or community organizers in this way that really struck me as somebody who was literally doing that. It was like, whoa, Zuckerberg and his whole crew, they're using the same language that I'm using. What's up with that? Whereas now, I mean, it's just sort of transparently like you do not hear that from them. There is no such effort. There is no interest in that whatsoever. You go on Facebook, I keep mine open because it's like, it's just sort of interesting to see what it's become. It's just this algorithmic slop.


You know, it's just like, we're gonna show you a video and we're pretty sure you're gonna like this video and we're just gonna show you more and more of it in the hopes that you'll spend time here so that we can sell advertising to you. And you know, it's kind of crass, like that's what we're doing here until we create God, in which case, you know, at that point we'll inform you that we've created him and then you can begin to worship.

I mean, that's all that site is good for now. It's basically, you know, it's still like a, a trillion-dollar site or half a trillion-dollar site or whatever. And it's just nothing real is happening there.

Justin Beals:  I mean, the snake is eating its tail. It's such a bizarre story when I think about Facebook and them getting caught essentially inventing individuals on their platform that are AIs that behave like people you would interact with, to keep you clicking and interacting with them. You know, Facebook used to have a concept where their greatest value was the fact that they had real people connected.

And they've jumped the shark so hard that they had to invent people to keep the cycle of attention gathering going.

Greg M. Epstein: Yeah, yeah, of course, there was the scandal, and I can't even remember all the details. There's just been too many scandals, Justin, but this was the one where they had the black person that they had invented, but they didn't actually consult any black people in the creation of this person. It was just like, yeah, we just kind of made up what we think somebody would act like, right? Because we know. And yeah, I mean, it's the thing that really strikes me, and again, I can't, I'm a chaplain in my day job at Harvard and MIT, and I have been at Harvard for 21 years now, and I've been serving non-religious people, of which I am one myself, so it's sort of a unique career. 

And one of the things about a chaplain is that specific people can come and talk to me, and I'm not allowed to pass on any identifying details from those conversations whatsoever, but I can give you sort of general impressions from over 21 years. And one of the general impressions that I have of like a lot of the people who've gone into, you know, that company in particular, companies like it, and with real notable exceptions, it's like there were a lot of people that went into that field of creating social media for billions of dollars that just, they themselves really struggled to relate to other human beings. They themselves really struggled to feel any kind of compassion or warm feeling, you know, at the end of the day in the sort of peaceful quiet of their own room or their own home. Any sort of love for themselves and therefore for most other people.

And so, you know, the best that they could do in many cases is just this feeling of like, what I can do is win. I know how to win. You know, as a kid, I was taught to win and I did it. And sometimes I didn't do it, but at other times, most often when it was like nerdy contexts, I figured out how to win things. And that at least gave me a kind of simulacrum of feeling good. It gave me the feeling of being respected, perhaps, or valued by my parents, my teachers, my peers, whatever, like, hey, there goes a winner. And so I just want to replicate that feeling again and again throughout my life, because I don't have any confidence that just warmth and self-sacrifice and loving vulnerably is going to get me anywhere.

So I'm just going to create companies that, like, maybe again provide this sort of simulacrum of human connection. But what it has turned out to be all about is like a replication machine for, you know, for the leader or leaders of the company to try to 3D print more winning in their life, to try to 3D print more of being impressive to other people, and more money, of course. And it's just very, very scary that what we did in the early years is we converted to their religion. And so now we're stuck in this place where even though it's not a religion that many of us feel particularly comfortable with being believers in, we're in this place where when you speak about that religion, we now have to say we, including ourselves, right?

Justin Beals:  Yeah, yeah, I mean, I participate in it. Yeah.

Greg M. Epstein: Like, I'm not religiously Jewish, but I was raised Jewish. And, you know, in the sense that my parents were both culturally Jewish and their families and their... So like, when you talk about Jewish people, I may not like everything that's been done in the name of my own people, or certainly I may not like all the things that have been done to my people. But when you talk about those people, and I talk about them, I have to use the word we. 

Well, now we all converted so quickly to the tech religion, to the digital religion, that whenever we talk about tech believers and people who are worshipping at these tech altars, like we have to use the word we. I mean, you can't separate yourself from it anymore. We're in too deep.

Justin Beals:  Yeah, I mean, we all work for it in a way. In promoting your book, you had to figure out how to harness it. Yeah, and I certainly have been building a company in the venture capital backed model. You know, our point, our focus is to generate economic value, of course. But I think what's so frustrating in that winning attitude, that constant winning attitude, Greg, is that when we only worship winning, we lose sight of the actual value of competition, which is that we have a foundation, like a cultural foundation to work from. And then competition is there to allow us to improve everyone's lives, like together, you know, not only one. 

Like we celebrate the great athletes who have struggled hard and find inspiration in it, especially when they are gracious, you know, acknowledging that community lifted them to the achievement. But the tech entrepreneur attitude of I built this is just wrong. I struggle with it.

Greg M. Epstein: Yeah, I mean, it's, so when I talk with my kids, I really enjoy the idea of just telling them like, yeah, you win, I lose. And I like to take a pride, a perverse pride, in that. It's really nice to hear them say, “I win, I beat you,” and I, you know, having more years of experience, get to smile and say, “you win, I lose,” because it's just so counterintuitive. It's like, there's a real deliciousness to the idea of, “hell no, I'm not gonna try to win everything in my life.” You know, I am old enough to know that that's toxic, that that is dangerous, that, you know, the minute I sign up for I need to win everything, I need to score every point, I need to, you know, receive every investment and make every profit, I'm slowly but surely making myself into a supervillain. 

And so, you know, the best moments in many ways that I've had in the past several months have been these moments of sort of learning to enjoy the parts of life where, you know, anything that you try to do to advance anything that you're trying to advance, you're going to have lots of moments of failure and loss and whatever. And yeah, to enjoy that, to be like, “Uh-huh. Yeah, you know, didn't win that, but had fun doing it.”

And I'm not saying that, you know, every moment has been like that. I think I'm aware of that because I'm also aware of the pain of, like, hey, in my case, it would be like, hey, you did an article on, like, religion in the tech world and, you know, could you have thought to call me? Which I've had to think about with something that I've seen in the New York Times or the Atlantic or whatever, you know, from time to time.

But it's still like, yeah, it's this feeling of like, whatever it is that you do, you're gonna have moments where you're not going to feel like it's your day. And learning how to take joy in that, learning how to, like, smile and be like, yeah, that just gave me another opportunity to prove that I'm still alive and I'm worth more than just this victory, is like one of the things that we most desperately all need these days.

Justin Beals:  One of the things that I wanted to catch up with you on, because I find the concept of religion, in a way, to be very dramatic: it has, over human history, lots of tropes and cycles that we see a lot. And I feel like one of the prophets of the tech world, Elon Musk, has especially been in this cycle, completely the fallen prophet story, should we say. And of course, it's tied up in a lot of drama. But I'm curious about, you know, when your spiritual leaders fail you, so to speak. What's the moment of self-reflection? Where's the guidance opportunity for us to improve?

Greg M. Epstein: You know, what a story that guy is. 

Justin Beals: I know, right? Yeah.

Greg M. Epstein: And, you know, talk about somebody who ultimately has allowed his own life to be swept up in the algorithm of things. You know, that he, you know, he too has just been guided, I think, constantly over decades of his life by, me, please let me do whatever I might do to be maximally a winner.

And of course, the painful irony is that like, you can try to win a specific thing, but if you just want to be a winner at all things, in all places, at all times, in all of life, the more that that's your personality, the more you become what most people would consider to be a giant loser. And so, you know, again, one has to really learn how to embrace losing and be joyful about it.

But anyway, so I do this thing in Tech Agnostic, which you'll recall, but I'll remind listeners of it, where I have this big, 10-section chapter towards the beginning of the book. It's chapter two of the book and it's on tech doctrines, right? Each of the chapters of the book is a comparison between a different aspect of tech and a different aspect of religion, and this is doctrine meaning, like, the specific beliefs of the tech world, you know, like you've got your hell, your chosenness, that sort of thing, right?

And the way that I set the chapter up is with this character from Jewish history, whose name is Shabtai Tzvi. He's a real historical character from the 1600s. And he was the greatest false messiah in recorded history, or at least since the dawning of the sort of messianic religious age, where, you know, you've got your Judeo-Christianity dominated by ideas of a messiah to come and redeem the world, and where these religions dominate at least the Western world, right? 

And so you've got this guy, Shabtai Tzvi, who I talk about at the beginning of the doctrine section of my book, where he is able to convince the majority of the Jewish world in the 1600s that he is the Messiah. And people have such a conviction in this. 

He's become so influential that you've got Jewish communities at that point all over the world who are communicating with one another through travel, but also through the use of Hebrew as a universal language across other linguistic boundaries, which there were a lot back then. And he's able to convince people from across the world at that point that he is the Messiah so much so that the most important holiday on the Jewish calendar in terms of religion, spirituality, whatever, is Yom Kippur, where people fast for 24, 25 hours or so, no food, no water for the coming of the Messiah, essentially. And at that time, for a year or two, most Jews stopped fasting and started throwing feasts to celebrate this Messiah.

But then, of course, what happens is the Sultan of the Ottoman Empire finds out, and, you know, sultans don't like it when there's another Messiah besides them, or one that they don't control. And so, you know, he calls Shabtai Tzvi in and says, so you're a Messiah, huh? You're actually just going to convert or die. So which one is it going to be? And, you know, he didn't die at that point. So, you get the idea. So, the point is like, are we having a messianic moment for Musk now, where, like, you know, you do enough of sort of empire building... you know, he had this moment. This was just a few months ago, right, Justin? Can you believe it?

Justin Beals: Just a few months ago.

Greg M. Epstein: Just several weeks ago, I don't know the date, you can find the date and put it on there, but Donald Trump assembles his first cabinet meeting and Musk will not sit down or shut up. And he's taken over the government of the United States of America at that point. I mean, he really had taken it over. The cabinet had been assembled and Donald Trump, the president of the United States himself, could not get Elon Musk to either sit down or shut up. And now, as we speak in early June, he's essentially been banished. And he had his kind of... the British comedian Eddie Izzard has this famous routine about cake or death. Would you like cake or death? And he was finally given his cake or death moment.

Would you like to continue being a wannabe trillionaire? In which case you'll need to get out of here, or not. And it's really like the question that I ask in the book is, how does one know that a Messiah is real? Because in a moment when you're engaging with messianic ideas, they'll inevitably be presented in a way that will be convincing to a lot of people, right? I mean, if it's not convincing to a lot of people, then it's just one of those false Messiahs that just doesn't go very far, of which there have been many in history. And they even date back... we see them in the Dead Sea Scrolls, which are scrolls that were discovered in caves in Palestine in 1947. 

But we have scrolls from thousands of years ago where there were a lot of other messiahs besides Jesus that people were talking about back then, and you just never hear of them. But there are a lot that you do hear about, and it's like, how do you know that that thing is false? Now it's AGI. AGI is coming, the singularity is coming. It's gonna come later this year. It's gonna come next year.

Kevin Roose is writing about it in the New York Times, right? Like, hey, I've been testing this stuff out and, you know, maybe I'm wrong, but I think it's gonna be later this year or next year that artificial general intelligence just shows up. And like, okay, Kevin Roose is a New York Times staff writer and I'm not. So, you know, of course he's right, right? But like, is he? How do you know?

Justin Beals:  Well, you know, back to the fracturing of our ability to bring good media together. I'll tell you something that I fundamentally think about consciousness and how human beings engage, when we think about AGI: the large language models that we play with today seem to know so much and be so real. You know, the thing we say has a spirit, or that we follow, or that we're putting all these names to, it is a purely systems-to-feeling thing. It feels that you would like it if it said these types of things. Like, if I have to anthropomorphize what is inevitably ones and zeros in a big database. And our interface of language, so much of it is online, has now been hacked by a machine that we can put in the middle.

But there's no intelligence as we would describe it, even from the perspective of like a fly or a dog on the backside. It could act like a chess champion, but in depth, it doesn't carry the same characteristics, because it's the wrong kind of body. Its body of data is to be a chess champion alone. Yeah.

Greg M. Epstein: Yeah. I mean, I think one of the really big problems with this is that along with the algorithmification of our entire society, right? Which is, I mean, really, it really is like a massive cult takeover, okay? I'm not fucking around when I say that. You know, I've been accused a little bit online, like... there was a religious scholar, a scholar of cults, who was like, don't use the drinking-the-Kool-Aid metaphor, because, you know, you have to understand that the people who drank the Kool-Aid, that's like a real political and social example. And, you know, there were specific details about that, you know, that don't translate to what you're talking about. 

So don't use that example, because for those of us who actually study what happened when people drank the Kool-Aid in Jonestown, what you're talking about isn't that. It's like, okay, sure. But by that logic, we can't make any historical comparisons whatsoever. I mean, in the sense that I get that there are certain historical comparisons that aren't valid.

But this one, I'm saying, like, people have come to very strongly held beliefs about what it is to be human and what it is not to be human in a very short period of time. And they've come to those beliefs in a way that has been heavily manipulated by very charismatic leaders with very strong ideological content to their remarks, their whole message, despite the fact that they sort of present themselves as not having such a thing. Just like a cult leader wants to come across as, like, just a regular guy, you know, but, you know, follow me, follow me, right? You know, you don't come out and say, “hey, I'm a cult leader,” because then the people who follow you fire you, right? But if you come across and they say, like, oh, you know, that guy, he's such a regular guy, we've got to do what he says because his message is so important... that's cult leadership, right?

So, in that kind of society, it has become a lot more rare to have meaningful back-and-forth regular conversation with other humans in which we do some of the things that are not just core to what humans do when we talk to one another, but best about what humans do when we talk to one another. So, like, hey, how are you? And when I say how are you, I actually have time and space to hear about that for, like, an hour or more. Not just like, hey, how are you doing, you know, as you pass somebody on the street, right? But, like, the ability and dedication that people have to sitting with one another and really talking about how you feel, what your life is like, what you want your life to be like, what you fear, what you're angry about, what you're sad about, what brings you joy, what you're wondrous about. We don't have many of these conversations anymore. And so instead, you have these very algorithmic kinds of conversations.

And so therefore, you know, it becomes much more plausible that the interactions we're having with the digital devices and data centers are real human interactions, because at the same time that we're having more and more of those digital interactions, we're having fewer and fewer human interactions. And so we just don't know what we're missing.

Justin Beals: Yeah, yeah, I certainly deeply value the opportunity to meet with our colleagues. And I do think some of this is post-COVID stress in a way, right? Like we got shattered, and we reached out to tech as a method to connect, and in between us was a corporate interlocutor, right?

Someone that designed an algorithm for an economic outcome. I see a place for all these tools; they can be very powerful. There's a lot of opportunity, but along the way we quit thinking about how we do good with them. We didn't make it a part of our cultural download in the tech industry anymore. I mean, as recently as 18 months ago, you couldn't do a startup unless there was a cultural-good story about how you were going to make the world a better place. Not just valuation.

Greg M. Epstein: You're saying that from your perspective as somebody that watches this world, that even that, even the pretense has gone by the wayside.

Justin Beals: Yeah, I mean, when Musk or when Zuckerberg got on Rogan's podcast and started talking about how they needed a more masculine company, I was like, how stupid is this? Like what brain programming are you trying to buy into? What happened with just building a great organization that allowed people to talk from around the world that couldn't communicate in the past? That was good enough.

Greg M. Epstein: Yeah, I mean, I think if I could try to translate Zuckerberg there a little bit, there's something about that conversation that's interesting to me. I mean, there's very little about Zuckerberg's actual work that's interesting to me. I really don't think that he makes a single product that I'm interested in or that is good for the world. I really don't. But I will say this: in this country and in sort of Western society, Western being a catch-all term that doesn't really mean anything, of course. By the way, "Western" implies a kind of God, in the sense that only if there's a God looking at the universe and looking at the earth from the perspective of the sun, or wherever, would that divinity decide, that's the western side of that planet and that's the eastern side, right?

Justin Beals: So you have to create a perspective, something outside of us that's greater than us, to define something with those boundaries.

Greg M. Epstein: But anyway, as a humanist, none of that language makes any sense to me, but it's what we've got to work with. So anyway, Western society has really screwed up gender very, very badly in the last couple hundred, few hundred years. We created so many hierarchies of power, so many ways of oppressing and being oppressed, that all come down to: you're a certain kind of man, and so your role is to either oppress or be oppressed. And obviously it's been doom for the people who've been oppressed by those hierarchies. But it's also been really, really lousy and constricting in sort of low-key or harder-to-identify ways, even for the people who've been doing the oppressing. It's been extraordinarily corrosive to their, our, sense of ourselves as anything other than machines that are built to be strong and invulnerable and oppressive to others.

That's how we built our entire sense of ourselves. And so when that's all you've got to have as your sense of yourself, your life is very constricted and limiting, and you can't really step out of that box, lest you be labeled, you know, in a homophobic way, in a racist way, in a whatever kind of way. Right? And so, in the last few decades, there's actually been some meaningful success in pushing back against those hierarchies: in empowering women, in empowering non-white men, genderqueer people, gender non-binary people, whatever. And it's been in many ways a real success story.

But we've managed, of course, to do it in a way where we haven't necessarily been so great about envisioning what the future looks like for the half of the world that is going to have to reinvent itself, right? Like, we're empowering women, which is extraordinarily important to do. One of the most important things humans have ever done is empowering women to be full equals, to be fully human, right? But along with that, there hasn't perhaps been as much attention as we could have paid to: what are we gonna do with all these men? They can't live how they've been living. So how are they gonna live? And so I have a certain sympathy for conversations about, okay, well, let's talk about what it is to be a man. Let's even market certain things to men, in a way where it's like, okay, here's something that you can consume now, or here's something that you can do now. Because you're sure as hell not gonna be able to do all the stuff that you used to do, because that's mostly not working anymore.

But to do it in this way that just sort of doubles down on: no, I'm a man's man, and I'm gonna just eat more of the red meat, literally and figuratively, that I've always eaten, and just show you my muscles, and think that that's going to be the way forward? It's a shame. They're almost having a conversation that they need to be having. But in fact, they seem to be doubling down on the worst parts of it. And so it's a huge missed opportunity and kind of infuriating.

Justin Beals: Because, to your point, this Western idea that we must win everything, a competition with the archetype of gender layered into it, is coding the language for: you've got to be a winner. And then, along with it, comes all the baggage.

And at the end of the day, I think what you're expressing, and what I've tried to express, is that on both sides of the conversation, we need to think about lifting up humanity. We're participants together; we need to change how we engage, because it's been wrong. I think we in the tech industry had a big heartache about hiring more diversity onto our teams for a decade. And I, for one, really strove to work hard at it, to create a bigger, broader team.

But we had challenges where people thought of it as a battleground and not an area to build a new cultural archetype together. It was challenging for all of us, you know.

Greg, I think the one thing that I'm kind of curious about is just the reception for the book. As you've talked to people and met with them, how do you feel now, being able to share your new cultural archetype? Coming back around. Yeah.

Greg M. Epstein: I have a lot of mixed feelings about it, Justin. I'm really glad I did it. I'm really glad to have written Tech Agnostic, and there have been some really extraordinary conversations because of the book. And, you know, I love talking with you and people like yourself, being able to dive into what's really wrong so we can try to get more of what could be really right about being human in this kind of digital age.

I would say that the most important conversations that I'm looking forward to having now are probably with religious leaders and communities, including humanists like myself, even though I identify as non-religious. I want to convene people, whether traditionally religious of one kind or another (Christian, Jewish, Hindu, Muslim, Buddhist, etc.) or coming from more of a moral or spiritual perspective in a community, like a humanist or what have you. People who care about what it is to be human, and care about coming together with other humans who care about that. To say: look, we're in the midst of a revolution, we're told, but revolutions are often pretty bloody, sometimes for the good and other times not so much.

And so what kind of revolution do we want to have? And when can we collectively say enough is enough, enough is too much? That is the kind of conversation where, several months in (what are we, like seven months after the launch of my book now?), I still feel like I'm just scratching the surface of what kinds of conversations I could have with people, need to have, want to have. So it's sort of daunting, because it's like, yeah, I could have a book, and I spend five, six years writing it, and it wins a few awards, and I get to have a lot of cool conversations. But I don't know. I'm kind of meaning-driven. I'm like, but what is that all for? You know? And so it just seems to me like there have been so many scary and uncertain changes in the world in these past five, six, seven, eight, nine, ten months that I don't know where it's all headed, to be writing about and talking about these things. But I do believe that I can have conversations now with people who are sincerely faithful people about what we could call hubris.

I mean, a technology that is trying to marshal trillions of dollars of resources to do things like end death as we know it, which is what you'll hear from people like Ray Kurzweil and other singularitarians. These are not uninfluential people; many people will tell you Kurzweil was a leader, or even the leader, of the design of Google's Gemini. End death as we know it, create trillions of digital beings, annihilate all human beings within this century if it goes wrong.

That we have to invest everything that we have in these trillions of beings that AI will create in the future, Marc Andreessen will tell you, because if we don't, then that is akin to mass murder. Andreessen says in the Techno-Optimist Manifesto: we believe any deceleration of AI will cost lives; deaths that were preventable by the AI that was prevented from existing are a form of murder. And so there is a real fear, I think, that I have, and that I think other people either would admit to or should admit to, that we're on a train that's driving real fast. And if it gets to where it says it's going, then there'll be a lot of gold at the end of that track.

But we are at serious, serious risk of falling off the track between now and when we get to that pot of gold. And yet not in the effective altruist, "my paperclip machine will wipe you all out, so give me all of your billions and even trillions so that I can effectively fix it" kind of way. No, it's more like, you know, the train is just going to break down and we're gonna be left in a swamp kind of way.

Justin Beals: There certainly is a lot of swampy information. That's a good description of the information landscape with all these models lately. That quote is so disappointing, and I think it takes us far away from the ethos that I started in this industry with, which was: can we harness this technology to make people's lives better?

Greg M. Epstein: I mean, you did it, but also not. I mean, you know, it's like, yeah, the thing can read X-rays better, or imaging better, or whatever. It can design a new cancer drug, but it can also design a bioweapon. It can teach people how to build a dirty bomb. The more capable you make it, the more good and more bad will come of it. And both the good and the bad will end up in concentrated communities. We just know that from human history, right? So if you're concentrating more and more of the good stuff in the hands of a certain group of people, you might make those people's lives, at least for a period of time, radically better than they've ever been. But at what cost? To whom?

Justin Beals: Well, and they seem to think the impact of their decisions on us compared to the scale of what they can achieve economically is inconsequential.

Greg M. Epstein: Say that again, wait, one more time, I wanna hear that.

Justin Beals: Yeah, they seem to believe that the scale of their economic kingdom is so much more important than the lives of individuals. That the impact of the way they use this technology is inconsequential compared to: I need to be that trillionaire. I need to amass this level of wealth and power.

Greg M. Epstein: Yeah, but these people, they don't really underestimate their own humanity, their own life, right? Like, okay, I get it. William MacAskill, the effective altruist philosopher, was, I guess, taking a low salary, and, you know, did he not have a big house? I don't know.

But it doesn't matter, because he and his friends bought a $20 million castle. And the effective altruists bought him a $10 million PR campaign for his book. So that's a lot of benefit to that one individual. A lot of benefit, especially when you think: these people are existing in an ecosystem where the number one currency is being told that you're smart, because that's what you learn as a defense against all this other stuff that comes your way as a human, right? So he got tens and tens and tens of millions, if not billions, of dollars' worth of the currency that is being told how smart you are. So it's not like he's not benefiting from it. It is in many cases a straight-up:

I'm going to benefit myself, and I'm not going to really give a shit about what happens to you or I'm going to see you as collateral damage or as a resource or whatever.

Justin Beals: Yeah. Well, without a great community, we all struggle with good security. And I think that's why we've been so excited to keep talking with you, Greg. The cultural connectivity that could make life better, us connecting as humans, being able to have more meaningful experiences culturally as humans, whatever the community, is what I kind of hope for. And I think it gives us some guidance in our tech industry about how to consider what we should want to build. And I guess I have to hold out hope that change is deeply possible as long as we work towards it. These conversations are a good starting point.


Greg M. Epstein: Yeah, thanks, Justin. I mean, the last thing I want to say, and one of the reasons I really enjoyed our first conversation, is that I realized I'm really glad to be on a podcast that allows somebody like me to talk about this stuff in the name of security. It's a kind of awkward time, because I'm about to make a couple of announcements that I can't quite break on this podcast about what I'm doing next, where I'm taking the research that I put into Tech Agnostic. I'm fairly excited about it; it's just not quite there yet. But I think there's something really quite beautiful about the idea of talking about and thinking about and working on security in all the senses of the word, right? So yes.

You know, we live in a digital landscape right now, but locksmiths and police and militaries, and all the endeavors that go along with those things, have always been an important part of what it is to be human. We've always had militaries and some sort of policing. I don't know that we had locksmiths for all of time, but even in the Bible, this might be too racy for your audience, Justin, but there's a moment in the Song of Songs, Shir HaShirim, the Song of Solomon, this beautiful, kind of sexy love poem in the Hebrew Bible, where there's an allusion to intercourse in which the biblical poet talks about a door bolt locking and unlocking. So even back then they had locking mechanisms; they had security. Or, you know, the Trojan horse, right? People have always needed, or wanted, to think about external ways to be secure.

But true security has always been inner security. True security has always been psychological, and even what you might call spiritual, security. The idea that people who are better at sharing with their neighbors, and at creating a situation in which their neighbors have comparable life quality to their own, will experience a greater long-term sense of security, right? Where it's like, "oh, I know that person. Their fate is tied in with mine. They're not likely to want to come for me and everything that I have, because they need me just as much as I need them."


And, you know, if there are certain numbers of human brains that are just sort of broken and sociopathic, if somebody has no conscience, or somebody's brain evolves in a way that they don't appear to have a conscience in the same way that I do and you do and we do, well, then I've got a lot of friends from across different sectors of my society who will come together and help me deal with that person. We'll all be in it together, right? That's true security.


So I would just say to your listeners, your community, who I'm sure are thoughtful people: most tech people aren't these sort of crass, manipulative, let-me-win-at-everything, let-me-become-a-trillionaire kinds of people. Most tech people are just people who get good at a skill set, go into a field, and get a job, and now they're doing the job, and they're just humans, right?


But we're talking about, you know, who tends to evolve to be the leaders of these fields that have evolved to be the leading institutions of our entire society. And so I would just say to your audience: what are you doing to be a secure human, and to build a multi-stakeholder secure community that will maybe eventually put you out of a security job, but in the meantime will make you a much more trusted authority within your field? It will also give you a sense that the life you're living as a professional is a life worth living.

Justin Beals: Greg, truly inspiring again. And actually I needed this conversation this Monday. It's been timely for me as a human being, and I'm always grateful to connect with you. Thanks for joining us today on SecureTalk.

Greg M. Epstein: Thank you so much for having me. It's a pleasure.

About our guest

Greg M. Epstein serves as the Humanist Chaplain at Harvard University and at the Massachusetts Institute of Technology (MIT). For nearly two decades, he has built a unique career as one of the world's most prominent humanist chaplains — professionally trained members of the clergy who support the ethical and communal lives of non-religious people.

More recently, Greg’s 2018 move to join MIT, in addition to his work at Harvard, inspired an 18-month residency at the leading Silicon Valley publication TechCrunch, in which he published nearly 40 in-depth pieces exploring the ethics of technologies and companies that are shifting our definition of what it means to be human, often in troubling ways. Greg's book "Tech Agnostic: How Technology Became the World's Most Powerful Religion, and Why it Desperately Needs a Reformation" (MIT Press) expands on this work.

In 2005, Greg received ordination as a Humanist Rabbi from the International Institute for Secular Humanistic Judaism. He holds a B.A. (Religion and Chinese) and an M.A. (Judaic Studies) from the University of Michigan, Ann Arbor, and a Master of Theological Studies from Harvard Divinity School, and he completed a year-long graduate fellowship at the Hebrew University of Jerusalem.

Justin Beals, Founder & CEO, Strike Graph

Justin Beals is a serial entrepreneur with expertise in AI, cybersecurity, and governance who is passionate about making arcane cybersecurity standards plain and simple to achieve. He founded Strike Graph in 2020 to eliminate confusion surrounding cybersecurity audit and certification processes by offering an innovative, right-sized solution at a fraction of the time and cost of traditional methods.

Now, as Strike Graph CEO, Justin drives strategic innovation within the company. Based in Seattle, he previously served as the CTO of NextStep and Koru, which won the 2018 Most Impactful Startup award from Wharton People Analytics.

Justin is a board member for the Ada Developers Academy, VALID8 Financial, and Edify Software Consulting. He is the creator of the patented Training, Tracking & Placement System and the author of “Aligning curriculum and evidencing learning effectiveness using semantic mapping of learning assets,” which was published in the International Journal of Emerging Technologies in Learning (iJet). Justin earned a BA from Fort Lewis College.

Keep up to date with Strike Graph.

The security landscape is ever changing. Sign up for our newsletter to make sure you stay abreast of the latest regulations and requirements.