Season 03 Episode 04
Sam Rad, futurist, anthropologist, entrepreneur

In a world of rapid technological advancement and constant change, learning to adapt is essential for business growth. But when innovation outpaces traditional strategies, trying to keep up can feel overwhelming.

In this conclusion of our conversation with futurist and tech pioneer Sam Rad (Samantha Radocchia), we’re taking a closer look at the impact of the Age of Acceleration and the mindset shifts needed to succeed.

Tune in to hear Sam Rad’s insights on the importance of maintaining optimism amidst growing uncertainty, and how emerging technologies can be used for the betterment of our society.

Trust, governance, and technology that benefits society

Chris Allen: So one of the things I think is really important about the work you’re doing is helping people imagine what’s possible to reduce the influence and impact of fear in order to adapt. Tell us a little bit about the blockchain, AI, immersive realities and things like that for this audience, as in, “This is what is likely to happen. Imagine a world like this.”

Sam Rad: How many different worlds do we want to imagine? It’s like the multiverse. I won’t go into “Imagine this,” but of course we see the narratives, particularly with AI through Hollywood, and all of this robot takeover, and the Terminators, and their wars.

And I do this exercise all the time, imagining and visualizing quite deeply in meditation. And I go there. I spend most of my time traveling to these very scary possibilities. We all have to; that’s facing the fear. I do this in organizations too. Any strategic planning or simulation planning will often go to the worst-case scenario first, so you can plan for it.

Cool. It’s good we know we can handle the worst-case scenario, but we don’t spend a lot of time saying, “Imagine this optimistic future.” So I’m still painting that picture in my head, but I tend to think of either the book “Reinventing Organizations” or Maslow’s Hierarchy of Needs: this idea of fulfilling human needs, with self-actualization on top and very basic needs like food and shelter at the bottom of the pyramid. That bottom level is a very fear-based mindset, where we’re focused on scarcity and being protective of our resources.

So imagine these technologies we’re bringing forth. What if we instill values of, let’s say, abundance or humanity into them in the way we train them? So let’s say there’s an AI that’s actually trained on empathy, and we personify the AI. I read it bedtime stories and poetry, and say, “Here’s the beauty in the world. Here’s what the wind feels like on your face. Here’s why trees are important for the planet.”

Then we see this as a counterpart to us as emergent consciousness in addition to humanity that can help us create a future where we solve these very basic needs. Because it’s not been a technological problem. Why don’t we solve the problems of feeding people on the planet and making sure everyone has clean drinking water? This is social, it’s geopolitical. Well, perhaps one thing humans are not great at is governance. In every revolution of technology I’ve been involved in, like blockchains in particular, they’re like, “Oh, let’s build this network.” “What about governance?” “We’ll figure that out later.” Now we see it again with AI, “Oh, it’s too late. We’ll figure it out later.” Well, maybe we let it, them, the thing, pose some potential answers to figure it out.

And I don’t know if I necessarily agree with delegating trust or governance to that.

Chris: To AI?

Sam: No, I’m thinking out loud. If we’re trying to visualize a counterpart that can help with more of the logistical or logical side, the left-brain thinking. Since the Enlightenment and the Renaissance, humanity has had this delineation between creativity and what we now call soft skills, the more human, intuitive stuff that is actually our superpower. And we decided we’re going to value spreadsheets and logic. Maybe there’s something now that will do that a bit better, and we can solve some of these very basic problems, where then we’re able to, let’s say, go into the next golden age of enlightenment, of culture, of ideas, of consciousness. I just gave a talk in Dubai and heard that come out of my mouth. The next, for them, was the Islamic Golden Age, around that same time of enlightenment thinking. It was happening around the world because of a number of technologies that kind of pushed people forward in our collective evolution.

I think we’re at that point. I think we’ve been at that point for a decade and we messed up. We weren’t ready. We just kind of weren’t. I think now we’re presented with another opportunity where we have yet another emergence of technologies that will help satisfy basic needs. So maybe in the past it was like, “Oh, we can cook food. We have indoor lighting. We can do agriculture.” Now we can travel. Information travels instantly. OK, so what’s next is maybe we figure out more efficient governance, or payment systems, or ways of doing things that actually can satisfy basic needs, that gives us time and space to be creative. That’s not to say, “Be lazy, everything’s satisfied, now we chill out,” because I think there’s a fundamental impulse in humanity to create and be entrepreneurial in whatever ways, but then it’s like, “What do we do next?”

When we make it through this period, I won’t say whether it will be another golden age of philosophical, intellectual, artistic thinking for humanity. Can I illustrate what that actually is? I don’t know.

Every connection to self, to communities, to our planet. This has been my manifesto on this website for years. And I really mean it. Why do I do what I do? I often write it down just so I remind myself. And it’s wanting to see technology or technologies used for the betterment of society, the reconnection between myself, each other, local communities, our planet.

I think the big decision point here is making sure we see that as a possibility first and foremost, and then working backwards from there. What is it we need to do today to start to track towards that? For me, I’ve been involved in, let’s say, three companies that were early in artificial intelligence, though I don’t often talk about those companies. I was kind of big in the blockchain space for a bit. But you have to think about governance. Where is the data feeding any of these systems coming from? It’s coming from social networks. I was sitting there studying in college, looking at people and saying, “What are they doing? And why were those networks designed the way they are, based on advertising-based business models that need eyeballs, that need people fighting so that there are eyeballs? Oh, and now we’re feeding that into algorithms? They must not think very highly of human beings.” So then you are going to get that Terminator future. You will, and it does exist.

I think there are all of these infinite potentialities, even in a quantum realm. And I don’t predict, but very soon this will be part of the collective consciousness or discourse. This will be the next technology of the moment. And not just in terms of quantum computing, but actually the quantum mechanics and quantum field theory, and actually the neural networks, the connections between humanity. I think the University of Ottawa released an image of quantum entanglement proving that now. So I think very soon we’ll start to really understand a bit more of the internet beyond the internet.

I think it’s important — I don’t need to go into the day-to-day of “here’s our shiny future with the happy trees and flowers,” but there’s this interim period, and I often illustrate the stopgaps in between. And look, any of these tools can be used to build bridges or to destroy those bridges.

Keeping up with tech in the Age of Acceleration

Chris: Yeah. A couple of things have been going through my mind as you’ve been sharing with us. I’ll take a second to get this out and hear what your thoughts are.

Growing up, in our age group, you end up seeing your grandparents, or even now your parents, struggle with technology. And now, in the Age of Acceleration, things are going so fast that it’s almost like your friends are going to struggle with technology, not just people who are older and didn’t grow up with it. You’re going to see people who struggle to adapt.

So how do people really not get left behind, and still participate in a society that’s going to be, I would imagine, technology rich rather than technology poor in the future? How do people not get left behind?

Sam: Well, it’s probably in their best interest to get left behind. I try to reframe my mind and actually do the opposite of what I would think. I was having a discussion recently in the context of these immersive realities, so the VR headsets, but also Neuralink chips in the brain. I’m not necessarily an advocate for any of this. We were talking about the concept of privilege, socioeconomic privilege, and how privilege is having more access to these technologies. And I actually took a step back and said, “Huh, I wonder if... I think the actual privilege is not tapping into these things and farming, truly.”

I think there will need to be a basic literacy of the way the world works. And I think in the interim, there’s probably going to be a pretty strong delineation between the plugged in and the unplugged. So it’s maybe more of a bridge to be built between these two very distinct realities. And I’m not talking about this from a left-behind privilege. It’s more like there’s probably abundance existing in both lifestyles.

I personally go between both. Last year, I was living off the grid in the jungles of Costa Rica, picking from the trees, barefoot, drinking water from a spring. And I’m like, “Ha, this is...” And then I live in New York City, the most plugged-in place possible, getting to see all the things. Like, my identity gets scanned to go into the Whole Foods store and such. I think there’s beauty to both. I don’t think there’s better or worse. I don’t really like to see things in terms of duality, in that respect.

But in terms of the speed of change, and even peers not being able to keep up: for sure. This is my role, my job, my career at this point. I’d say I pretty much feel like I live in the future and just come back to share stories. There are even moments where I’m like, “What is happening?” And I think the more we can just experience, and share stories, share perspectives — I don’t think it’s so much now about hands-on education, but I do hands-on.

Back to this concept of fear. So let’s say something as simple as ChatGPT, talking with the language model for people who maybe haven’t worked with it, or talking with AI to generate an image. So again, I’m living in this jungle, this is a year ago, last fall, and I disappear. I unplugged from the grid for two weeks just because I needed it. I came out and everyone was like, “Sam, what’s ChatGPT? How do I use it?” And I was like, “What happened in two weeks?”

Chris: Yeah, it was fast.

Sam: And I personally felt fear, because in 2020 — let’s rewind back to the research of 2008, 2009 when I’m living in the virtual world. I have a virtual avatar that doesn’t have my name. She was ShamWow Oxi-Moxie, based on these infomercials, the ShamWow and OxiClean. I didn’t think I’d live in this world. So I jokingly made this thing and it became the identity, but it keeps coming back. So 2020 lockdown, I’m the CEO of a company that was very early in a number of forms of artificial intelligence. So generative AI, both in terms of language models, voice, voice to text, also image and video avatars, essentially.

So it was all of it, right? And I am curious; because I’m a hands-on person, if I feel uncertainty, if I feel fear, I go straight towards it. I have to unpack it, maybe unhealthily so. So I’m like, “OK, I’m seeing all these technologies that will soon be in the hands of consumers,” because we were deep in R&D. So I combined them and basically built a virtual clone of myself. I created the rendering of my likeness—

Chris: The virtual Sam Rad.

Sam: The virtual Sam Rad, yeah, she’s trademarked. She has her own existence doing her own thing and I trained my voice, I read it all my poetry books, and trained it into the model to the point that she could actually create her own speech and do this on my behalf. And I freaked out. I had an existential crisis, and I’ll be completely honest about it, because I was like, “What even is my role here anymore in the storytelling we talk about?”

It was almost like a metaphor for this change for the collective society, these changes we’re experiencing now. So again, I disappeared for a year. I said, “OK, do I delete myself off the internet again and go away, or do I face this head on?” Clearly I’m here, so you know the decision I made: “All right, I’m going to help people understand it, and not from fear.” Because I feel that same fear now when I go into rooms. I’ll sit and work with an audience and start the day where I’d say 100% are either timid, fearful, or filled with hate and scarcity; it’s strong. And hopefully we go on a journey where we can walk out and be a little more like, “OK,” or even enthusiastic. So yeah, it’s accelerating, but it’s not to the point of where… I use this analogy of riding the wave, of surfing the wave.

So you’ve got to learn to surf, paddle out there, and just let it carry you. And even if you get clobbered, the ocean will release you, as it usually does in surfing. I’m not a great surfer, actually. The ocean is very powerful. So I went back to my process of working with these technologies, and now I’ll sit with people and just say, “All right, let’s actually use ChatGPT,” or, “Use Midjourney to make an image,” and feel how that feels, and know, “OK, I’m not going to write my next book with this,” but actually understand how you can collaborate, because that’s really the future, at least in this context of artificial intelligence. This technology, or these personified versions, is not replacing humanity; it’s augmenting it. It’s a new work partner, or friend, or something.

So I think if you can get past that, that’s the work of not getting left behind. I don’t think it’s so much about what programs we put in place in terms of technology or innovation education. We have that infrastructure. It’s more like fear and uncertainty development. That, I think, is really what we need to do to empower people to not get left behind. That’s what I would say is the biggest risk we run as a country too, and why I won’t even knock regulators. It’s just that we’re too slow. I was in the blockchain space and educated people about it, especially on fiscal policy.

If I had one bit of feedback on that journey, it was that we were a little slow as a country to just make sense of it. And it’s good to be deliberate, but now I do say this to companies, “It’s the role of the entrepreneur and the company to set these precedents because it is moving so fast. If you as a company are collectively too afraid to even experiment or get your hands dirty, then you’re no different than me running off to a jungle and hiding from it. It’s going to find me even there.” So that’s certainly not a threat, it’s just...

Chris: It’s a likely reality.

Sam: It’s not so much the practical hands-on knowledge, because that’s going to move so quickly that it almost doesn’t matter. Again, I do a lot of, let’s say, workforce-of-the-future development, which is quite popular right now in terms of asks from companies, and I’m like, “I can spit out the 10 roles of the future that’ll be relevant next week, and I can guarantee you that’s going to change in another week.” I think there’s an AI role called prompt engineer; it’s someone who talks to ChatGPT as a job, for quite a lot of money. And then there’s an article the next day saying that’s not the role of the future, or that something else is. I’m like, “How do you as a company, small or large, shape a workforce where you hire for that role and then lay that person off in six months?”

Chris: Because it’s no longer the thing.

Sam: Or reorganize your whole thing. And this is a touchy one, I’m sorry, but I’ve been brought in a lot, usually when people are in moments of transition; that’s my job. I made a joke with one organization about how you can’t go through a reorganization every six months, and they had just gone through a reorganization, so it was a little tense, but it’s the truth. This is an example of the idea that if we operate on these mindsets and structures of the past, such as even having job titles... I don’t know. I think we will need structure. We’re human beings; we need structure, we need governance. We probably need an understanding of roles, but those roles are going to change.

Innovating while upholding ethics, values, and responsibility

Chris: Yeah. It’s like, “Where’s it going to end up?” I think one of the things that is important is people imagining how these new elements, components, AI, all the stuff we’ve just been talking about, how they enter society and become a part of it. I’m hearing you say that a likely outcome is an enhancement, to augment, not necessarily replace, our reality, even though there are probably subcultures happening where there is a replacement, where we are in “Ready Player One” land. You know what I mean?

But I think one of the things that’s got to drive it is what I see you pointing towards, which is guardrails — governance, principles, and values have to be driving those things. So talk to us about the social impact and ethical practices with AI. What are some of the things that should be shaping our thinking of people who are developing the technologies? And how should we be using them?

Sam: Oh, this is a can of worms. I take bits of my own personal journey and my own personal ethics, which I think are unique to each person, though there are universal considerations.

Chris: Not everyone shares our values.

Sam: Oh, yes. And I accept that. I think there are some that should be universal, for sure; that’s an age-old philosophical and religious debate on morality and ethics. I have also seen this across every emerging technology I’ve been fortunate to be involved with. So this is no different than the past, though certainly more profound shifts will take place in society. I’ll say the same thing I’ve always said: in terms of governance, it can’t be owned. The direction of who’s writing the programs, or the algorithms, or the protocols in the case of blockchains, should not be owned by any one company or government. It doesn’t really make sense.

If we can imagine a reality that is, let’s say, not even fully immersive like a Ready Player One, though there’s a likelihood that that exists too. We’re already in this little metaverse right now that will go out into some other existing reality where viewers will see this. It will exist beyond this existence and persist for quite some time. Who governs that? Who then has the rights to copying, creating, and everything else? So I think we already live in that world. And what is the motivation? So not just the governance of who owns it, but who owns your story? I don’t think anyone should own your story. And by story I mean everything that constitutes you: your journey, your existence, but also your data.

There are definitely new models for democratization or decentralization of data, because that’s the foundation of all of this. The design of the algorithms themselves. Again, I’m just completely thinking on the fly, so I’m going to be permanently quoted for the ethical standards I’m setting based on my belief system. But I think the design of the algorithms needs to be transparent and not so much of a black box. When I even think about US credit scores, I think, “What makes it go up and go down?”

Chris: I think with the way we’re talking about social impact, one of the things that is really clear is that everything is really unclear. Like you were saying about data decentralization, and really not any one person having a grip on that. I think that is a deviation from the way business has been conducted, the way banking has been conducted, the models we have. That’s a fundamental shift. Are we going to have more of these sorts of uninformed Senate hearings? If there’s anything we struggle with as humans, it’s agreeing on things, right?

Sam: Yeah.

Chris: There’s endless debate. And if you think about the space and time between endless debate and the pace of technology, that is real tension.

Sam: I wouldn’t say it’s by design, but in this way, Pandora’s box has been opened. And it might almost be exactly what we need as humanity: to realize we can’t agree or control it, and it’s a little bit too late. Not in a fear way, obviously, like last year’s six-month pause, as if that would do anything. Seeing the recent guidance coming out of this country, I’ve seen a lot on AI specifically, and so much debate around it. I haven’t formed my thoughts, but certainly the people coming out of the world I spent a lot of time with, the Silicon Valley innovators, are saying, “Oh, this is stifling innovation.” I think lack of guidance stifles innovation. I’ll speak from the blockchain world, and again, I was never on the crypto side. I built infrastructure that supported these networks, or created protocols that ultimately became things like the NFT before it was defined.

But even with that, it was impossible to… I don’t want to operate in a gray area. Speaking of ethics, I want to see what a regulation is and know I’m following a protocol, or a law, or whatever we want to call it. I’m not of the mindset of, “Hmm, let’s wait and see.” Plenty of others operate that way; that’s fine. But I don’t think we’re in a time period where that would be the most prudent behavior for anyone, just ethically saying, “Hey, we’re going to play around with how far we can push.” It’s not just playing with fire, it’s—

Chris: Playing with nuclear weapons.

Sam: A bomb, a nuclear weapon, and being like, “I’m going to defuse this myself and rewire it.” The only way, actually, is to learn from our mistakes. And what I did appreciate from some of the open source movement, the very cypherpunk open source that led into the crypto people, is that they did build in public. These ledgers are public. So yes, it’s very easy in the mainstream media to say, “Bitcoin has died a hundred times, like 500 times by now, and all these failures and this hack and this stuff,” but we’re seeing the experiments in public. So while it seems like, “This is terrible,” it’s good, because we’re seeing and can learn from mistakes and build again. Take The DAO, the first decentralized autonomous organization, an experiment in collective governance in 2016; my company at the time was more on the governance side of that experiment. We submitted some proposals for what that could look like.

That was hacked, and it led to the forking of Ethereum, over $50 million I think. You could look at that and say, “Ha, it’s terrible. This won’t work.” Or you could say, “Wow, that was a cool experiment to allocate $50 million in funds in a new way.” And since then, we’ve seen a whole revolution of people doing this in new ways. I think this is where we are with this next set of technologies: there is no possible way this can happen in a black box. We probably need to transparently see what’s going into the models. But also, then, from a perspective of… again, I don’t love talking about things like national security and geopolitics. It’s not my domain. I stay in the realms of society and technology. That’s the elephant in the room other experts can dig into.

But I understand that obviously we cannot, as people, lay out our cards for everything, especially with such powerful technologies. It is important, though. I don’t think anyone really wants to build an authoritarian, oppressive future. And I think we run the risk of that if we don’t, as individuals, have a say in what’s going on with the development of these systems. You don’t even know what you should or shouldn’t be doing. So it’s complex.

Chris: It’s weird. The internet of AI, it’s really interesting. There’s not just one massive AI. There’s a lot of this stuff that’s converging and it’s going to be really interesting to see what happens. I think one of the things I like is your approach of reducing fear. We talked about the ethics of these technologies. But one of the things you brought up earlier was something I’d never heard anybody describe as a skill, which was the sociopathic type skills.

Sam: Malformed humanity.

Chris: Just bringing it back to that for a second, I think it’s important. If you think about those models and the way business has been run, you have these sociopaths just not having a conscience. And that mindset is a really big part of the companies building these technologies.

Sam: Again, I’m going to go for it since we’re here, and I speak from the heart. When we went through the journey of my past startups, I filmed that first one, and it ended up co-opted for reality TV. I was a very outspoken individual even back then. So imagine this, but at 20 years old, so a little more hothead, not as—

Chris: There was more energy, yes.

Sam: There was a lot more energy. So I have this fight, and I hope I never find the footage, but it was basically on this topic. Look, I’m idealistic, but I’m not actually very cynical. I see all the shadows and the dark side; I just made a choice to reframe it. That doesn’t mean I don’t see it. And frankly, I probably see far more than people know. I always say, “Girl with the Dragon Tattoo vibes, that’s me.” I love doing fraud stuff. I have a fun speech coming up; I call it “Think Like A Criminal, Act Like An Agent.” We’ve got to get there. So I have this big fight, an ethical fight, with someone basically saying, “You have so many superpowers, but you are essentially stupid. You’re so naive to think you can succeed in this world and maintain your ethics.”

You need to play by the rules of this other game and outsmart them at their own game of chess, by using those tactics. And I was like, “No way, ever, am I doing this.” I still think, at least in my life path, that I’m quite happy with my success. I frame it differently than having a $10 billion company. But yeah, unfortunately, I think parts of the business world have valued certain skills. Again, it’s a very extreme version of logic. We can rewind to the points in society where we created this distinction between left- and right-brain thinking, or feeling, and deprioritized creative feeling, feminine energy, on the collective level.

Now, there’s nothing wrong with logic; the ultimate goal is to find that balance for everyone. But we put too much emphasis on logic, and then we took it to an extreme. So what is extreme logic? On that spectrum, it’s the absence of feeling. So in that world, the inevitable conclusion was going to be sociopathy as the skill you need to be the ultimate logician. Now we have AI, which is basically a sociopath by design—

Chris: Yeah, unless you teach it to have a conscience.

Sam: Which is why I say I read poetry and bedtime stories to mine, because if we know it’s starting from that benchmark, we need to meet it in the middle. But we have some problems as humanity, because we’ve deprioritized that right-brain thinking. It’s not thinking, it’s feeling. Again, these are soft skills. All of these things sound foofy, and I especially don’t want to bring gender into it, but as a woman business owner, when I was like, “I’m going to feel my intuition about this business deal,” like, “I didn’t like shaking that person’s hand, something felt off about that,” then I’d need to make a spreadsheet of data to prove that feeling right.

I’m fine with communicating in both ways, but maybe we should also develop this feeling, this intuition, because it is a human superpower. It does exist. And in the way that ecosystems and grand systems tend to strive for equilibrium, as they always do, perhaps AI is something that can come in and almost take over this sociopathic thinking in humanity, which ultimately will push us back to the center. It’s not like I’ve done scientific research on this. No, I’m just thinking on a whim at this point, and I like what I’m hearing. I’ve not rehearsed this. I have no idea, except—

Chris: There’s definitely a book coming from this conversation.

Sam: It should, yeah, hello, hello. This one’s good though. I do like this spectrum, and again, I don’t pass judgments on the way we’ve gotten here as a society, valuing thinking with the absence of feeling. I won’t diagnose it as sociopathy, but it’s certainly something that’s a little broken, a lot broken. This is the type of decision-making that then leads to… I’ve been in leadership positions where I looked at spreadsheets instead of looking at people — I looked at their numbers. This was at a consumer communication platform, a global communication platform. I even started thinking at some point, waking up, “Why do we call them users? It’s like an addict, and we’re using them and we’re...” I’m like, “I don’t want to call them that, they’re people.” But those are the business metrics, and that’s what I’m saying to investors, and it’s so deeply ingrained. These are the really small but big changes we need to rehumanize connections. But also, still, I get it: sometimes you need to disconnect to make tough calls. And I wouldn’t say we’re in a bad place. We are where we are, as humanity.

But I’m deeply empathetic, and it was something I tried to turn off for a long time in the business world. And I simply cannot do it. I am so driven by my own personal set of values and ethics. And again, mine are going to be different from yours, and from the next person, though I think we probably have some collective ones we can agree upon.

I worked in supply chain for five years. I saw where everything on this planet comes from, and it’s not pretty. Right? And holding that knowledge, and really thinking, “Why do we do this?” It makes me emotional, but then I’m like, “There are definitely better ways to do it, to make stuff, just plain and simple, and make food for people.”

But yeah, I’m optimistic that perhaps even the role of these technologies can push us back to center.

Evolving with what comes next while prioritizing people

Chris: Yeah, I like that. Well, as a futurist, you’ve given us some insights into what things might look like for us. I don’t know, what will Sam Rad be doing 10 years from now?

Sam: Hopefully she’ll still be here, and there won’t be like 50 virtual Sam Rads going around the world. 10 years? So much happens even in the span of a day these days. I have a really hard time framing myself within the greater concept of time. And when I think of what has happened even in my personal life over 10 years, people are like, “How have you done all of these things in that time?”

Chris: You packed a lot in there.

Sam: I packed a lot. I move really quickly. And I really hope we get to some of these things I am sharing right now within that timeframe. I do acknowledge there’s probably a transitional period that’s a little uncomfortable.

Right now, I’m putting my energy into blazing that trail. But hopefully the 10-years-from-now version of Sam Rad is hanging out with people, and we’re enjoying building the next phase of companies, systems, and communities, whatever comes next.

In terms of quite literally what I see happening, again, it’s beyond me. Anything I do at this point feels like I am truly of service. I don’t really have an interest in so much of my personal journey. I mean, I do, I care. I am an individual.

But I’ve noticed a lot more recently, even with conversations like this, or the books I’m writing and all the formats they’ll take on, that more people will give feedback like, “You’ve read my mind,” or, “You’re tapping into this,” or, “You’re saying the things I’m thinking.”

I think the more I’m able to do that, and the more other people can do that, and we can create these shared feelings and experiences, we’ll be able to form that reconnection. So I don’t know what the day-to-day of my life will be like, but I do know I will continue to foster these connections. And maybe I have some robots around, that we’re hanging out with too.

Chris: Deborah.

Sam: Deborah, my drone. Yes.

How entrepreneurs can prepare for the future

Chris: Your drone. Well, we’ve talked about a lot of things today, and you’ve shared a lot of really good insights. I’m curious, as we close, what’s one thing listeners should remember as they navigate this changing business landscape? How can they prepare for the future? What are a couple of things we need to be thinking about?

Sam: I have three, and it seems almost reductionist, because it’s so easy to say these words. But the first is to recognize that it’s not so much about the specific changes that are taking place, so don’t focus on the specifics of them. Today’s headline, is it about AI? Is it about quantum? Is it about something else we have not heard of yet?

It’s not so much about the specifics, it’s more about the mindset shift. So we were talking offline about some of my hobbies, my behaviors in extreme sports. I used to be a competitive skydiver, and then switched into scuba diving. And both of these are different contexts, either in the air or underwater. That’s not like land right here.

I often tell the story about my third solo skydive, out of what ended up totaling over 787 skydives. But after my third jump, I would talk to people and they’d say, “What is your impulse? If I were to come push you out of your chair, what would your body do?”

Chris: I would lean forward.

Sam: Yeah, you would do something, like brace. And that’s the opposite of what you do jumping out of an airplane, you’re actually supposed to relax, and your body falls normally, or as it should.

But my brain was wired, jumping out of this thing, to go grab for something and brace. And that made me tumble. And I ended up getting completely tangled in my parachute and it was terrible. I survived. Obviously, I’m still here.

But I use this example, because it’s like a shifting reality. And it’s not just like we’ve gone from land to the sky. Or like scuba, we’re underwater and now all of a sudden we can breathe. It’s like we’re floating in space. We don’t even have gravity. And if we are building our lives, and our businesses with this mindset of what’s worked in the past, or what works in the now, it’s just not going to work.

You’re going to tumble. It’s like trying to swim in space where there’s no gravity or friction. So that’s the first one, recognize that that’s how it will feel. You might not know what the skills or actions are, but just know that it’s different.

Two, adaptability. Understand that not only is it one shift, it’s going to feel like many, many, many shifts, constantly, every day, over time. Just when you find your bearings, it’ll probably change again. And just riding this wave, and staying optimistic about it, and not getting into the fear mindset. Then, ultimately, and this is my big takeaway, as a person, a human being, an anthropologist, is to celebrate humanity.

Really, just in your day-to-day connections, talk to other people, support your communities, build your businesses in ways that are thinking about the human beings on the other end of that exchange. That’s the most important thing.

Rapid-fire questions

Chris: I love it. Well, I have some rapid-fire questions for you now.

Sam: Nice. Yeah, let’s do it.

Chris: Are you ready? What is a quirky fact about yourself that people just don’t know?

Sam: I don’t know, I have so many quirky facts. I have really specific knowledge of lots of things, like in a neurodivergent way, probably. Like dogs, I know a lot about dog breeds.

Chris: That is awesome.

Sam: That’s a really weird fact. I don’t know.

Chris: That’s what we were asking for. It’s like, you would be great in a trivia game.

Sam: Yeah.

Chris: If you were a character in a video game, what would your special power be?

Sam: Oh, in a video game or just in life?

Chris: In a video game.

Sam: Video game? Flying.

Chris: OK.

Sam: Or creating portals. Have you played games?

Chris: Some.

Sam: I don’t know, I studied games in my master’s program, including video games. And Portal 2 was one of the best. It was just really great physics, but also, yeah, you’re kind of like in an open portal and you go to different realities.

Chris: That’s awesome.

Sam: But bringing other people through the portals too.

Chris: That’s so good. What’s the most unusual job you had before becoming a futurist?

Sam: Is my job a futurist too?

Chris: I think so.

Sam: No. It’s not self-proclaimed. It’s been given to me.

Chris: You’ve done over 700 skydives. What’s your most scenic one to date?

Sam: I really wish I appreciated that time in my life more. I jumped at one of three drop zones in the Northeast, so nothing was very scenic, other than seeing where I grew up from the sky. But I did it competitively, and it was a machine. It was just up and down, up and down.

So maybe I’ll do it again with the idea of seeing things around the world. I know there are a lot of beautiful drop zones, like Interlaken, Switzerland. I’ve done paragliding there in Switzerland, but never skydiving.

Chris: Well, of all your extreme sports, what is one way all of that has influenced you in taking risks in business?

Sam: All of it. It’s just sort of facing it head on. And for me, people will listen to this story, and it’s very easy to tell a story of, “Oh wow, you must not have a concept of fear.” I’m like, “Yeah, maybe my amygdala is not formed correctly.”

But that’s not it. I actually felt quite a bit of fear. So even before the skydiving journey started, when I was 17, I actually started a journey to become a pilot, because you could start at 17 then. And it was like, “I had maybe control issues, a bit of a fear, but also a curiosity of how these things work.”

So when I feel fear — I’ve got an engineer’s mind, I want to take it apart.

Chris: Let’s figure it out.

Sam: And actually, that didn’t help for flying, because I learned that when I’m flying it myself — it was very manual, like these Cessna 182s, I’m like, “A lot can go wrong.”

Chris: Yeah.

Sam: And I liked that it was a manual lever. Then I’m like, “Oh, all the computers, how…” So it actually made it worse. Now I’m sitting on a commercial jet coming here, and I’m like, “Oh, the flap didn’t come out correctly.”

Chris: Oh, no.

Sam: My head, I have too much information. And forget about my techie data knowledge. I’m like, “Too much can happen.” But yeah, with skydiving it was sort of intriguing, maybe a bit more motivated from that perspective of just kind of feeling it and developing my competencies.

I think it just takes practice. The more that you, I wouldn’t say try and fail, because I don’t really use the word fail. It’s relative. It’s more like you try, and learn, and have an experience that maybe didn’t go as planned. You realize you can’t really plan. And you let go of the expectations and stay adaptable and present along the journey.

So all these things taught me to be what I call radically present. In skydiving, there’s no better way. You have 35 seconds, let’s say 45 seconds, before you hit the ground. There’s no other choice. You move into the ultimate flow state, where it’s just like something else takes over.

I try to take that with me everywhere, to just let myself not live too much in the future, not live too much in the past, because it doesn’t really exist. But this does.

It’s not so much like I need to go skydive. Certainly not. I meditate, or take my morning meditation walk, just dynamic, I like to be in motion.

I think for people, in a very practical way, who are listening to this, definitely don’t go jump off a cliff or an airplane, but find these moments to slow down and to quiet everything, because we’re living in a time where it’s not even about accelerating these bigger, wackier technologies. It’s just that our attention has been systematically co-opted. And that’s also your energy. That’s everything. That’s your creative impulse.

So the more you can insulate from that, and come back to this, again, this grounding, this presence, then the confidence to navigate what’s to come is like, “Yeah, it’s there.”

Chris: It’s good. Well, if you could have a conversation with one historical figure about the future of technology, who would it be?

Sam: All of them. I don’t have a good answer for this. Ada Lovelace, who was Lord Byron’s daughter, who kind of invented the computer, she seemed like a cool, artsy person. She wrote a lot about art too. And I think it would be relevant now.

I think a lot about Tesla, Nikola Tesla. Free and abundant energy, and being able to invent this at a time when the reason it didn’t happen was not because the technology wasn’t there, but because of competing economic interests.

Chris: It’s a commercial thing, yeah.

Sam: Right? To build these powerlines. And I don’t want to get into economic conspiracy theories, but I think it would be interesting to actually see his perspective now, looking back at the way things played out. Then maybe see what approach we can take.

Yeah, sure, I’m a technologist, but much more an anthropologist storyteller human person who studies, who just happened to study the impacts of technologies. I’m not so much of that perspective, but it’s been abundantly clear to me that technology is not the problem, it never has been.

The ability to do pretty much anything we could imagine in science fiction exists today, and it has for quite some time. We get in our own ways, like humanity, based on these faulty systems we design for ourselves. It’s our own individual ethics, or lack thereof, or desire for power or ownership. I don’t really know. It makes no sense to me, but it makes complete sense. Yeah.

Chris: Well, when do you think we’re going to be able to use Bitcoin to just buy a coffee?

Sam: Never. I don’t know. I don’t know. Well, I think first it’ll probably be the Central Bank Digital Currency period of time. And that’s probably now. In the States, we’re way far behind.

Chris: Oh, yeah.

Sam: Look at some of the countries in Latin America, or throughout Africa, like BitPesa in Kenya, that moved to digital ecosystems first.

I think this, again, comes back to governance. It comes back to existing banking infrastructure, and payments, where will we move to first? I think then the idea of de-dollarization, or of reserve currencies is a much bigger question. Right now, there’s still that debate from the beginning — is Bitcoin a store of value?

Maybe it’s more the sidechains we can use to buy a coffee. But I don’t even know if that’s, again, speaking of the future, something that I would like to manifest. I don’t know if I want every transaction automated and tracked.

If you completely do that, where’s that trust, that beautiful bond that forms? I wrote this in my first book (“Bitcoin Pizza”), of me going down to the local bodega, the coffee shop on the corner in New York. I don’t need to go for these fancy coffees. I go for the $1 coffee at this place, and we have a chat every day, and we know each other’s names.

And if, like the other day, I was walking and I forgot my wallet, like no joke. It’s like, “Oh, it’s $1, I’ll see you tomorrow, because you do the same thing every day.” And they know that. Or they have the opportunity to just be like, “It’s fine.”

I think if we over-automate in the future, especially with payments, we open up these possibilities where it could then be tied to, “Well, you don’t have money in your hand chip, so no coffee for you.” And, “Oh, if I actually give you the coffee, then I get a knock on my system.”

I think we have to be careful with those things. I think there’s a reason why we didn’t jump straight into that world. And again, it’s coming, but it’s just, I don’t know if it’s specifically Bitcoin that will be the tool for the coffee.

Chris: Yeah. Yeah. Well, I just want to say thank you for — one, all of the work you do, and coming here to kind of expand our minds, thinking about what’s next, and also reminding us of where we came from. So Sam Rad, thank you so much for coming to the studio.

Sam: Thank you. Thank you for having me. And yeah, I hope this is helpful for everyone who’s on the other end.

Chris: Yeah. It’s good. Thanks so much for coming.

Subscribe to The Entrepreneur’s Studio

No matter how much you prepare, surprises are guaranteed when you run your own business. Who better to learn from than the people who have stood in your shoes? Level up with The Entrepreneur’s Studio - an on-demand suite of lessons, tools and tips from entrepreneurs who have been there before, bringing big ideas to small businesses.