A conversation with Shawn Yeager and Derek Ross at the Imagine If conference in Nashville, TN. We discuss the current state of digital communications and identity with a specific focus on nostr.
Date: September 20, 2025
Shawn on Nostr: https://primal.net/shawn
Derek on Nostr: https://primal.net/derekross
Bitcoin Park on Nostr: https://primal.net/park
Imagine If: https://bitcoinpark.com/imagineif
(00:40) Future of digital comms, identity, social
(01:41) Diagnosing the problems: incentives, KYC, and broken trust
(03:05) Censorship, shadow bans, and owning your social graph
(05:00) AI, deepfakes, and can we trust what we see?
(07:24) Algorithmic control vs user choice
(10:10) Introducing Nostr: open protocol, healthier engagement
(11:54) Digital health, doomscrolling, and parenting challenges
(15:21) Youth safety on open protocols: responsibilities and tools
(18:22) Give parents the tools: OS-level controls and UX
(19:35) Getting started with Nostr: keys, Primal, and UX spectrum
(21:17) Vibe-coding apps: Soapbox/Shakespeare on Nostr + Bitcoin
(22:39) Permissionless payments and AI-built sites
Video: https://primal.net/e/nevent1qqs0v7rgjh55wygwuc8pmqvk0qz6qts30uaut2c8lp4dgh9usw2cdpgnznwd9
more info on the show: https://citadeldispatch.com
learn more about me: https://odell.xyz
A round of applause for Matt Odell, Derek Ross, and Shawn Yeager. Okay. Yo. How's it going, guys? We'll be talking today about the future of digital comms, identity, social, and open communities. And my good friends, Derek and Shawn, here with us. I think an interesting place to start is diagnosing the problem. We've found ourselves in an increasingly digital world, and the status quo has kinda just been built, like, one foot in front of the other, without really any real planning, and now we're here. So let's start off with Shawn. When you think about the current state of digital communities and identity and social, where do you diagnose the problems existing?
[00:01:58] Shawn Yeager:
I think, as with most everything, as an admitted Bitcoiner, it starts with broken money. And with broken money come broken incentives, and from that flow business models that turn, as we all know, us into the product. And there has been an increasing, I think, drive to try to milk more out of the consumer, the user, and then the advertisers and the businesses. Cory Doctorow has a colorful phrase, maybe I won't utter it here, to describe how these cycles roll out. And so where we find ourselves is not only are we, the user, the product, but I think increasingly we are seen as something to be packaged.
And we see creeping KYC, we see everything happening in the UK with the Online Safety Act, and, yeah, we're not in a good place right now.
[00:03:02] ODELL:
Very well said. Derek, how do you think about it?
[00:03:06] Derek Ross:
Well, I think that over the past few years, we've all come to a place where we know somebody, or we interacted online with somebody, followed somebody, that has been censored or has been shadow banned, something along those lines. It's becoming more apparent and it's accelerating. It's kind of odd to see it accelerating. Like Shawn just said, we're seeing that happen across the European Union, across, you know, the UK. We're starting to see this actually even happen recently here in the United States, where people can have their entire livelihood, their business, taken away because they built their business on somebody else's foundation and they don't own that content.
They don't own their followers. They don't own their entire social graph, and it's disappearing overnight. Years and years of hard work can be taken away from you, and you can't do anything about it, because you built your entire digital life on somebody else's foundation. And it's becoming very apparent that there needs to be a better way.
[00:04:25] ODELL:
Yeah. I think there's a couple of issues that compound on top of each other that result in the current trajectory that we're going down in terms of big tech and digital platforms. So, I mean, you guys honed in on censorship and control, which I think is one that people talk about a lot. So, Shawn, you've been exploring, like, kind of this intersection between, you know, AI and Bitcoin. And the other piece here that is really interesting to me is, like, this idea of deepfakes and verifiability. How do you think about that in the current paradigm?
[00:05:15] Shawn Yeager:
I think, I mean, and just a brief bit of background, hopefully not a shameless shill: the point of Trust Revolution is to pursue two questions. One is, how do we as developed nations find ourselves in low-trust societies? I think most of us can agree, and Pew Research and others would certainly back this up: we don't trust the government. We don't trust the media. We don't trust healthcare. We don't trust education. We don't trust each other. We don't trust across party lines. That's not a black pill. I think it's just observably true. The second, more hopeful question is, how and where can we reclaim the trust that we have given, or that has been demanded of us, and that has been broken?
And how can we build trust where we believe it should be? So that's all to say, can we trust our eyes? To your question, you know, can we trust the media that we see and we consume? I think what's hopeful about that is the ability to utilize public-private key cryptography to sign, authenticate, and attribute media. I think we're quite a ways away from that being large scale. I think once again the incentives are not necessarily aligned for that to be widely adopted, but I think the tools are there. And the big question in my mind, to echo yours, is at what point do we reach this inflection, where there is so much questioning and confusion about whether what I'm seeing is real, that there's a broader adoption of the tools that we do have, like Nostr, and these public-private key pairs, to address that challenge.
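As a concrete illustration of the signing flow Shawn describes, here is a minimal sketch in Python. It uses Ed25519 from the third-party cryptography package only to keep the example short; Nostr itself uses BIP-340 Schnorr signatures over secp256k1, but the sign-then-verify shape is the same.

```python
# A minimal sketch of signing and verifying media with a keypair.
# Assumes the third-party "cryptography" package (pip install cryptography).
# Note: Nostr uses BIP-340 Schnorr over secp256k1; Ed25519 is used here
# only for brevity.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()  # kept secret by the creator
public_key = private_key.public_key()       # published for anyone to check

media = b"raw bytes of a photo, video frame, or article"
signature = private_key.sign(media)         # distributed alongside the media

try:
    public_key.verify(signature, media)     # anyone can verify attribution
    print("authentic: signed by the holder of this key")
except InvalidSignature:
    print("tampered with or misattributed")
```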
[00:06:44] ODELL:
But, I mean, aren't we kind of already there? In what way? In terms of, I think most people, like, when you open your phone, you're like, ah, is that real? Oh.
[00:06:54] Shawn Yeager:
Yes. Like, we're very close, if not already across the chasm, right? Yeah. Which, I mean, and I'll just say one quick thing there: I think, much as in prior waves of technology, there has been the need to create a certain literacy and a certain ability to scrutinize. I hope that it incentivizes and motivates people to become more thoughtful about what they consume and what they question or trust.
[00:07:24] Derek Ross:
I think expanding on what you consume is a unique problem in itself, because what content I want to consume versus what content I'm forced to consume is very different. Yes. Because we are slaves to the algorithms and what these platforms want us to see. We don't really have control over the content. We don't have control over our attention, and that's part of the problem too. So if you didn't want to see certain types of content, it's really hard to not see it using these existing legacy
[00:08:00] ODELL:
social platforms. You're being spoon-fed. Yeah. Yeah. So, I mean, from, like, a productive point of view, how do you mitigate that? How do you actually solve that problem? I mean,
[00:08:11] Derek Ross:
Well, easier said than done. Yeah. It's easier said than done, but we need tools for users that allow them to choose their own algorithm, to choose the type of content they want to see, to choose and curate their own social feeds. Just because Elon and Mark Zuckerberg say that this is the content that you need to see doesn't mean that I want to see it, doesn't mean that you want to see it, but I don't have a choice if I use Instagram or Facebook or X, Twitter. Like, I have to see that algorithm's content. I don't have a choice of choosing, you know, cat pics as my feed if I want to, you know, if I want a feed of cats or whatever it is. Sure, I could browse a hashtag or something like that, but that's not a good choice. We need more user tools. We need more user choice, and there are options out there that give users full control over what they want to consume, full control over their attention. Because that's what these platforms are monetizing.
They're monetizing our attention. Right? Like, we need a way to take that back. It's, you know, what my eyes see. It's my attention. I should be able to designate what gets my attention.
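To make "be your own algorithm" concrete, here is a hypothetical sketch: a feed ranked entirely by user-owned rules (follows, muted topics, boosted topics) instead of a platform's engagement ranking. The posts, follows, and tag names are invented examples, not any client's actual API.

```python
# A hypothetical sketch of "choose your own algorithm": rank a batch of
# posts with user-owned rules instead of a platform's engagement ranking.
posts = [
    {"author": "alice", "tags": ["cats"],    "content": "new cat pic #cats"},
    {"author": "bob",   "tags": ["outrage"], "content": "you won't BELIEVE..."},
    {"author": "carol", "tags": ["bitcoin"], "content": "trying a new nostr client"},
]

FOLLOWED = {"alice", "carol"}   # my social graph, chosen by me
MUTED_TAGS = {"outrage"}        # content I never want served
BOOSTED_TAGS = {"cats"}         # content I want more of

def my_algorithm(post):
    """Return a score, or None to drop the post entirely."""
    if set(post["tags"]) & MUTED_TAGS:
        return None
    score = 1 if post["author"] in FOLLOWED else 0
    score += 2 * len(set(post["tags"]) & BOOSTED_TAGS)
    return score

feed = sorted(
    (p for p in posts if my_algorithm(p) is not None),
    key=my_algorithm,
    reverse=True,
)
for p in feed:
    print(p["author"], "->", p["content"])
```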
[00:09:24] ODELL:
And do you think the friction point with that, because I do think that's the path forward, the friction point with that is it requires a level of personal responsibility from the actual user? Yeah. Sure. How do we handle that friction? I mean, there's some people that just wanna scroll. Right? They don't want the
[00:09:43] Derek Ross:
they don't have time to build and curate their own feed, and that's fine. For that, you have a choice. But the fact that you don't have a choice is the problem. If you want the spoon-fed content, great. If you don't want the spoon-fed content, you want to be your own algorithm and be in control, you should have that choice, and a wide variety of choices. The choices should be open and transparent, and you should be able to decide which path you want to go. And I would say it's also experiential, in the sense that if you're not on Nostr, if you haven't tried Nostr What is Nostr? We haven't talked about that yet.
[00:10:19] Shawn Yeager:
What is Nostr? Well, so like Bitcoin, and I'll let Matt talk to this, it is an open protocol. No one controls it. No one owns it. And therefore, it is there to be built upon. And the reason I mention it is, I think most of traditional social media and communications channels, one to many, they are not only monetizing our attention, increasingly they're monetizing our outrage. And I think, as people that I've observed experience an alternative, Mastodon, others that are out there, I think we all agree that Nostr is the way to go. Once you remove the outrage, it is experiential: I feel better, possibly at least not worse, as I have engaged with others on Nostr versus X versus Facebook versus others. And so, that is all to say,
I think part of the key is just giving people a sense of what that's like. And I think they can begin, each of us, to sort of rewire those receptors, those dopamine, you know, hits that we're accustomed to getting. But it will take some time.
[00:11:28] ODELL:
I mean, you're drilling down on basically this concept of healthy usage of technology. Yes. Which I would say, as a society, we're probably very deep into unhealthy usage of the tools. And, I mean, I see this firsthand in my own life. I see this across all different aspects of society right now. We have a term for that nowadays. It's called doomscrolling. It became so apparent that we have a term for it. AI psychosis. Yeah. Doomscrolling. Everyone does it. A lot of people do it. They know they're doing it. They continue to do it. But one aspect of this idea of digital health and healthy usage that I think is incredibly key for our society going forward is: all three of us are parents. Specifically, I mean, I think adults use it in very unhealthy ways, but the question is, like, how does that affect childhood development?
And for something like Nostr, that's an open protocol that's not controlled by anybody, how do you think, I mean, we'll start with Shawn again, how do you think about handling that issue? Like, how does society handle that going forward, with kids growing up with basically just a fire hose of information?
[00:12:57] Shawn Yeager:
Well, I am, here's my little guy right there, my almost-four-year-old. So I'm a dad to a young boy, and so I have a bit of time. But I'll just sort of maybe share an anecdote, which is that we, full credit to my wife, had given, earlier in his life, maybe an hour to two per morning of screen time so that, you know, she at home could have some space to do some things. It is remarkable, the change, and this will be obvious to those of you who've done it, but it was remarkable to me that in saying no, and ending that and having zero screen time, the change in our son was incredible. And I personally don't know of any better reference point in my life than to have observed that firsthand. So I can only imagine, with a young child given a device in their hand, and that's not a judgment for anyone who chooses to do that, but I just can't imagine the damage that that will do.
So I feel very passionate about our collective and, most of all, individual responsibility within our families
[00:14:09] ODELL:
to find better ways. So, I mean, right now we're seeing a lot of conversation about disenfranchised youth getting radicalized in Internet communities. It's become a very sensitive conversation. Some of the, quote, unquote, solutions that have been proposed involve restricting speech, restricting access. More KYC. Adding digital ID, adding age restrictions. I mean, we just saw Bluesky, I think, in two states just added age restrictions to their app. Derek, what is the most productive path forward? Because I think the key here is that that is actually a problem. Like, I do think disenfranchised youth are getting radicalized in niche Internet communities.
But when you're building out something like Nostr, an open protocol where you inherently can't age-restrict on a top-down level, how do you, like, what is the most productive path? How do we actually solve that in a healthy way?
[00:15:21] Derek Ross:
That's a very good question, and it's probably a very hard question. I think part of it goes back to what Shawn was alluding to: ultimately, parents should parent. If kids are having issues online, getting radicalized over certain content, and you don't want that to happen to your kid, then you need to restrict access to certain applications. Now, that doesn't mean completely take it away, because we know that kids today are very social and online, so you can still give them apps. But the second part of this is, we just need more user controls, and we need more apps across the Nostr ecosystem that maybe do focus on restricting and filtering that type of content. So maybe, because Nostr is wide open and you can do anything you want, maybe somebody builds a Nostr application that is more suitable for the youth, that restricts certain types of content. It's only bound to certain content-filtered relays, and you can't use anything else but that. Now the argument is, well, the kid can take the profile, the nsec, and just use another app. But if you're the parent, you do the parenting and you lock down access to certain applications; you only give them access to the parent-approved app. I mean, they're your kids. You should be able to say what apps they use. And the personal example of that is, I didn't let my kids use TikTok for a very long time.
And my kids are now 14 and 16 years old. They now use TikTok, but they wanted to use it years ago, when their friends were all using it, you know, 10, 12 years old. And I said, no, you're not using that app, I'm sorry. And they complained a lot, and I was a parent and said, well, I'm sorry, you're not using it. And I used my parental rights to restrict my kids' access to something I didn't want them on. Now they're older, sure, I let them do it. And the same would go for any Nostr app. I would restrict and block access, if I wanted to, because we have the tools to do that. But then, as I said, on the other side, we do need a Nostr client to step up and build a kid-filtered, kid-
[00:17:48] Shawn Yeager:
safe environment. Well, and I think, just quickly, the thing that's so powerful about this, in my strong promotion of Nostr, or whatever may come after, is the ability for individuals, for parents in this particular case, to be given the tools to make the choice. Yeah. I think that's the core. It should not come from X. It should not come from the government. It should come from the individuals closest to, and most invested in, that little human's health. And I think Nostr is a prime example of what an open protocol does with regard to giving us that power.
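A hypothetical sketch of the kid-safe client Derek describes: the app is pinned to a parent-approved list of content-filtered relays and refuses to connect anywhere else. The relay URLs are invented placeholders; no such client is named in the conversation.

```python
# A hypothetical sketch of a kid-safe Nostr client: hard-pinned to a
# parent-approved allowlist of content-filtered relays. URLs are made up.
APPROVED_RELAYS = {
    "wss://filtered-relay.example.com",  # hypothetical content-filtered relay
    "wss://school-relay.example.org",
}

def connect(relay_url: str) -> None:
    # Refuse any relay not on the parent-approved list.
    if relay_url not in APPROVED_RELAYS:
        raise PermissionError(f"{relay_url} is not on the parent-approved list")
    print(f"connecting to {relay_url} ...")  # a real client would open a websocket here

connect("wss://filtered-relay.example.com")   # allowed
# connect("wss://anything-goes.example.net")  # would raise PermissionError
```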
[00:18:22] ODELL:
Yeah. I think you give parents tools so that they can parent better. Absolutely. And have them take responsibility. And it's bigger than Nostr. Right? Because, like Absolutely. I mean, it's kind of bewildering that Apple doesn't have built into the iPhone or whatever, like, really granular controls for parents to choose how their kids are interacting with these things. I think you bring it down almost to the OS level. Right? Yeah. Like,
[00:18:52] Derek Ross:
because I'm a tech nerd, I know how to go in on my router and block access from my kids' devices Right. to certain websites. I'll say it's easy, but is it easy for everybody? Probably not. So we need easier tools for everybody to use. Yeah. I agree.
[00:19:07] ODELL:
I mean, guys, this has been a great conversation. We've been a little bit abstract, so just to bring it all back together and make it a little bit more actionable: for people here that have never used Nostr and maybe wanna play around and test it, I think, you know, the best way to learn is to just get your hands dirty and actually use the tools. I mean, Shawn, what would be your recommendation to someone who's
[00:19:33] Shawn Yeager:
interested in seeing what's being built out there? Yeah. I'll take just a brief moment of further abstraction and say, I think what's so powerful about Nostr and some of the technology that underlies it, and I'll steal someone else's analogy, a metaphor: if you were a medieval king and you needed to issue a directive throughout the kingdom, to your military, to someone else, as you would probably recall, you would have a signet ring. That signet ring would be heated and pressed into wax; it creates a seal. That letter is then delivered to Matt, the general. And my signet ring is my private key. It is difficult to mimic, difficult to forge, presumably hard to steal.
That's my piece of property that allows me to sign. The seal is the public key. And so, that is all to say, in these ways that have been created and recreated throughout time, Nostr gives you that ownership. Now, with that comes great responsibility. You own that key. You have that signet ring. And so, from that understanding that you can own your identity, you can own the ability to attribute your creation or publishing of content, it can be quite simple. So I think Primal is brilliant. Full disclaimer for Matt: Ten31, investor in Primal. Fantastic application. So primal.net, I think, is a great way to get started. I think it's one of the best consumer UXs.
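To ground the signet-ring analogy in what a Nostr client actually does, here is a minimal sketch of how a note is fingerprinted before signing, following the protocol's NIP-01 event serialization. The pubkey is a placeholder; a real client derives it from your secret key (nsec) and signs the resulting ID with a BIP-340 Schnorr signature.

```python
# A minimal sketch of computing a Nostr event ID, per NIP-01.
import hashlib
import json
import time

pubkey = "ab" * 32  # hypothetical 32-byte public key, hex-encoded

event = {
    "pubkey": pubkey,
    "created_at": int(time.time()),
    "kind": 1,        # kind 1 = short text note
    "tags": [],
    "content": "hello nostr",
}

# NIP-01: the event ID is the SHA-256 of this exact JSON serialization.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"],
     event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
event["id"] = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event["id"])  # this is what the private key (the signet ring) signs
```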
There are many others, depending on where you are on the spectrum, from "I just want it to work," Apple-esque style, to, you know, like us, we're nerds and wanna dig in. But I would say, in short, primal.net,
[00:21:11] ODELL:
take a look. Great recommendation. I think he handled that really well. Yeah. So while we have a little bit more time, just real quick, vibe coding, Nostr, AI, Bitcoin, that's where your focus is right now. Yes. Why is that powerful?
[00:21:29] Derek Ross:
Because Soapbox is building tools that allow people that are creators, or have their own community, to build an application. You can vibe-code it. You can build your own app for your own community, and because it's built on Nostr, you can own all that content. So instead of using Discord or Twitter or whatever for your community, you could use Shakespeare to build your own community app, customized how you've always wanted it to be, and you own it. You own all the source code. You own all the data. It's decentralized. You can do whatever you want with it, and nobody can take that away from you. Whereas if your Discord server gets taken down, because you're a streamer or a musician or an artist or something, well, you're screwed. You can't do anything. But if you use Soapbox tools and you build with Shakespeare, you can own every piece of the puzzle. Yeah. And the key there is you don't need
[00:22:25] ODELL:
closed API access. You don't need to, yeah, verify. You don't need to ask permission. You just do it. Yeah. You have the social graph. You have the identity layer. You have the comms protocol, all in Nostr, which is basically like an open API for the world. Yeah. And then on the payment side, you have Bitcoin, so that you don't have to, you know, get a Stripe API or something like that to integrate. No permission required. Just go do it. Yeah. You wanna build a website that accepts Bitcoin payments for your
[00:22:53] Derek Ross:
product that you're selling, or for your personal website or something. You don't need to know any code. You don't need to be a developer to know how to do it. You just have a conversation with AI, and you say, build me this website that does this thing, A, B, C, D, and a few minutes later, boom, it's done. And it's yours, and you can do whatever you want with it.
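To illustrate the "open API for the world" point, here is a minimal sketch of reading public notes straight from a relay with no account, API key, or permission. It assumes the third-party websockets package; the relay URL is just one public example.

```python
# A minimal sketch of pulling recent public notes from a Nostr relay.
# Assumes the third-party "websockets" package (pip install websockets).
import asyncio
import json

import websockets

async def fetch_notes(relay: str = "wss://relay.damus.io", limit: int = 5) -> None:
    async with websockets.connect(relay) as ws:
        # NIP-01 REQ: a subscription id plus a filter for recent kind-1 notes
        await ws.send(json.dumps(["REQ", "sub1", {"kinds": [1], "limit": limit}]))
        while True:
            msg = json.loads(await ws.recv())
            if msg[0] == "EVENT":
                print(msg[2]["content"][:80])
            elif msg[0] == "EOSE":  # end of stored events for this subscription
                break

asyncio.run(fetch_notes())
```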
[00:23:11] ODELL:
Love it. Can we have a huge round of applause for Derek and Shawn? Thank you, guys. Thank you.