Alex Gleason was one of the main architects behind Donald Trump's Truth Social. Now he focuses on the intersection of Nostr, AI, and Bitcoin. We explore his latest tool, Shakespeare, which enables anyone to easily vibe code an app in their browser. I vibe code my first app live on air.
Alex on Nostr: https://primal.net/p/nprofile1qqsqgc0uhmxycvm5gwvn944c7yfxnnxm0nyh8tt62zhrvtd3xkj8fhggpt7fy
Shakespeare: https://shakespeare.diy/
Soapbox Tools: https://soapbox.pub/tools
The app I vibed live: https://followstream-3slf.shakespeare.to/
EPISODE: 174
BLOCK: 910195
PRICE: 853 sats per dollar
(00:00:01) Treasury Secretary Bessent Intro
(00:01:29) Happy Bitcoin Friday
(00:05:12) AI and Freedom Online
(00:07:04) Shakespeare: Vibe Coding Made Simple
(00:08:03) Concerns About Big AI
(00:15:05) Self Hosting AI and Technical Challenges
(00:22:24) Energy and AI Development
(00:28:14) Building Personalized Experiences with AI
(00:38:02) Nostr's Future and Mainstream Adoption
(00:45:02) Decentralized Hosting and Shakespeare's Future
(00:54:01) Collaborative Development with Nostr Git
(01:02:24) Open Source Renaissance and Future Prospects
Video: https://primal.net/e/nevent1qqstzds6pmkpaser62kme8dk74r4ea4ae3hv9fr2wur0kpc3yyws96gx2pa59
more info on the show: https://citadeldispatch.com
learn more about me: https://odell.xyz
You're talking about a number way above 750,000,000,000. What's the story on gold?
[00:00:09] Unknown:
Look, we've got this large gold holding. I doubt we're gonna revalue it, but we are going to keep it there, again, as a store of value for the American people. We've also started, to get into the twenty-first century, a Bitcoin strategic reserve. We're not gonna be buying that, but we are going to use confiscated assets and continue to build that up. We're gonna stop selling that. You know, I believe that Bitcoin reserve at today's prices is somewhere between 15 and 20 billion dollars.
[00:00:46] Unknown:
And and, mister secretary, all of these ideas to monetize the assets.
[00:01:29] ODELL:
Happy Bitcoin Friday, freaks. It's your host, Odell, here for another Citadel Dispatch, the interactive live show focused on actionable Bitcoin and Freedom Tech discussion. That intro clip was Treasury Secretary Bessent. I think it was a Fox News interview yesterday that sent a bunch of people into a tizzy, because he said he wasn't gonna buy Bitcoin, or the government wasn't gonna buy Bitcoin, and that they were gonna confiscate Bitcoin to add to the strategic Bitcoin reserve. The second part was not really news, but really sucks to hear it out loud.
Probably will always suck to hear it out loud. He did say that they have about 125,000. Kind of a questionable number considering I'm pretty sure they still have 100,000 Bitcoin that they recovered from the Bitfinex hack, which should be returned to Bitfinex and Tether. So they might have way less Bitcoin than they think, or they might be playing a pretty fucked up game and not giving that 100,000 Bitcoin back to Bitfinex. Either way, it seems like Bessent clarified his comments later and said that they still intend to acquire Bitcoin through budget neutral ways, whatever that means. This is completely unrelated to the show, but it felt like an important time stamp to put here. I would just remind people.
I actually it actually is kind of relevant to our show. We have Alex Gleason joining us again, for we're gonna do updates every two to three months. How's it going, Alex?
[00:03:08] Alex Gleason:
Hello. Hello.
[00:03:09] ODELL:
I did your intro a little bit early because, you know, as you might remember from our last conversation with Alex, which, like I said, happened about two months ago. I think that was episode one sixty four. Give that a listen if you haven't yet. He helped build Truth Social, and the public company that is Truth Social now has 15,000 Bitcoin. So Trump has stacked his bags. Him and his cronies have a shit ton of Bitcoin. I think that's really the only bullet point you need in terms of wondering whether or not they're gonna try and pump the living daylights out of Bitcoin. I mean, the self interest is clearly there.
But I skipped a step. Freaks, as always, Dispatch is brought to us by viewers like you. We have no ads or sponsors. Thank you to everyone who supports the show by donating Bitcoin. The most fun way to do that is through Nostr, either in our live chat or through your favorite Nostr app. If you've never used Nostr, I suggest you try and download Primal. Download the Primal app in your favorite app store. Give it a spin. Give your commentary. Interact with the show. It's a great way to participate, and I just love kicking it with you freaks. Another great way to support the show is Fountain Podcasts or any other Podcasting 2.0 app.
The largest boost from last week was Nostra Kaye with 21,000 sats saying, highly speculative and degen, but obviously enjoyed. Stay humble and stack sats. Great advice. And our highest zap on Nostr was Mav 21 with 10,031 sats. Thank you, Mav 21. Anyway, guys, all links to the show are at citadeldispatch.com. Everything relevant is there. Share with your friends and family. It is helpful. Alex, good to have you back.
[00:05:06] Alex Gleason:
Happy to be here.
[00:05:10] ODELL:
How's everything been?
[00:05:12] Alex Gleason:
It's been good. So when we last talked, you know, we were talking about MKStack, and we were talking about people vibe coding stuff on Nostr and just, like, the potential that can have for the future. I think it's like the frontier of freedom online right now is AI. And so I've been building tools to make it easier for people to use AI to build their own sort of, like, you know, worlds online, to be able to have that freedom. But I've also been feeling concerned about big AI and about how we can fight against that.
And so these are kind of some of the things I've been working on and thinking about. And most notably, since our last talk, I have launched Shakespeare, and that's shakespeare.diy. And that is a vibe coding tool that people can use to develop Nostr clients. I'm listening. Yep. And so that's been really exciting. We've had people building, you know, just dozens of different Nostr clients with this tool. And what's great about it is you don't have to have any technical knowledge or experience. It's similar to other solutions like Lovable, VeoDuro. You can kinda just tell it what you want, and then it will build it. And so I have a v1 of this right now, which is what is live, and I'm currently working on a v2 of Shakespeare that is gonna give people even more freedom and control, because the new version is gonna run entirely in the browser.
[00:07:04] ODELL:
That's badass.
[00:07:07] Alex Gleason:
Yep. So
[00:07:09] ODELL:
Yeah. Sorry. I'm having technical difficulties over here that I'm trying to, okay, troubleshoot live on air. I think I got us to a decent spot. Let's see if this works. I lost my live chat, freaks. It's all about the live chat on Dispatch. Inspect. Delete. Okay. I guess I gotta vibe code a new live chat setup. I wanna talk about the concerns about big AI real quick. Sure. Because this is something that I'm intimately concerned with. Look at that. There we go. We got our live chat back. So it's an interesting phenomenon, twofold. So I'll start the discussion here.
First off, it's interesting that, in The US, we probably are in the lead in terms of models and tools, but it's all proprietary and closed source, except that recently OpenAI finally released an open source model that's decent, but not nearly as good as the proprietary closed models. And then China, on the other hand, is, like, just aping into open source. They're kinda also playing catch up, but, like, all the top open source models are really out of China. Meta had an open source model, Llama, that's just complete shit. It's not really competitive. The one caveat we have on the American side is, you know, Block came out with Goose, but that's on, like, the agent side. It's more of, like, a front end. It's not the actual model itself.
And then to top it all off, like, there's a lot of speculation on, like, okay, what will LLMs be capable of in the future? What kind of new functionality will they provide us? But in the short term, the single thing that they're very, very good at is analyzing, like, mass hoards of harvested data. It's, like, perfect for the surveillance state. It levels up their surveillance state almost instantly without any improvements really needing to be made. So how do you think about that? Just like the general, I mean, I kinda gave you two different things, but
[00:09:46] Alex Gleason:
Oh, I have, like, ten different things I wanna say. Hopefully, I can keep it all in my head and say it all. Go for it. On OpenAI's open source models, I noticed that they don't dogfood them at all. You can't even use their open source models through their web UI, which is really annoying. Like, it's almost just like a, you know, just like a bone that they're throwing to somebody, maybe to Trump. Because in Trump's AI for America document, it actually listed open source as one of the main ways to make American AI win. And I thought that was really interesting, rare W.
But so maybe it's, like, just trying to appease that. In terms of China leading open source models, yes. And also, I think that this kind of proves, like, how open source can be a thing that gets your foot in the door to make it something that's relevant. Right? Like, they don't have the best models, but they have open source models. So, like, people are seeing what they're doing. Right? And I'm sure that in China, there's all kinds of organizations making models that are not open source. But we only hear about the ones that are open source because open source is what allows them to win.
[00:11:06] ODELL:
Right. And this kind of overlaps. Like, I would never use it. I would never use a Chinese closed source model. I mean, I'm probably also one of the few that, I don't use the proprietary US models either, but I definitely wouldn't use a Chinese closed model. And then I end up using the Chinese open models because they're the best open models that are available.
[00:11:25] Alex Gleason:
Yeah. And so I've been experimenting with some of those. I tried Kimi K2 for, like, a week, and I was really impressed with it at first, but then I ended up going back to Claude because it just wasn't there. And I've been researching, like, how do you run this thing? And it's like, okay, well, you need a million dollars worth of graphics cards. And that's just the initial cost of the hardware. That's not even the continual running cost of it. So, like, that's a barrier. And then I'm saying, like, so can you tell me about Tiananmen Square? And it's like, Tiananmen Square? I don't know what you're talking about. And so it just felt to me like, even though it's open source, it violates the principles of open source by being something that's literally impossible for normal people to run while also having censorship built in. But then again, the American models also have censorship built in on all kinds of different things too.
And so okay. So Kimi K2, it wasn't really it. But then recently, Z.ai came out with GLM. And this is the one that's finally been the one to push me over onto the Chinese AI side. And now I'm, like, I'm gonna be speaking Mandarin in a year. I'm gonna be, like, in Taiwan. And I feel like China has entered into my life, and I'm really curious and interested in thinking about China a lot. Probably I'm thinking about China more now than I was when I was working with Donald Trump. Well
[00:12:53] ODELL:
well, let's just pull on that thread for a second, because it's not really censorship. Censorship's the wrong word. We almost need a new word. But when you're using these models, is it, like, the way they're trained, or what?
[00:13:09] Alex Gleason:
I think it kind of is censorship because they they train it to specifically not respond to certain requests. Or respond a certain way. Right. And and I think, like, in Tiananmen Square, there's training data that instructs it. I mean, it it's it's self censorship in a way in the same way that you you would tell, like, a person that they're not really censoring us. Yeah. They're not censoring you. Right. It's true. They're blocking our access to the information. You're right.
[00:13:37] ODELL:
But it's just so, I mean, Tiananmen Square is a good example, and I guess I'm sorry for the Chinese listeners we have because this episode is probably just gonna get, like, blackballed over there right now. But it's a very extreme example. The crazier thing to me is, like, the more nuanced examples that are happening in the West that are harder to pinpoint. And sometimes it's probably even unintentional. Right? It's like, where are they getting the training data from? Like, they could get the training data from Wikipedia, and Wikipedia could have a slant, or a specific Wikipedia entry could have a slant, and then the model has a slant.
Yep. But that's pretty dark, because it's so hard to see. It's, like, not even book burning. It's, like, the book never existed.
[00:14:29] Alex Gleason:
I you know what I mean? It's, like, kind of a weird thing to wrap your head around. Yeah. It's funny because if China would just give up on this one thing and just be like, yeah. Yeah. Tiananmen Square happened, then, like, it would be so much easier to just ignore all of the little things that Right. That they do that are still That's the western way. Exactly.
[00:14:46] ODELL:
Yeah. Yeah. I wonder, like, I mean, to me, like, there's something there. So I first went down, like... and I don't really use... I know I promised you that I was gonna start vibe coding by the time I had you back on, but, fortunately, we've had you on sooner. And second of all, I haven't actually held up my side of the bargain. So I don't really use these tools for that. I use them more, like, as chatbots and, you know, virtual assistant type of scenarios. And I mostly use Maple. So I first started by self hosting.
Right? And I would run these models, and I'd have to run, like, the really gimped models because my computer is not $5,000. You know? And then I switched to Maple because they give me pretty strong privacy and security guarantees. It's, like, kind of a similar model to Signal. But I can use, like, the latest and greatest open models. And the cool part there with Maple is, like, I can pick which model I use. Right? It's got, like, a drop down of, like, seven of the top ones or whatever that they support. And I think there's something there, and we're not quite there yet, but there's something there about kind of running them in that environment. It's harder to do it in a self hosted environment, but in that environment, kind of running them simultaneously to check each other's work might be, it's not really a solution, but it might be the mitigation to that kind of foundational level bias or slant or censorship, whatever you wanna call it.
[00:16:24] Alex Gleason:
Yeah. I've been just trying to figure out how to self host, because all of these services people are using are through some, you know, API that's run by somebody else. And I really just wanna host it myself. Like, you know, I feel like ten years ago, the fight was Twitter, and we need to self host Twitter. And so, like, Mastodon was sort of the biggest first introduction to that, which is when I got started in this movement. And now there's just so many ways to self host Twitter that it's become this saturated market. And now with Shakespeare, you could just, like, vibe code a kind 1 client Right. However you want. So now it feels like the big impossible thing that I'm reaching for is self hosting AI.
And how can we do this in a way where people can actually have control over this revolutionary and potentially dangerous technology, dangerous if normal people don't have the means to control it themselves.
[00:17:29] ODELL:
So how do you, I mean, how do you think that looks, besides waiting ten years for, like, hardware costs to be lower? Like, if the requirement is a $5,000 computer, and right now it's much higher than a $5,000 computer. But even just, I mean, even to be, like, in relative competency, like, to even be close to the completely, you know, not only hosted, but, like, controlled, KYC, walled garden type of stuff, you need to have a very expensive piece of hardware.
[00:18:06] Alex Gleason:
Yes. Exactly. And so, I mean, there are advancements that people are making. In particular, Chinese open source models are making these advancements. So all of the main, like, big tech, big AI companies are using dense transformer models, which is where, basically, whenever you ask it a question, it's responding to you with all of the world's information in that one response, and it has to activate all of the world's information, which requires electricity, which comes with a cost. Right. But these open source Chinese AI models are using a mixture-of-experts approach, in which only part of the AI model needs to be activated to respond to your query. So if you're asking it a question about football, then maybe it only needs the sports expert to respond to that query. And the result is that it can be a lot cheaper and use a lot less electricity to respond to that particular query, because only part of the model is activated.
And big AI companies are afraid to do this approach because they are all racing for AGI. And it's harder to have a model that works really well when you have to balance this mixture-of-experts thing, as opposed to just having it respond with all the world's information on each request. So this is just an example of, like, ways that the model itself can be structured differently so that it can become more accessible to people. But the accessible version of this, after all of this is done right, still requires $1,000,000 worth of GPU hardware.
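To make the mixture-of-experts idea concrete, here is a toy sketch of just the routing step, not any real model's architecture: a gate picks one expert per query, so the rest of the network stays idle for that request. The expert names and the keyword-based gate below are made up purely for illustration.

```typescript
// Toy mixture-of-experts routing sketch. Real models learn the gating function
// during training and route between many transformer sub-networks; this only
// illustrates why just part of the model is activated for a given query.
type Expert = { name: string; respond: (query: string) => string };

const experts: Expert[] = [
  { name: "sports", respond: (q) => `sports expert answering: ${q}` },
  { name: "code", respond: (q) => `code expert answering: ${q}` },
  { name: "general", respond: (q) => `general expert answering: ${q}` },
];

// Stand-in gate: score each expert by naive keyword match and take the top one.
function route(query: string): Expert {
  const scored = experts
    .map((e) => ({ e, score: query.toLowerCase().includes(e.name) ? 1 : 0 }))
    .sort((a, b) => b.score - a.score);
  return scored[0].e;
}

// Only the chosen expert runs, which is where the compute and electricity savings come from.
const question = "who won the sports game last night";
console.log(route(question).respond(question));
```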
So so, like, this is this is step one here and, like, how do we get to a step where even $5,000
[00:19:56] ODELL:
is good enough? I don't know. And it's even worse than, like, it requires $1,000,000 of the GPUs, but it to me I mean, I look at everything from a Bitcoin perspective. It reminds me of ASICs. Like, you pay a million dollars of the GPUs, and then in, like, four years, they're worth, like, a 100 k. Like, it's you're just, like, completely burning money. It's not like it's like you're investing for your family or your compute community and you're, like, good for a decade.
[00:20:22] Alex Gleason:
Yeah. It's it's unfortunate. But my I my hope is that the models are gonna get cheaper. Another thing also that I've I was reading about this morning is that in in China, they're they have, like, so much energy that they don't know what to do with. Apparently Yeah. They they are running on 80% reserve at all times or higher. And in The US, like, in Texas here, twice a year, the power grid fails and we lose power for, like, a week because it rained a little bit. And so it's also cheaper to just consume this electricity in China because they have so much energy availability there.
[00:21:07] ODELL:
Yeah. I mean, that's a battle that Bitcoiners have been fighting for a while. We need more energy production. It leads to human flourishing. China's way ahead of us in that queue. And the stat is, I might be wrong, but it's something like China's adding, like, a full America's worth of energy capacity every year or something insane like that. Yeah. Crazy. I mean, it's interesting. Right? Because, like, on our side in America, we're constrained on the energy side. And that should probably change now that big tech is finally not concern trolling Bitcoiners on energy usage and instead encouraging more energy production. So we're, like, finally aligned with, like, the Mag 7, for, like, the first time in over a decade, on that point. I mean, they were all the proof of stake maxis for a while. So that should hopefully at least start trending in a better direction.
But so, like, we're energy restricted, and then they're chip restricted because of, like, the export controls and whatnot. So on both sides, we're kinda operating under different constraints. And so as a result, the strategy is kind of modeled basically off of those constraints. Right?
[00:22:24] Alex Gleason:
Right. And this is why, ironically, the models China's developing are more efficient than our models. Even though they have access to more energy, their chip restrictions are preventing them from being able to have enough GPU power to run these bigger models efficiently. So they're kind of being forced to do these mixture-of-experts things, but then that ends up being better for us because now, like, I mean, a million dollars is not, you know, inaccessible here. It's just a much lower barrier than it would be to run GPT-5.
[00:23:01] ODELL:
Right. Okay. All this is fascinating. So let's talk about Shakespeare. I see someone is trying to use it and says that their Lightning payment is failing. So you might wanna check your Lightning node. I don't know how you're processing payments.
[00:23:17] Alex Gleason:
It's through Alby.
[00:23:19] ODELL:
Well, that there's your issue.
[00:23:21] Alex Gleason:
Alright. Maybe someone in the community can help me out.
[00:23:27] ODELL:
Yeah. We'll figure... I mean, you're running your own node? Are you self hosting your own node? It's Alby Hub. So, technically, yes, but I'm not really doing anything. Right. But so it is your own node with your own channels?
[00:23:39] Alex Gleason:
Yeah. It is.
[00:23:40] ODELL:
Okay. For example, here's... A bigger one? Yeah. Here's a call to action. Either look up his node, or I see Ditto in the comments, either look up his node or I'm gonna put his node in the show notes. Let's open a bunch of liquidity to his node so he doesn't have payment failures anymore. You understand, like, the concept of Lightning. Right? Like, you need to have inbound liquidity and outbound liquidity. Inbound's much harder to get. But if you're a good product or service, people generally will open inbound to you. Alby Hub, but it's for managing your own node. So at the end of the day, your issue is, I thought at first he meant he was using Alby's old custodial wallet, which they've been phasing out. It might be completely phased out. With Alby Hub, at the end of the day, you're managing your own Lightning node. So anyone who's managed a Lightning node knows it's a mental burden, and inbound liquidity can be a pain in the ass.
And that's clearly where we're at. But, anyway, the good part is in this thing.
[00:24:48] Alex Gleason:
Because, you know, I have to pay the AI cost from the upstream providers in US dollars, but the majority of people using Shakespeare are paying me in Lightning. And so all of you people who are using Shakespeare and paying with Lightning, you're converting my USD into Bitcoin for me. So thank you.
[00:25:08] ODELL:
That's a great way to get, an okay way to see Bitcoin. There you go. It's a great way to stack sats. I've always said that the way we see merchant adoption happen is actually from that direction. It's merchants who want Bitcoin rather than users that want to spend Bitcoin. You kinda can't force it. But if the merchant wants Bitcoin, then it makes a lot more sense. Okay. So let's talk about the rise of the vlog. First of all, do you know how to pronounce v l o g?
[00:25:38] Alex Gleason:
I believe it's vlog.
[00:25:40] ODELL:
Yeah. But that doesn't make any sense. It should be video log. Right? So it's vlog. So I'm like you over this on the Internet? Yeah. Suppose I've I've Sorry. I I learned how the pronunciation happened on my other show yesterday. So I've been I've been getting bullied. Do you know what the b in blog stands for?
[00:26:00] Alex Gleason:
I didn't know that it was an acronym.
[00:26:03] ODELL:
The b in blog stands for web. It's a web log. Oh. So if you go down that route, it's a soft b, web blog, and then video log is a hard v, so it should be v log.
[00:26:19] Alex Gleason:
Video. Video log. I don't know. I'm wrong. So blog should be blog,
[00:26:25] ODELL:
basically. Web. Blog.
[00:26:28] Alex Gleason:
Yeah. Blog.
[00:26:29] ODELL:
Blog. Yeah. I don't know. Well, anyway, the rise of the vlogs. Anyway, this is our our live vlog or vlog. And then what's the difference between a vlog and TikTok? Isn't TikTok just like the modern vlog?
[00:26:46] Alex Gleason:
I I hadn't I had not really thought about that before, but I guess so. Maybe I mean, I'm sorry. It's you it has to be, like, you have to be describing something. Right? Like like, it has to be, like, autobiographical.
[00:26:59] ODELL:
I don't know. It doesn't matter. I thought you were gonna ask me about Nostr versus Noster again. But we can disagree. We can agree to disagree on that one, because you did pronounce it wrong. Okay. So Shakespeare, what does that look like right now? What has changed? This got released since our last episode. Right? It did.
[00:27:21] Alex Gleason:
Yeah. So this is the way that... previously, you had to use a terminal to use MKStack. Now Right. People can just go on the web. You can do it on your phone. And it looks like, you know, a ChatGPT UI. And you can say, build me a site like Twitter on Nostr, or build me a site like Instagram on Nostr. Or ideally, you'd say something like, build me a Nostr community for my Pokemon group, my Pokemon club, and then add a Pokedex to it. Or, like, build me a Nostr community for farmers and add a marketplace to be able to buy and sell tractors.
And so, like, it's just gonna allow us to create these extremely personalized, customized experiences with AI. And it's kinda just waiting for artists and, you know, product people to come in and just write the right terms to make Nostr win.
[00:28:30] ODELL:
Oh, this is awesome. So, like, everything can just happen right in browser. Like, I don't need to download anything, or I just pay you and just in browser, I just start vibing?
[00:28:41] Alex Gleason:
Yep.
[00:28:43] ODELL:
And so then what's going on in the back end?
[00:28:46] Alex Gleason:
So right now, I have a server rack in my living room, and that's where all of the files are stored. And that thing is talking to the AI behind the scenes. It's just connecting to, you know, some upstream AI provider. I've been switching it between Claude and GLM and, you know, testing different providers. But so your browser is going over Nostr relays to communicate with this service provider that is then doing all the AI stuff and managing your files. But this is act one, and I'm currently developing act two, which changes the architecture so that the files will actually be in your browser instead of on my server. I'm, like, working on decloudifying it right now, if that makes sense.
[00:29:39] ODELL:
Got it. So what's my NSP? My Nostr service provider?
[00:29:46] Alex Gleason:
Yes. I'm logged
[00:29:49] ODELL:
I'm logged in right now.
[00:29:51] Alex Gleason:
Cool. So now you should be able to click on the NSP drop down and add funds to it, and then you should be able to prompt it.
[00:30:00] ODELL:
Just blindly authorizing things for my extension while I'm live on the show is probably not the best idea.
[00:30:08] Alex Gleason:
And it's, it'll just start running once you give it a prompt, and it usually takes about ten minutes, maybe fifteen, depending on the complexity of the prompts to finish this because it's kind of like you're just telling a developer to do something, and it's the fastest possible developer, you know, known to humankind.
[00:30:26] ODELL:
Okay. So I see in my NSPs, there's Gemini Flash Playground, there's MKStack, there's Soapbox Playground, and there's MKStack dev.
[00:30:39] Alex Gleason:
So if you the Gemini Flash Playground is a free one that you can use to test things, but the experience is what you would expect from something that's free.
[00:30:49] ODELL:
Okay. So the MKStack is the best one to use? Yes. And that one requires
[00:30:54] Alex Gleason:
you to add funds to it before you can use it.
[00:30:59] ODELL:
Open payment page. It brings me to Stripe.
[00:31:02] Alex Gleason:
There's a selector box there, which you can click to do Lightning. And I've been thinking about switching the order of the tabs.
[00:31:11] ODELL:
I see. So the the Bitcoin one is actually the default tab. It'd be, like, the lamest thing ever if I paid with credit card. Continue to pay. Well,
[00:31:20] Alex Gleason:
yeah. I've enabled, Fiat over Nostr, unfortunately or fortunately.
[00:31:26] ODELL:
Let's see. You got I mean, look, at the end of the day, you gotta accept Fiat. I like the idea of,
[00:31:32] Alex Gleason:
Kinda my thinking.
[00:31:33] ODELL:
You should I mean, first of all, prioritize the Bitcoin side, but give people a discount if they pay in Bitcoin.
[00:31:40] Alex Gleason:
I like that.
[00:31:42] ODELL:
Alright. You don't have liquidity. Right? So I can't... someone says they're... Ditto says they're working on it. And that's you. Right? Is that your team?
[00:31:52] Alex Gleason:
Yeah. It is.
[00:31:54] ODELL:
Or are you just in the live chat typing? I guess that's your team. Right? That's my team. Okay. I can't pay for it. But so presumably, so what, you're connecting to some kind of API provider for the actual LLM on the back end. Right? And so is that, Yes. Exactly.
[00:32:15] Alex Gleason:
So it's configurable. Something that's really cool is that most of the AI providers now have just standardized on OpenAI's API, so you can just drop-in replace them. And so, like, two days ago, it was Claude. Right now, it's GLM. It's Z.ai.
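As a rough illustration of that drop-in compatibility, the stock OpenAI SDK can be pointed at any provider that speaks the same chat completions API just by changing the base URL and model name. The environment variable names and the default model string below are placeholders, not Shakespeare's actual configuration.

```typescript
// Swapping AI providers when they all expose an OpenAI-compatible API:
// only baseURL and model change, the calling code stays the same.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AI_BASE_URL,        // the provider's OpenAI-compatible endpoint (assumed env var)
  apiKey: process.env.AI_API_KEY ?? "",    // provider API key (assumed env var)
});

async function ask(prompt: string): Promise<string> {
  const res = await client.chat.completions.create({
    model: process.env.AI_MODEL ?? "glm-4.5", // hypothetical default model name
    messages: [{ role: "user", content: prompt }],
  });
  return res.choices[0].message.content ?? "";
}

ask("Summarize NIP-68 in one sentence.").then(console.log);
```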
[00:32:38] ODELL:
Make me a Nostr app that just shows images.
[00:32:45] Alex Gleason:
Do I have to be, like, more descriptive than that? Nope. Let's see what happens. Because what you'll see is it actually knows to read the NIPs. And the first thing it'll do is it will check the NIPs to see if there are any that could pertain to your request. And then if there are, it will read those specific NIPs.
[00:33:04] ODELL:
Interesting.
[00:33:06] Alex Gleason:
And you mean, we should be able to see it doing it.
[00:33:08] ODELL:
Yeah. I'm watching it. Well, I'm in the free one because you won't take my money right now.
[00:33:13] Alex Gleason:
That's okay. It
[00:33:15] ODELL:
should mostly But, yeah, it's gone. Well, let's see if, replace screen sharing window.
[00:33:29] Alex Gleason:
Okay. So the first thing it did was it checked the project structure, right, at the very top? Okay. Yep. And so now it's looking at the package.json.
[00:33:43] ODELL:
Dude, this is pretty fucking awesome.
[00:33:47] Alex Gleason:
This one? Alright. I guess it decided not to read the NIPs. The thing about Flash, Gemini Flash, is that it has been optimized to do things quickly and to ignore instructions.
[00:34:01] ODELL:
Well, that's not good. Yeah. Why does this say $999? Is that just the way you hacked it to be free? Yes. And does that have a way to work with it? Oh, okay. Now it's checking the NIPs. It is reading the NIPs. Good. Alright. And so this is saying it's 21% along the progress right now?
[00:34:21] Alex Gleason:
That's the context window. And so each AI model has a limit to the size of the chat history before it can't continue in that chat anymore. And so once that hits, it's kind of like a reverse health bar, I've heard it described as, where once it hits 100, then, like, this robot must die.
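For what it's worth, the math behind that bar is just the share of the context window already consumed. A tiny sketch, with the token counts below picked only to illustrate; real clients get them from the provider's usage metadata or a tokenizer.

```typescript
// The "reverse health bar": percentage of the model's context window already
// used by the chat history. At 100%, the conversation has to start over.
function contextUsedPercent(tokensInHistory: number, contextLimit: number): number {
  return Math.min(100, Math.round((tokensInHistory / contextLimit) * 100));
}

// e.g. roughly 27,000 tokens of history against a 128k-token window is about 21%
console.log(contextUsedPercent(27_000, 128_000)); // 21
```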
[00:34:45] ODELL:
Got it. So it's actually not a progress bar. It is just telling me Hopefully,
[00:34:51] Alex Gleason:
it doesn't hit a 100% because if it does, then it's you're gonna have to start over.
[00:34:57] ODELL:
Well, I gave it, like, the simplest fucking prompt ever. Yep.
[00:35:02] Alex Gleason:
So which NIPs did it read?
[00:35:05] ODELL:
NIP-68.
[00:35:08] Alex Gleason:
And then you should be able to click and see. Picture-first feeds. Okay. Great. So it's gonna be, like, Instagram.
[00:35:16] ODELL:
View full NIP document.
[00:35:19] Alex Gleason:
Yeah. That's the, I forget. What's the name of Pablo's?
[00:35:24] ODELL:
Olas.
[00:35:25] Alex Gleason:
Yeah. Olas. It should be Olas compatible.
[00:35:28] ODELL:
Olas is NIP-68?
[00:35:30] Alex Gleason:
I believe so.
[00:35:35] ODELL:
This is pretty cool.
[00:35:36] Alex Gleason:
Yeah. And so this could take ten to fifteen minutes probably to finish, and then once it does, it'll pop up
[00:35:45] ODELL:
Okay. Your team just messaged me in the in the chat. They just gave me credits because they're trying to figure out the lightning node. Oh, awesome. It's not loading now. I had this happen to me before. Okay. Let's stop screen. Stop screen sharing. Have you seen that error?
[00:36:09] Alex Gleason:
Where it goes white? When you refresh? It's working it's working on my end.
[00:36:19] ODELL:
Well, I'm gonna show well, it's only when I it happened to me when I clicked refresh.
[00:36:24] Alex Gleason:
Yeah. I
[00:36:26] ODELL:
I'm gonna show... Just, like, login with extension is so easy. So I wanted to, okay. I'm back. I'm back. MKStack. Nice. I got $100. $100 in there. Great. Okay. What should my prompt be? Build me an... do I have to say a Nostr web app?
[00:36:45] Alex Gleason:
No. Because it already knows that. So It's always a web app. Right? Yeah. It doesn't hurt.
[00:36:52] ODELL:
That shows full screen images as they are published only from people I follow. Does that work?
[00:37:13] Alex Gleason:
Yeah.
[00:37:16] ODELL:
Okay.
[00:37:19] Alex Gleason:
Send vibe. I've been doing a lot weirder stuff, like, you know, build me a Nostr site where I can trade Pokemon, and, like, build me a Nostr badges site and things like that. Do people still use
[00:37:33] ODELL:
Nostr badges? Is that a thing?
[00:37:36] Alex Gleason:
I really like the idea of it, but I think that it's one of those things that's, like, well, it's kind of not a priority for a lot of projects. But if you can just vibe code it in five minutes, then why not? Right.
[00:37:53] ODELL:
Fair enough. I mean, I like the concept of badges. So do you think that, like, people are just gonna vibe their own apps and that's how they're gonna interact with Nostr?
[00:38:03] Alex Gleason:
I do. I mean, I kinda feel like Nostr is at this turning point right now, and I think that, like, people just aren't buying what we're selling. You know? Like, the world just doesn't want it. I want it. We want it. Our community wants it. But in terms of mainstream adoption, like, something's gonna have to happen. And I'm hopeful that if we just wait long enough, then some major world event would occur that would cause people to flood to it because of, like, you know, some big censorship issue. It's kinda happening right now, right, with, like,
[00:38:49] ODELL:
the Online Safety Act or whatever out of The UK? Yeah.
[00:38:53] Alex Gleason:
But it just feels like we've kind of passed that point. Like, we've already had these major events that have happened, and we've already had mass migrations. And I'm not sure that the same type of thing is gonna happen in the same way again. And so I've been thinking about, okay, maybe I need to change my point of view here. Instead of being this sort of, like, war hero type of person that I've perceived myself as being before, where I'm, like, going in and it's this big political battle, maybe I need to position myself more as a product person who is building something that is actually interesting that people wanna use in and of itself.
And then, like, and the great thing is that this only really works because of Nostr. So
[00:39:50] ODELL:
Right.
[00:39:51] Alex Gleason:
It's not like Noister is just a side effect, although it in some ways, it is. But in other solutions, people are struggling right now to get sign in to work on their Vibe coded apps because because they need, like, a Twitter API and shit. Right? Right. They like, or like email. Right? Like, you need to you need a back end, basically. And that's something that AI tools are struggle to build is a full stack application with the back ends. And the people who are building those things, I feel they are fighting hard against a strong current.
[00:40:31] ODELL:
And it's, like, super insecure. Right? Like, they had, like, that app where, like, the women were rating the men or whatever, and then, like, all the KYC docs were just in, like, an open Firebase folder or whatever. A great example.
[00:40:46] Alex Gleason:
But with Nostr, like, Nostr flips the authentication, and it flips, like, where the protected resource actually is. So authorization happens from you as the user rather than, like, from the server. And the servers are permissionless. They're relays, as long as you can identify yourself using your key. And so because of the design of Nostr, it allows us to build vibe coded apps where sign in just works the first time. And it's not very hard. It's easy.
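For context, this is roughly what "sign in just works" looks like in a vibe coded client: NIP-07 browser extensions (Alby, nos2x, and friends) expose a window.nostr signer, so the app only ever sees a public key and signed events, never private key material. A minimal sketch, with the window.nostr type declaration written out by hand here.

```typescript
// NIP-07 sign-in sketch: the extension holds the key; the app just asks for a
// pubkey and, later, for signatures on events it wants to publish.
export {};

declare global {
  interface Window {
    nostr?: {
      getPublicKey(): Promise<string>;
      signEvent(event: {
        kind: number;
        created_at: number;
        tags: string[][];
        content: string;
      }): Promise<{ id: string; pubkey: string; sig: string; [k: string]: unknown }>;
    };
  }
}

async function signIn(): Promise<string> {
  if (!window.nostr) throw new Error("No NIP-07 signer found");
  // "Logging in" is just proving you control a pubkey; no server account needed.
  return window.nostr.getPublicKey();
}

async function publishNote(content: string) {
  // kind 1 is a plain short text note; the extension prompts the user to sign,
  // and the returned signed event can be sent to any relay.
  return window.nostr!.signEvent({
    kind: 1,
    created_at: Math.floor(Date.now() / 1000),
    tags: [],
    content,
  });
}
```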
[00:41:25] ODELL:
So And then you get the social graph. You don't need any APIs. Exactly. I mean, and then you mix Bitcoin into that puzzle, and then all of a sudden you have payments. Yep. So you don't have to worry about integrating Stripe or whatever and everything that that entails. So it just makes... so, like, maybe that is the biggest selling point of Nostr. Maybe that is the actual Lindy selling point of Nostr, and not censorship resistance. The censorship resistance part is just a side effect. It's just being able to vibe without permission.
[00:42:00] Alex Gleason:
That's kind of what I've been thinking is is, like Oh, I think it's done.
[00:42:05] ODELL:
Sorry. Continue.
[00:42:06] Alex Gleason:
This is the future of technology. And so if we can get in on on that, then, you know, it's gonna matter.
[00:42:20] ODELL:
Is this my app on the left on the right? Yes. I did. No. It's two x isn't gonna show my private key up here. Right? Oh, you can't even see it. Nice. Let's see. Oh, okay. Let's see if it worked.
[00:42:39] Alex Gleason:
Oh, alright. I was a little bit worried at first. That's pretty cool.
[00:42:48] ODELL:
That's dope. Damn. You made it a lot easier than it was before. Whew.
[00:42:58] Alex Gleason:
Shout out to you, Glenn.
[00:43:00] ODELL:
On the live demo. So that only cost me $2.26. Well, technically, it cost your team $2.26. I'll I'll I'll, reimburse you guys. That's pretty crazy.
[00:43:13] Alex Gleason:
I think you probably sent me more Zaps, you know, over the years.
[00:43:18] ODELL:
Well, we'll keep we'll keep the balance correct. I'll make sure. So so okay. So I have this. Is this just a regular web link? Yes. It is. So I could share that link to this new web app that I just created. I can share that with, like, friends and family, whoever. I could post it, and then anyone could use this app.
[00:43:41] Alex Gleason:
Exactly.
[00:43:43] ODELL:
Now how does that work in terms of your hosting cost? Are you just eating those right now? Or
[00:43:50] Alex Gleason:
Yes. But because it is a static website, it has not really cost me anything in terms of CPU or RAM, really. It's only costing me in terms of disk. So, you know, I'm running this all on my server rack in my living room. So, like, if I run out of disk space, I don't have a recurring monthly cost for disk. I just have a one time cost of buying a new disk and putting it in there.
[00:44:27] ODELL:
Got it.
[00:44:28] Alex Gleason:
So the cost is not great. Probably more than anything, there's a sort of ongoing maintenance burden of this thing of just ensuring that it's constantly running and working correctly. It's like your Trump bot. It's like you close your laptop, and then we don't get Trump Exactly. Trump post. And so this for that reason, that's why the new version of Shakespeare I'm developing tries to move as much of the application logic as possible to run within your browser instead of having it run on my server.
[00:45:03] ODELL:
Makes sense.
[00:45:04] Alex Gleason:
Yeah.
[00:45:05] ODELL:
So, like, it's really running... you're trying to offload as much of it locally in the browser. Exactly. And the web page is just serving me that local code or whatever.
[00:45:16] Alex Gleason:
Yep. And and something I'm really excited about of the sort of But you could also charge for that. Right? Like Yeah.
[00:45:26] ODELL:
And, like, I would happily pay to have it. I mean, because you it's an interesting little problem. It's an interesting problem, right, because you add so much friction if you were making me actually go and set up hosting myself to host the web app. Even though if if it is any good, like, eventually, someone would. But, like, to just have it running automatically is fucking great. I think people would pay for that privilege.
[00:45:49] Alex Gleason:
It's true. It's a good point.
[00:45:53] ODELL:
You can make a decent margin on it too.
[00:45:56] Alex Gleason:
Yeah. We've been we've been kind of struggling with the with the finances of it because of the fact that Claude is so expensive, and we're hoping that with by changing the model to something that's a little bit more efficient, it's gonna allow us to not be negative on this, basically.
[00:46:12] ODELL:
But that's what I'm saying. Like Yeah. The margins are tight on the actual Right. LLM side. But on, like, the web hosting side, like, the short link that you... A good idea. ...is actually probably relatively cheap. But to the user, it's quite valuable, because it's just awesome to just immediately have a URL you can point people to.
[00:46:36] Alex Gleason:
I think you have a point. Yeah.
[00:46:39] ODELL:
Business on air.
[00:46:41] Alex Gleason:
Thank you, man. Update. Financial advice.
[00:46:45] ODELL:
I'm trying to have it update my app now. I want four images in it. I should have done something more crazy. Like, what I just have I have such a simple mind.
[00:46:57] Alex Gleason:
Okay. Here's the app. Someone asked me to You can always just make new prompts and new, you know, new tabs. And, you know, it's good to just to challenge the AI to see if there's things that you think that maybe it can't do and just to see see if it can do it. You know?
[00:47:17] ODELL:
You know, I feel like this is... I've been, like, a little bit distracted this episode because we had the live chat go down. But this might be one of the first times someone's vibe coded an app on a live show. So I apologize to the audio listeners, but we might have just made history here. Hell, yeah. Oh, okay. Look. I got my new app just loaded. So four images is better. It was two full screen before.
[00:47:57] Alex Gleason:
Now you're addicted.
[00:47:59] ODELL:
You know what the pro move was, by the way? On my second time I did the prompt, I said only people I follow, because we're live, and we didn't want any accidental dick pics. I think,
[00:48:11] Alex Gleason:
it's a good tool because I've seen some I was a bit worried because I've seen some some content in these Instagram style feeds that Yeah. It's like what Dorsey what Dorsey made or whatever, which was just like the
[00:48:25] ODELL:
Fire hose? Like the schizophrenic fire hose where it's just, like, constantly loading new images from everybody? There was a lot of questionable stuff in there.
[00:48:35] Alex Gleason:
Yep.
[00:48:37] ODELL:
But, yeah, I I saved myself by doing the I saved us by doing the let's see. What happens when I click? You can click full screen. Wow. That's pretty fucking cool, dude. Well done. Shout out to you and your team. Thank you. And shout out to my followers for having good images.
[00:49:00] Alex Gleason:
But this is the thing now. Like, back in the day, it was so hard to build things. I mean, it kind of still is. People haven't really realized this is something that you can do completely yet. I think that we're still early on this. But now, like, now ideas matter more than ever. It used to be execution was all that matters. Ideas are a dime a dozen. Oh, you have some great brilliant idea. Who cares? It only matters if you implement it. Now it's becoming a thing where implementing it is becoming a given that you can just expect, because an AI can do it. So it's really moving to a world where the people with the best ideas are the ones who can do the most.
[00:49:49] ODELL:
I, I agree with that. And, also, yeah, I mean, I'm full of shit. Like, there's no way I'm the first person to vibe live. So, you couldn't let me have it. Ski, I see you. Or Skye, I don't know how to pronounce your name. It's like it's like vlog. It's too confusing. Freaks, do you have any questions for Alex? Because I'm a little bit lost here. I'm tired, and I'm now my head's just running on what I'm gonna vibe next. But if you have any questions, put it in the live chat.
[00:50:20] Alex Gleason:
Welcome to the party.
[00:50:23] ODELL:
It does, but then once you actually so, like, on that thread, once you you can always get, like, the MVP executed on an idea. But then once that happens, you either need to be a developer, like, bring in someone who knows what the fuck they're doing if you actually want, like, to scale it. Right?
[00:50:47] Alex Gleason:
So it depends. Like, it it depends on the scope of the project. So, yeah, scale is the right word because right now, there are limits to this for sure. But, like, the limit is a lot further than you might think. Like, for example, BitChat, I'm pretty sure Jack just completely vibe coded that. I mean, I'm I think there's also a lot of back and forth of talking to the agent and, like, understanding things about how, you know, Bluetooth mesh networks actually work. But if you if you're willing to learn along the way and you're willing to have some technical understanding of what's happening, I don't think that you necessarily have to touch code.
[00:51:37] ODELL:
I mean, the coolest part about BitChat to me was he vibed a native iOS app, and then Calle vibed a native Yeah. Android app from that. It's, like, the cross platform capability. I think that is, like, one of the single coolest examples. I mean, it's always been such a frustrating thing for me, as someone who has done a lot of work on the education side, apps that are just stuck on a single platform. Like, it's really hard to recommend. For instance, first you have to ask, like, what device you're using. It's like, oh, you're on iPhone? Like, use Nostur or use Damus or whatever. Oh, you're on Android? You have Amethyst available to you. And we see this in the Bitcoin world a million times over. And when you just have, like, one app, when you can just say use Phoenix Wallet, and it doesn't matter if they're on Android or iOS, it's such an easier recommendation.
[00:52:41] Alex Gleason:
Yeah. So, I mean, it's gonna change it's gonna change everything. It's changing the whole game.
[00:52:48] ODELL:
I love it.
[00:52:51] Alex Gleason:
I see a question. I saw a question, and I got distracted. Yeah. There's a few people I wanna shout out, actually. Okay. Go for it. Doing stuff. So Franzap has been building Purple Stack. And Purple Stack is similar to MKStack, but it's for Android. And that's something that is really exciting too. It's kind of related to what we're talking about with, you know, Calle being able to vibe code an Android app. But having all of that context there for Nostr clients, it just makes it a lot easier to do things on Android. And so, you know, he's kind of followed the methodology that I've set out for building MKStack for the web and then applied that to Android. And there's opportunities for other spaces too, like iOS, or, you know, maybe just something that's a more specific use case that we may wish to customize in various different ways, like a kind 1, you know, sort of client stack that then could be customized various different ways. So what he's doing is really cool. Check out Purple Stack as well to see, you know, other ways that you can build stuff with a similar methodology.
And then I saw Hazard earlier. He had mentioned something about a service worker thing I could reuse. But in Shakespeare, in the original version of it, it was using nsite. And I'm gonna be moving back to that in the v2, where people can deploy stuff directly from their browser, and then it gets uploaded to Blossom servers. And so that means that you can vibe code, like, a web app in one of these AI things, and then you can deploy it to Blossom servers. And then a gateway would be able to serve it in a censorship resistant way by scanning across multiple Blossom servers,
[00:54:42] ODELL:
which is, like, one of the that actual hosting aspect. Yeah. But what do you do about the actual, like, web URL?
[00:54:52] Alex Gleason:
Yeah. So that would be, like, you'd have an npub, and the subdomain is pretty ugly, but then I've been thinking about designing some sort of system where you can create, like, a DNS over Nostr sort of thing, where you can then map subdomain labels to nsite npubs, and then you'd be able to support custom domains that way.
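Purely as a sketch of the idea being described, and not any existing NIP, such a mapping could be a replaceable Nostr event whose d tag carries the label and whose p tag carries the target pubkey. Every kind number and tag name below is invented for illustration only.

```typescript
// Hypothetical "DNS over Nostr" record: no NIP defines this today. A gateway
// could resolve a subdomain label by fetching the newest such event signed by
// the domain owner and then serving that pubkey's nsite files.
const nameRecord = {
  kind: 31999, // invented kind in the addressable/replaceable range, illustration only
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    ["d", "followstream"],          // the subdomain label being claimed
    ["p", "<target-pubkey-hex>"],   // the pubkey whose nsite the label points to
  ],
  content: "",
};

console.log(JSON.stringify(nameRecord, null, 2));
```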
[00:55:18] ODELL:
But still, like, if the proxy goes down, then you can't access the app anyway, even though it's all hosted. There are multiple nsite proxies.
[00:55:26] Alex Gleason:
Got it. Gateways. Gateway is the correct term. And then with, like, an NNS thing, then you could map multiple domains to multiple nsite gateways. So that's a really cool thing that exists that we're gonna be leaning into a little bit more in the future.
[00:55:46] ODELL:
That's kind of hard to imagine, but seems like it could be massive.
[00:55:53] Alex Gleason:
Yep. And so we've already been experimenting with that. If you run npm run deploy in an MKStack site, it will upload it to an nsite, and you'll be able to just share an nsite link. And it's completely permissionless. It's amazing.
[00:56:09] ODELL:
That's awesome. We also need more Blossom servers. Right? Like, we just don't have that many right now.
[00:56:15] Alex Gleason:
So I was thinking maybe in terms of a name. Like, a paid subscription type of model, maybe on a specific Blossom server is a good place to put that. I mean, that could actually
[00:56:29] ODELL:
I mean, I know Blossom was, like, originally envisioned for, like, video and photo. But and, I mean, one of the big issues with video So Yeah. Any file. But, like, the one of the issues with video is, like, videos are so large. Like, this payload would be significantly smaller.
[00:56:47] Alex Gleason:
Absolutely.
[00:56:48] ODELL:
Like, how big is that app I created, like, how big is that app?
[00:56:53] Alex Gleason:
I would say two megabytes max. Right. Probably closer to one megabyte, but also it's split across three or four files. So probably each file is, like, 500 kilobytes or less. So, like, you know, the size of a photo, the size of a JPEG. Yeah. That's pretty fucking cool. And the payload's so lightweight because most things are just running inside the browser locally anyway. Yep. Right. And so then, if we move all the files into the browser, which, by the way, this brings me to, I see DanConwayDev is in this chat too. And he's been working on amazing things with decentralized Git, and that's the thing I'm most excited about in the next version of Shakespeare, the deep Nostr Git integrations, because here's how I imagine the user experience being.
You go on to Shakespeare. You sign in with your Nostr account. You vibe code something, and then you click, like, okay, I'm done. And then it pushes that to a Nostr Git repository completely permissionlessly. You don't have to do anything else. All you need is a Nostr account, because that's how GRASP works, you have a Nostr account, and then you can immediately push code to Git. And so now other people can collaborate with you on that code. Now when people go to your vibe coded website, imagine there's an edit with Shakespeare button on that website. They click on that button. It opens Shakespeare.
It then clones that Nostr Git repository into your browser, and then you can immediately start, like, writing a patch for it. And then there can be a button that you click again that says contribute this patch back, and then it makes, you know, a Nostr merge request. And then the original person can review it and merge it in, and then deploy it. And so, I kinda like, this is potentially a replacement of GitHub, you know, in a way. But it's starting with the AI rather than starting with the repositories. And so between this and NostrHub and Git Workshop, like, you have this full spectrum being covered of, you know, vibe coding to traditional development and collaboration over Nostr, because Nostr supplies the social component and the social layer that is just so powerful.
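Under the hood, the "contribute this patch back" step maps onto NIP-34, where git patches are published as kind 1617 events pointing at a kind 30617 repository announcement. A rough sketch with nostr-tools; the repo identifier, maintainer pubkey, and patch content are placeholders, and real clients attach more tags than shown here.

```typescript
// Rough shape of a NIP-34 patch event, the Nostr-native equivalent of opening
// a merge request against a repository announced on relays.
import { finalizeEvent, generateSecretKey } from "nostr-tools/pure";

// Throwaway key for the sketch; in practice the patch is signed with your own
// key, usually via a NIP-07 extension rather than a raw secret key in the page.
const sk = generateSecretKey();

const patchEvent = finalizeEvent(
  {
    kind: 1617, // NIP-34 patch
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      // references the kind 30617 repository announcement being patched
      ["a", "30617:<maintainer-pubkey-hex>:my-vibe-coded-app"],
    ],
    // NIP-34 patch content is git format-patch output
    content: "From 0123abcd... Mon Sep 17 00:00:00 2001\nSubject: [PATCH] fix typo\n...",
  },
  sk
);

// Publishing this to the repo's relays is, in effect, opening the "merge request".
console.log(patchEvent.id);
```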
[00:59:19] ODELL:
And then you have payments built in too.
[00:59:21] Alex Gleason:
Exactly. And and we have we have Lightning integration directly into MKStack, thanks to Chad. And so now when you just if you say, like, add Lightning to this, then it's gonna be able to just work.
[00:59:36] ODELL:
Well, what does that add?
[00:59:39] Alex Gleason:
So it has, like, additions to the context that explain to the AI how to implement these features, and it also has files that are baked into the stack that just make it really easy to have, like, a zap button. And so then you can just click a zap button, and it'll open, like, a modal window, it just looks like the Primal zap window, where you can just, you know, zap an amount on a particular event.
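For readers curious what that zap button actually constructs, zaps are NIP-57: the client builds a kind 9734 zap request, hands it to the recipient's LNURL-pay endpoint, and pays the invoice that comes back. A minimal sketch with nostr-tools; pubkeys, the event id, and the relay URL are placeholders, and the LNURL resolution and payment steps are left out.

```typescript
// Rough shape of a NIP-57 zap request (kind 9734).
import { finalizeEvent, generateSecretKey } from "nostr-tools/pure";

const sk = generateSecretKey(); // normally signed via the user's NIP-07 extension

const zapRequest = finalizeEvent(
  {
    kind: 9734,
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      ["relays", "wss://relay.example.com"],   // where the zap receipt should be published
      ["amount", "21000"],                     // 21 sats, expressed in millisats
      ["p", "<recipient-pubkey-hex>"],         // who is being zapped
      ["e", "<event-id-being-zapped>"],        // optional: the specific note being zapped
    ],
    content: "great post",
  },
  sk
);

// The serialized event goes to the recipient's LNURL-pay callback as a
// ?nostr=<url-encoded JSON> parameter, which returns a bolt11 invoice to pay.
console.log(JSON.stringify(zapRequest));
```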
[01:00:07] ODELL:
Oh, that's awesome. Oh, yeah. So I could have just been, like, add do I say add lightning or do I say add zaps? I think either should work. I'll play around with it. I'm making a new app right now. And this one's a more difficult prompt. I don't know if it's gonna work or not. Okay. This is awesome. I feel like we're at, like, the precipice here or something that could be just a complete game changer. I feel like my mind has been blown a little bit. It feels like it's kinda all coming together. Good. It'd be like open source, open protocol, renaissance.
[01:00:44] Alex Gleason:
Yeah. Exactly. Renaissance is the word I've been thinking of a lot lately.
[01:00:49] ODELL:
Especially when you add the git piece
[01:00:51] Alex Gleason:
because then everyone can collaborate with each other. Yeah. Like like, everyone who's not a developer can just be like, oh, I don't like this of this particular client. Like, edit with Shakespeare. Boom. And
[01:01:03] ODELL:
yeah. It's gonna be kind of annoying to the to the actual developers
[01:01:06] Alex Gleason:
to a little bit. But also, it'll be so easy to fork and run your own, because they can just ignore it if they want to. And then you'll have your own custom URL. And Hazard says CNAME resolution is already a thing in some nsite gateways. So we're already doing it. Amazing. Thank you. I'm glad someone else is doing it so I don't have to do it.
[01:01:24] ODELL:
So that, yes. Yeah, I agree with none of your business. Nobody can come close to figuring out the implications of what was just said. My head hurts already. And I feel like it really flips the script a bit. It gives a big advantage to open source over closed stuff, to go all the way back to our earlier conversation about closed proprietary shit. I mean, you could just have, like, millions of people working off of each other and just shipping shit live with very little security implication, because you're not handling private key material or anything sensitive. No API calls. You don't have to deal with any of that shit.
You don't have to deal with hosting. You just vibe and see what happens.
[01:02:19] Alex Gleason:
I think this is the way Nostr wins, and so we've gotta embrace this path hard.
[01:02:24] ODELL:
I think you're right. It's either that or vlogs. It's one or the other. We'll see what happens. Okay, this is awesome. I'll admit, I'm guilty of it. I tried to do it the other way, and I hit, like, a single piece of friction. And then, like, my kid was crying or some shit, and I just never continued. But this is, like, super addicting, that I can just do it in the fucking browser window. I'm just gonna spend so much money on this once you get your Lightning
[01:02:59] Alex Gleason:
lightning up. I feel like you're the greatest person to be doing it, too. I'm really excited to see what you build. Well, I'm not sure about that, but we'll see what happens.
[01:03:08] ODELL:
Well, while I have you here, I'm trying... this is, like, kind of evil. I gave it, like, kind of a semi-evil prompt: build me a Nostr web app that shows the location of photos from my followers on a map view. I feel like the map part might be the most difficult for it to handle.
[01:03:34] Alex Gleason:
Well, like, I know that it's been done before. For example, the Treasures app by Chad has, like, a geocaching map in their UI. Does it use, like, OpenStreetMaps
[01:03:43] ODELL:
or something?
[01:03:44] Alex Gleason:
Yeah. And so it should be possible. I don't know where it's gonna get the location data, though.
[01:03:50] ODELL:
From the photos because they didn't strip their metadata. Alright.
[01:03:54] Alex Gleason:
I think that step may actually be the hardest part. But if it can get the data, it can put it on a map.
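As a rough illustration of those two steps, the sketch below pulls GPS coordinates out of a photo's EXIF data with the exifr library and drops a marker on an OpenStreetMap layer via Leaflet. The image URL is a placeholder, and this is only one plausible way the vibe-coded app might do it, not what Shakespeare actually generated.

```typescript
// Hedged sketch: extract GPS EXIF from a photo and plot it on an OpenStreetMap layer.
import exifr from 'exifr';
import * as L from 'leaflet';

async function plotPhoto(photoUrl: string, map: L.Map): Promise<void> {
  // exifr.gps() returns { latitude, longitude } if the photo still carries
  // GPS EXIF tags, or undefined if the metadata was stripped.
  const gps = await exifr.gps(photoUrl);
  if (!gps) return;                                  // no location -> nothing to plot
  L.marker([gps.latitude, gps.longitude])
    .bindPopup(photoUrl)
    .addTo(map);
}

// Basic OpenStreetMap setup, assuming a <div id="map"> exists on the page.
const map = L.map('map').setView([0, 0], 2);
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
  attribution: '&copy; OpenStreetMap contributors',
}).addTo(map);

await plotPhoto('https://image.example.com/lunchtime.jpg', map);  // placeholder URL
```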
[01:04:01] ODELL:
Everyone, strip the metadata from your photos before you upload them anywhere. If you're uploading on Primal, it should strip the photo metadata for you. It definitely used to. Primal's open source, so you can check the code, but it's supposed to strip the data. An easy way to make sure all the metadata is stripped is to paste the photo into Signal and then download it back from Signal; it strips all metadata automatically. It's also a very easy way to block out faces and stuff before you post. Signal has, like, some limited photo editing tools in there that I like because it's just convenient.
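One simple client-side way to strip metadata, sketched below under the assumption that you're in a browser: re-encode the image through a canvas, which produces a fresh JPEG containing the pixels but none of the original EXIF, and therefore no GPS coordinates. This is just an illustration of the idea, not how Primal or Signal implement it.

```typescript
// Hedged sketch: strip EXIF (including GPS) by re-encoding the image through a canvas.
async function stripExif(file: File): Promise<Blob> {
  const bitmap = await createImageBitmap(file);      // decode pixel data only
  const canvas = document.createElement('canvas');
  canvas.width = bitmap.width;
  canvas.height = bitmap.height;
  canvas.getContext('2d')!.drawImage(bitmap, 0, 0);
  // The re-encoded blob carries no EXIF/XMP metadata from the original file.
  // (Note: this also discards any embedded color profile.)
  return new Promise((resolve) =>
    canvas.toBlob((blob) => resolve(blob!), 'image/jpeg', 0.92),
  );
}

// Usage: run the file through stripExif() before handing it to the uploader.
```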
[01:04:44] Alex Gleason:
I'll put in, like, IDs to Nostr.
[01:04:49] ODELL:
Dude, I'll tell you, like, that's probably one of the scariest parts, especially now that a lot of these apps have Blossom integration. Like, I have Blossom mirroring set up or whatever, so as soon as I upload something, it just, like, immediately gets mirrored to a bunch of different servers where it's, like, signed and hashed and I can't easily remove it. It's, like, pretty scary. Yeah, kinda messed up. I feel you. But I don't know, I've been thinking about it. Like, do you have any thoughts on that? Like, from the Primal side,
[01:05:24] Alex Gleason:
we wanted to just add Blossom and offer that, and all the people always say, like, deletions are not real deletions on Nostr. But, like, prove it. Because anytime we've tested whether deletions work, they do work, at least for notes. I don't know if anyone has tested it on Blossom servers, but in terms of, like, is it possible in a decentralized way to do deletions, the answer is definitely yes.
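For reference, a Nostr deletion request per NIP-09 is just a kind 5 event referencing the IDs to be removed; whether relays and Blossom servers honor it is exactly the open question discussed here. A minimal sketch with nostr-tools, using placeholder IDs and relay URLs:

```typescript
// Hedged sketch of a NIP-09 deletion request. Event IDs and relay URL are placeholders.
import { finalizeEvent, generateSecretKey } from 'nostr-tools/pure';
import { SimplePool } from 'nostr-tools/pool';

const secretKey = generateSecretKey();               // must be the original author's key
const deletion = finalizeEvent(
  {
    kind: 5,                                         // NIP-09 deletion request
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      ['e', '<id-of-the-event-to-delete>'],          // placeholder event id
      ['k', '1'],                                    // kind of the event being deleted
    ],
    content: 'posted by mistake',                    // optional human-readable reason
  },
  secretKey,
);

const pool = new SimplePool();
// Broadcast the request; each relay decides for itself whether to honor it.
await Promise.allSettled(pool.publish(['wss://relay.example.com'], deletion));
```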
[01:05:50] ODELL:
Well, it's just... the question is if the servers honor it or not. Right. Exactly. It would be hazardous. I know the Primal servers, like, Primal's Blossom and the relay and the caching service, all honor deletes. Yeah. But I also know, like, UTXO's relays. UTXO the webmaster, he supposedly made a relay that, like, only saves things that have delete requests. So, like, if you send out a delete request, he automatically saves it. Alright. And it's one of the few things he hasn't released open source, because he, like, doesn't wanna be the dude that does it, but someone's gonna do it. Right?
[01:06:32] Alex Gleason:
Well, just stop listening to that relay and don't read from that relay, I guess. And then everything's good.
[01:06:39] ODELL:
Okay, fair enough. Well, who knows how long this app's gonna take to vibe? I'm glad we got a successful demo. I'm glad you got to be here for my first. It was a pleasure having you back. I will say, the top zap from the last rip we did, what did I say, CD one sixty four, which, by the way, freaks, I think was a better conversation than this one because we started from the top. Yeah. Rather than being, like, an update episode. Beholder said, one of the most interesting guests on dispatch, would love to see him join more often, and I second that. So let's try and do this every two to three months. We'll do an AI, vibe coding, Nostr update. I love hearing from you, so it's a win-win.
[01:07:28] Alex Gleason:
Next, we'll be hearing from you, because you're gonna build the future of Nostr with this. Yeah. I'm gonna get you some inbound liquidity.
[01:07:36] ODELL:
Let's get your Lightning node up and going. I will say, I don't know if you want... like, this is also kinda evil empire shit. It's great that you're running your own node and you're getting sats. If you wanna just hold on to the sats, then that's probably the best path. But if you are trying to convert it all into dollars to pay your bills and shit, like, the Strike API is a good path. Very easy to implement. Yeah. But disclosure: largest investor in Strike, and I sit on the board. But we intentionally made it for if you want dollars. You have to KYC, but if you want dollars, it can work. That's good to know. Maybe at a certain point,
[01:08:20] Alex Gleason:
like, maybe. Yeah.
[01:08:24] ODELL:
Maybe it makes sense. And you don't have to worry about liquidity or anything like that.
[01:08:29] Alex Gleason:
Yeah. Because converting is a pain at a certain point, when I do have to keep paying the, you know, AI providers in dollars.
[01:08:38] ODELL:
Let's see if my evil app worked. I spent $8.94 for this. View photo map. I think there's, like... I don't think this is gonna work.
[01:08:56] Alex Gleason:
I think it's gonna display a map, but I wonder if it's gonna be able to extract any data from any of these images.
[01:09:08] ODELL:
Like, these exact coordinates. Right?
[01:09:11] Alex Gleason:
Oh, shit. Okay.
[01:09:15] ODELL:
Now I kinda feel bad. A photo of lunchtime, and I have exact coordinates from lunchtime. Delaware. Freaks, strip your metadata. Anyway, this is fucking cool, though. We live in a brave new world. Alex, I'll post a link to your node so people can give inbound liquidity. Shakespeare is at shakespeare.diy. We have, what is it, soapbox.pub/tools, which has all the other tools you guys have been building. I'll link those all in the show notes. You have any final thoughts for the freaks before we wrap?
[01:09:56] Alex Gleason:
Build the future with AI on Nostr.
[01:09:59] ODELL:
I love it. Freaks, I hope you enjoyed this rip. I know it's a little bit different; we're just doing things live. I was a little bit distracted, but I hope you enjoyed it. If you like the show, as always, the best way to support it is actually not with Bitcoin, it's sharing it with your friends and family. But Bitcoin donations are appreciated. Interacting with the show, whether that's on Nostr or podcasting two point o apps, leaving comments, leaving feedback, leaving suggestions, all that is awesome. All links at citadeldispatch.com. All my links are at odell.xyz.
I'm taking next week off. I'm traveling for this work thing, this Bitcoin thing. It should be pretty good. But as a result, I'm skipping the live show next week, and we'll be back the week after that. Alex, thank you again. Thank you. Freaks, stay humble, stack sats. Love you all. Peace.