14 March 2024

Tech Is Neutral: How You Use It Is Not | Logos Podcast with Jarrad Hope

Logos Podcast
technology
privacy
application
politics

In a Logos Press Engine crossover, Jarrad Hope brings on Corey Petty - Logos Program Lead and Hashing It Out host - to discuss the political and technological motivations behind their work and their aim of building open-source infrastructure that supports civil liberties and resists the centralisation of power. They question whether technology is inherently neutral, the ethics of building technology, and how it relates to political structures.

[00:00:02]

Corey Petty

I cover a good portion of the technology-building side of this, and I'd say you're more on the political motivation and ethics side: the "this is why we're doing this, this is the problem that we see" part. Would you say that's reasonable?

[00:00:20]

Jarrad Hope

Yeah, definitely. I mean, there's definitely some overlap there, but yeah.

[00:00:24]

Corey Petty

Yeah. Jarrad, so you and I talk quite a bit about the introduction of Logos and how this thing gets built, and you mention the concept of politics quite a bit, and why this whole Logos thing has to be political. Can you go over that a little? Why does politics need to be involved in the creation of Logos, or in the creation of technology in the first place?

[00:00:54]

Jarrad Hope

Yeah, it's a good question. It depends on how far you really want to go and which thread you want to go down. There are two main threads that I consider. One is the post-9/11 world, where you started to see these mass surveillance programs come out. I think it was roughly the early 2000s, maybe 2004, when Thomas Tamm and Thomas Drake were the whistleblowers who revealed Stellarwind, one of the first foundational dragnet surveillance operations against U.S. citizens. It did massive data collection, and it continued, or paved the legal foundation for, the Snowden revelations that came out roughly ten or twelve years later: the NSA spy program PRISM. That, combined with the Patriot Act, basically gave the US carte blanche to collect all data on all people across the entire planet, and those things are still ongoing. The other thread is that around a similar time, around 2009, 2010, 2011, you had the rise of Occupy and Bitcoin, both of which I view as a form of protest for economic freedom and against the financial system, the so-called 1%, or, as you would hear in the Bitcoin community, this notion of debt slavery.

[00:02:44]

Jarrad Hope

Through inflation of a fiat currency, to the point where your money and purchasing power isn't worth anything. So one of the issues with the surveillance side of things is that you end up in a position where this data is collected on you. You might be concerned about your privacy, and one of the arguments against that might be, "well, I've got nothing to hide." Okay, but there's a broader problem there. One way to look at it is how you can potentially be manipulated as a citizen, or how that affects a democracy's ability to function. I think Glenn Greenwald said something to the effect of: if I know every single thing about you, what you think, how you reason, what your fears are, what you're planning and even what you're doing, and you know nothing about me because I'm shielding my behavior behind a wall of secrecy, then I have a massive amount of power over you. And it's not just you; it's the entire population we're talking about. So the power imbalance created by this technological gap allows a very small minority to influence human behavior en masse. And today we're starting to see that in the censorship on big tech platforms, or big tech platforms censoring individuals.

[00:04:27]

Jarrad Hope

And so this starts suppressing political dissidents, anyone who doesn't agree with an official narrative. Contrast that with Occupy and Bitcoin: even though they're fundamentally protesting a similar thing, you can view both of them under Albert Hirschman's notion of exit, voice, and loyalty. Hirschman had this framework, which he was using for consumer products, but you can apply it to your relationship with the state. You can be loyal to the system, as in everything's fine; you can express your voice or discontent, which is protest; or you can exit the system, which Bitcoin was trying to do by creating a new currency with better foundational properties that support economic freedom, and with a different monetary policy that is supported by the people. So when it comes to something like Logos, it's really an instantiation of some of the ideas that Ethereum got right in the early days, like the notion of decentralized applications. But they were typically trying to be agnostic towards why this technology actually matters; it was more focused on a world computer. There are actually really strong political reasons why this matters. I don't know if you want to get into that, but it's a whole other thing as well.

[00:06:19]

Corey Petty

Yeah. There's this idea I want to get into eventually, that tech is neutral but there's a politics associated with building it and a politics or ethics associated with using it, and those are different. But I want you to keep going with this concept of where Ethereum, or where current blockchains, are failing. I got into this personally because of this removal of the human from the system in a lot of ways: removing the ability for people to take asymmetric power in the system, or the need to trust someone such that they have the power to influence me or the larger group. You make this credibly neutral infrastructure for people to use and rely upon that isn't subject to human greed and power dynamics. So why don't the projects currently in existence give you this? Where are they going in the wrong direction, maybe?

[00:07:26]

Jarrad Hope

Yeah, it's hard to say whether they're going in a wrong direction per se. But one way to view it is this: there's a set of rights that I particularly care about, and that subset is civil liberties, which are usually about safeguarding against tyranny. So you're talking about things like freedom of speech and, conversely, freedom of thought, freedom of the press, the right to associate or freedom of assembly, economic freedom. I also care about self-determination, and there's a whole host of other ones. When you start to look at how some of the other approaches are upholding these, they tend to fall short, and there are pragmatic reasons for that. One of the common excuses, I guess, is this notion of scalability: with the current designs that were chosen for historical reasons, the only way to work with the existing design and move it forward is to make concessions on certain aspects. I think we can benefit from a lot of the learnings that happened in this space over that time, and try to compose and construct them in a way that has stronger guarantees when it comes to upholding certain civil liberties for the individual or the users of the network.

[00:09:12]

Corey Petty

Yeah, my talk at DEF CON tried to point that out: we may have made decisions in the past that make it difficult to move back in that direction now, because we've optimized for scale so much. Reintroducing things like privacy, specifically at lower layers of the stack, is much harder because we made those decisions a long time ago.

[00:09:36]

Jarrad Hope

Yeah. I'm curious, though: why do you think privacy at that level of the stack is important?

[00:09:44]

Corey Petty

Well, the censorship that currently happens with MEV and OFAC sanctioning, which came as a consequence of Tornado Cash being sanctioned by the United States, is a really good example here. Privacy built into smart contracts, specifically on Ethereum in this context, isn't guaranteed, because the ability to censor those transactions and view their details is available at lower layers, before the smart contract execution even happens. Validators who are looking through these transactions and ordering them are able to see, "oh, this is a Tornado Cash transaction, this person is involved with it and it's going to this person; let's just not include it at all." So the privacy you would get from the smart contract execution is completely negated by the lack of privacy at a lower level. This is the broader idea that it's impossible to build private systems on top of non-private systems in a lot of ways, which is exactly what the internet has been trying to do for a long time with the introduction of encryption in transit. There's still a lot of metadata associated with those encrypted packets, based on their routing and where they're going, and if someone just drops them, the whole privacy effort is undermined in the first place. I think you need to reintroduce privacy earlier, so that people can reveal themselves in the place where they actually intend to reveal themselves, as opposed to being constrained by a narrower system at the lower levels.
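To make the mechanics concrete, here is a minimal sketch of the kind of filtering a block builder or validator can apply. Everything in it is hypothetical, the addresses, list, and function names are invented for illustration, but it shows why contract-level privacy does not help: the sender and target contract sit in the plaintext transaction envelope, so censoring requires no knowledge of what the contract does internally.

```python
# Hypothetical illustration: a block builder filtering mempool transactions
# against a sanctions list. All names and addresses below are made up.

SANCTIONED_ADDRESSES = {
    "0xTornadoCashPool",  # placeholder for a sanctioned contract address
}

def build_block(mempool):
    """Select transactions for the next block, skipping sanctioned ones."""
    included = []
    for tx in mempool:
        # The envelope fields (sender, target contract) are public plaintext,
        # regardless of any zero-knowledge privacy inside the contract itself.
        if tx["from"] in SANCTIONED_ADDRESSES or tx["to"] in SANCTIONED_ADDRESSES:
            continue  # censored: the transaction is simply never included
        included.append(tx)
    return included

# Usage sketch: Alice's deposit is dropped even though the contract's internal
# mixing would have hidden the link between depositor and withdrawer.
mempool = [
    {"from": "0xAlice", "to": "0xTornadoCashPool", "data": "deposit(...)"},
    {"from": "0xBob",   "to": "0xSomeDex",         "data": "swap(...)"},
]
print(build_block(mempool))  # only Bob's swap survives
```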

[00:11:23]

Jarrad Hope

Yeah, absolutely. The Tornado Cash incident definitely spelled out some of the issues with our ideology of how we conceive of blockchains. What was so amazing about Bitcoin is that it could secure value, or even Bitcoin scripts, or smart contracts in the case of Ethereum, in a hostile environment. Yet the response to Tornado Cash led to new forms of attacks against these networks, and as you said, it's at these lower layers that this coercive ability arises. If you're able to identify the transactor, or the people or nodes who secure the network or participate in it in any way, then you can coerce them. And if you can coerce them, as we saw with Tornado Cash, you start seeing these blocks not being relayed, or being censored. While there is enough distribution of nodes around the entire planet that ultimately a lot of those transactions still made it into the chain, it's not hard to imagine a future scenario where one of those attacks is more complete. Say, for example, the Financial Action Task Force's KYC and AML rules get implemented in most countries; you can imagine a similar thing happening on that front. But more importantly, it undermines this notion of neutrality in these networks. Once you can coerce somebody in that system, you can undermine how that system behaves. And the reason neutrality matters is that it's a prerequisite for a lot of these civil liberties. As soon as you have an intermediary, and that intermediary no longer agrees with, say, the speech that's going through it or the associations being made in it, they can have an opinion on that, which comes in the form of censorship.

[00:13:51]

Corey Petty

Yeah, this bleeds into the next conversation, but to add to what you were just saying: if you want to make credibly neutral infrastructure for everyone to operate on, then it can't be subject to the bias of nation states. We're trying to build infrastructure for the world, so the bias of nation states can't sit below that infrastructure, and that's the way it currently is. If the opinions of a given jurisdiction can influence the operation of the infrastructure the world uses to transact or congregate digitally, whether that's value or social interaction, then we don't have the infrastructure inversion that we need. We need it to be flipped, such that the various jurisdictions operate on a common infrastructure and are able to move around within it, but not influence the way in which the infrastructure operates. That's the main point for me: we haven't made that inversion of infrastructure yet, and because of that you see situations like Tornado Cash, where "we don't like this, so the whole thing doesn't work this way anymore," as opposed to opting out, exiting, or voicing your opinion, right?

[00:15:16]

Jarrad Hope

Yeah. And once you start going down that line of thought, if we're talking about a system that is no longer biased by, say, nation states, that's when you get into this notion of the sovereignty of these networks, and of a new polity that could form around this reframing of blockchains as a voluntary social order. That's a whole other topic we can probably go into, but before we do: it's maybe not clear why neutrality really matters, particularly when it comes to institutions or the operation of a network like this. You can certainly make arguments on civil liberties grounds, but perhaps another clear, recent example is the weaponization of the US dollar. After World War Two, the United States was in an amazing position for securing peace around the world, and consequently the US dollar became the de facto world reserve currency for international trade, because it could be secured. Yet the actions taken by the US, both in how they've printed their money and in how they've used it to police transactions around the world, say through economic sanctions on other nation states, are useful to a certain extent if you have large buy-in. But if you take it too far, that institution, that currency, starts to lose value because it is no longer impartial, and so it no longer makes a great store of value or medium of exchange, because you may effectively get robbed at any point.

[00:17:11]

Jarrad Hope

Right. So then you have, say, BRICS now forming around the Chinese yuan, and I don't necessarily agree with that being a better currency, but you've lost this capacity for different nation states in an anarchic society to trustlessly transact with each other in a secure fashion, which then leads to a whole other host of issues in terms of financial trade. So if you could create a currency that is politically neutral and has all the properties of sound money, including fungibility, which is where privacy also comes in, then you effectively create an institution that is valuable for everybody. And if you're doing it in a system like ours, then you can expand the scope of that to almost any institution that can be represented by, say, a smart contract or a policy running on chain, which is going to be tremendous in terms of unlocking value for anyone who wants to be involved in that system.

[00:18:28]

Corey Petty

Yeah. This part really interests me: is tech neutral? Is the underlying technology neutral, or is technology in general neutral? And this argument about the ethics of building technology. More often than not, when people hear about the kind of work that we do, that we have a strong focus on privacy, and that by adding privacy at lower layers of the stack we're limiting people's current ability to identify illicit finance, they ask me, "you're enabling a lot of bad actors in a lot of ways; how do you feel about that? How do you justify building technology that enables bad activity?" More often than not, I say: one, it's a balance. You need to keep that in mind, and you should be ethically aware of what you're enabling. There is also a tremendous amount of good that we're enabling by adding privacy, because you're giving people a lot of opportunity that they currently don't have. We can go into that in a moment. But how do you view the ethics of building technology, and whether or not technology in itself is neutral?

[00:19:46]

Jarrad Hope

Yeah, it's a good question, and I think it's a valid point as well. One way to view this is that the universe itself, or the reality we live in, is fairly agnostic to our own actions. It allows anything to happen within physical constraints, and that allows quote-unquote bad actors to do things, but imposing good or bad is in and of itself a matter of perspective. The closer we can get to that reality when we create our artificial constructions on top of it, the better, in terms of humanity as a superorganism going in whatever trajectory we want civilization to go in. At the same time, there are a lot of issues with modern nation states, but one of them is this arising of an order in conjunction with a government and its law; all of these are effectively monolithic and bundled together. I think you can actually tease out the order aspect of that while having a plurality of different governments on top of it, and also this notion of polycentric law: a plurality of different courts, or however you want to conduct yourself. But then on the notion of terrorism per se: I forget the exact quote by Glenn Greenwald, but he was saying something to the effect that terrorism is a term that means nothing but induces fear in everybody.

[00:21:50]

Jarrad Hope

He was basically talking about justifying deceptive actions, or achieving consent around a public policy, by invoking this notion of terrorism. But even with terrorism, the number of deaths that result from it is such that you're more likely to be killed by a vending machine or something. And the very notion of terrorism takes on the perspective of whoever is calling somebody a terrorist, whereas that other side has real reasons for doing whatever they're doing. I think it's more of a social problem that needs to be addressed, rather than something you can embed in the technology. Maybe a more concrete example is attempts at censorship: you can knock out somebody's social network account, but they can always create a new one. There are always ways to circumvent it, so suppression or regulation only creates externalities to governance. If you can bring everybody onto the same page, that at least gives you the option to form agreements with actors you might not agree with. Maybe that's not a satisfactory answer, and I think we definitely need to spend more time working on that, but when it comes to this specific part, I generally look at humanity as a whole rather than from a particular perspective.

[00:23:32]

Corey Petty

Yeah, this is typically my answer to it. To lay out my personal beliefs on the ethics of building technology: technology itself is a tool and is inherently neutral. It does have constraints on how it can be used, which are usually shaped by the people who build it in the first place, because there is an underlying motive or politics or ethics associated with the technology being built. We set out to build this thing for this reason, to serve this purpose, and that in itself carries a level of bias: we'd like it used in this way, and in order to make sure it's used this way, we try to deter other use cases outside of it. That being said, you can't always force a specific use case, especially in our circumstances. If you're optimizing for inclusion, you want to involve as many people as possible, with as many opinions on how to use the technology as possible. So you have this up-front ethics and motivation for building technology, which shapes the technology itself. What we're trying to do, our ethics, is very much optimized for inclusion, censorship resistance, and against asymmetric power dynamics, so that a few people can't control the many, which is what we have today.

[00:25:05]

Corey Petty

When you do that, on the other side there are going to be a lot of people who use the technology in ways you don't want them to, or that you didn't set out to enable, because you're optimizing for inclusion in a lot of ways. My rebuttal to people who say "you're enabling a lot of bad people or bad actions that you can't stop, because you can't censor them and their actions are not transparent" is: the technology will be built regardless. The cat's out of the bag. We have privacy technology; zero-knowledge cryptography is in full swing and growing quickly; quality, scalable consensus is being developed such that we can use it at world scale. This stuff will continue to grow. I would rather the people with strong ethics and morals build it and be the experts in how it works than those with more quote-unquote malign desires, specifically because when you do want to root those people out, you're able to work with governments, or whoever wants to do this with you, in such a way where you say, "these options are no longer available to you."

[00:26:28]

Corey Petty

And help figure out alternative ways to get those people to stop doing these actions where you agree with them, rather than provide back doors, because we've all seen that back doors in things like encryption simply don't work; they ruin the entire system in itself. It has always been a very difficult thing for people to understand that these things will grow. This technology will remove the current ridiculous power asymmetry in the internet today. What law enforcement, or people who don't like this activity, need to figure out is how to find alternative means to stop that behavior, and to stop relying on the power asymmetry to do it. So when people ask me why I build this stuff, it's because I'm ethical; I care about these things; I consider myself a moral person. I'd rather be the expert that can help mitigate activity than be the non-expert with morals trying to catch up to the bad activity to figure out how it works. And also, the problems are interesting to me. But out of the two situations, which in my opinion is not a false dichotomy, you would always want the people with morals to be the experts, so that they can react to the situation faster than not.

[00:27:52]

Jarrad Hope

Yeah, definitely. And if you zoom out, the argument doesn't hold either, because if you look at the majority of bad actors, far more value flows through corrupt politicians and the financial elite. Say, foreign aid going to different countries often doesn't get to where it was intended to go, because everyone takes a slice of it and then hides it within banks, and these sorts of things. Plus you have intelligence agencies running mass drug operations, and they're using the US dollar to do that. So in terms of stopping corruption or bad actors, the current system demonstrably hasn't actually solved it in any way. There are certainly efforts to control certain aspects of smaller entities, but when it comes to the big stuff, all of that is still happening, and it comes at the cost of eroding the civil liberties of almost everyone on the planet.

[00:29:08]

Jarrad Hope

And the other side of this is that I genuinely think the value created by the new political system that could arise out of this technology is going to be far greater than what we already have today. There was a paper that came out of the Copenhagen Consensus showing that a mere 0.1% reduction in transaction costs literally quadruples a country's wealth, the difference between, say, Argentina and Switzerland. There was a report that came out from the World Bank in 2006, I think it was called Where Is the Wealth of Nations?, something like that, which basically outlined that the rule of law and the capacity to socially coordinate outweigh even the extraction of natural resources. So if you can create a technology that allows people to socially coordinate with, say, stable, corruption-resistant, censorship-resistant institutions...

[00:30:22]

Corey Petty

Hold on, let's pause and start that part over; there's a car alarm or an ambulance there.

[00:30:28]

Corey Petty

It's very honorable.

[00:30:30]

Jarrad Hope

Ah, yes. The producer found the quote I was talking about from Glenn Greenwald, which was: "terrorism means nothing other than what the US government wants it to mean at any given moment."

[00:30:41]

Corey Petty

That speaks to this general subjective thought: it is a weaponized term. Now, I agree that there is such a thing as terrorism, but it's always going to be subjective, because I belong to a group, I have a specific set of ideals, and people are out trying to attack those ideals. That conversation can very much be flipped; it's very subjective. You can agree with it, but you can only agree with it if you concede that you are part of a group with an enemy.

[00:31:18]

Jarrad Hope

Right. And that also speaks to how this technology works anyway. I don't think the tech is inherently neutral; you see different ideologies rising up around, say, Bitcoin and Ethereum. Broadly speaking, Ethereum is fairly modern liberal, whereas Bitcoin has a very strong laissez-faire libertarian streak to it.

[00:31:46]

Corey Petty

But that's not the tech. That's the group of people who wield it.

[00:31:51]

Jarrad Hope

That's true, but at the end of the day it's those people who are writing the code, right?

[00:31:58]

Corey Petty

Right, and that's the interesting thing here: you have a set of politics in the development of a piece of technology, and you have the technology itself, which is constrained partially by the people who build it. Then you have this cut, the moment the technology exists and is able to be used by anyone outside of the people who built it. There's a difference in politics from there, because it's wielded in different ways based on who's using it. There's an interplay between how it's built and how it's used, and how its use then changes the way it's built, to further course-correct toward the way people think it's supposed to be done. But ultimately the tech itself is neutral, because you have this break. If it were continuous, and it could only be used in the way it was supposed to be, then I could probably agree with you that technology is political. But ultimately, this point of break between it being built and being used, in my opinion, makes it neutral. You have to think about the groups of people building things and why they build them, and the groups of people who use them and how they get used. But you can never blame the tech, meaning, if you go after the technology in order to try to stop the behavior, then you're ultimately doing it wrong. And that's the problem that I have: people focus on the politics of the technology, and the technology being the problem, versus the people who are doing it.

[00:33:33]

Jarrad Hope

I mean, that's definitely true. I think people tend to look at material artifacts and then try to change them. So with the rise of hate speech and the introduction of hate speech laws right now, they're looking at law as a material tool, yet they're trying to create laws that are inherently subjective, and that never works because it's completely open to interpretation. However, I find it very hard to separate the two. I guess I take a Marshall McLuhan stance: the medium is the message, or, we create the technology and then the technology creates us. With anything that we create, we're ultimately tied to it culturally, sociologically, politically, economically, and there's this back and forth on that front. But I tend to agree that censoring the technology itself is not going to be viable, because it just pushes it, as I said, into these externalities of governance; it goes into a realm where it's no longer under any kind of guidance.

[00:34:49]

Corey Petty

Yeah, you put me in a bind here, because I've been using that phrase for quite a while now in a lot of my talks: the medium is the message. I think we are subject to the artifacts we create, in terms of how we leverage them to communicate socially and group together. I don't think the artifacts, the tools, are political; they're just what we have at our disposal to express ourselves. And the reason we're building the technology we're building within Logos is specifically because the tools we think we need don't exist, and we have a politics or ethics associated with building something different, because we've identified a problem with what currently exists.

[00:35:36]

Jarrad Hope

Right, I get what you're saying, but I definitely take a different opinion on that. Ultimately, the technology we create is a calcification of the thoughts that went into it, so in order to create something that's inherently neutral, there needs to be a form of almost universal morality, or a set of principles that guide it. I think a great example of this might be EIP-4844 in Ethereum, proto-danksharding, the precursor to danksharding, where there was lobbying from large-stake interests that ended up allowing history on the blockchain to be forgotten after two weeks or something like this. That allows Ethereum to scale substantially more in terms of layer twos, but it becomes really problematic when you start thinking in terms of what Jacques Derrida calls the cultural archive, or the archive in general. The archive matters because if you're able to take control over a society's archive, its memory, then you can effectively rewrite its history, separate those people from their historical roots and their cultural identity, and make new generations think something entirely different. Derrida essentially wants to democratize access to the archive, where anyone has the ability to read from, and effectively append to, this immutable data store, and that's how I view blockchains and decentralized file storage to a large extent. Once you start chipping away at that, you open the historical record to a lot more subjective interpretation, which can lead to formations of power that end up rewriting an entire society's history and who they think they are.
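For context on the "two weeks or something like this": under EIP-4844, consensus nodes are only required to keep and serve blob data for a minimum window of 4096 epochs, which works out to roughly 18 days. The constants below reflect the published consensus-layer parameters as I understand them; the snippet is just a back-of-the-envelope check, not client code.

```python
# Back-of-the-envelope calculation of the EIP-4844 blob retention window.
# Constant values follow the Ethereum consensus-layer (Deneb) parameters.

SECONDS_PER_SLOT = 12
SLOTS_PER_EPOCH = 32
MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS = 4096  # minimum epochs blobs must be served

retention_seconds = (
    MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS * SLOTS_PER_EPOCH * SECONDS_PER_SLOT
)
print(f"Blob data must be kept for at least {retention_seconds / 86_400:.1f} days")
# Roughly 18.2 days; after that, nodes may prune blob data, so anything meant
# to persist as a long-lived archive needs a separate storage layer.
```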

[00:38:11]

Jarrad Hope

Yeah. The technology itself is obviously static to a certain extent, but at the same time it's a calcification of the ideas that went into it, and if those aren't enforced, it'll ultimately change as well. At the end of the day, it's just code.

[00:38:31]

Corey Petty

Yeah. The issue that I have with this comes from that stark change, that step function, whatever you want to call it, between the ethics of building, what goes in, this calcification of ideals (I think that's a good way of putting it: technology is a calcification of ideals), and its application, which is different, because you can't fully constrain the use of a technology with the ideals you set out with. When it's used in a manner that is considered bad, and you make a direct connection between its use and its creation, then you're able to use its bad use to justify attacking the people who are building it.

[00:39:24]

Jarrad Hope

Yeah.

[00:39:25]

Corey Petty

If you have this stark separation, it's more difficult to do that, and I think that's very important: you've made a differentiation between the group of people who are out to build something and their motivation, and the group of people who are using it for alternative motivations. That's a very important distinction to make, especially when you look at what we've seen happen throughout the history of technology used this way, whether it be nuclear technology, encryption, blockchains, private blockchains, or privacy technology. There's always this justification of trying to mitigate the behavior of its use, and not being able to go after specific people because they don't have the information, so they go after the people who created it in the first place, which isn't fair.

[00:40:15]

Jarrad Hope

Yeah. I mean, that's like going after gun manufacturers because those guns were sold by some arms dealer to wherever, and then some conflict arose where people died, right?

[00:40:26]

Corey Petty

That's a great analogy, right?

[00:40:28]

Corey Petty

That's perfect.

[00:40:29]

Jarrad Hope

I mean, just look at the crypto wars as well. SSL and all of that encryption technology was severely restricted in many ways, and had to be printed on t-shirts, under freedom of speech, just to get the word out. But now it secures almost every web request that you make, and certainly so with your bank, or whenever you're making a purchase online. The inverse of that would be to expose all of those transactions and make people vulnerable, which is exactly what's happening in blockchains right now: everyone's account balance is completely open, and blockchain forensics companies have come up to address that. And there's an issue when proprietary algorithms and non-open solutions for tracing funds get introduced into law, because at that point it's basically somebody's opinion. But people have an expectation of privacy around their bank account. Why wouldn't they have an expectation of privacy for the analogous equivalent of a bank account on whatever their favorite public blockchain is?

[00:41:49]

Corey Petty

Yeah, I definitely agree with that. People also don't realize that they're beholden to their bank, whereas with a blockchain they're not beholden to anyone, and banks don't like that.

[00:42:07]

Corey Petty

Like the SSL.

[00:42:09]

Corey Petty

Or.

[00:42:10]

Corey Petty

TLS implementation of the internet: you can't have real commerce unless you provide that privacy, because there are a bunch of assholes out there and they'll take your money. Especially when you build an open system where access to information is ubiquitous, the ability to do that is much greater, so the level of privacy required to actually enable commerce is much greater.

[00:42:37]

Corey Petty

Yeah. We could probably go around these topics forever; we often do. But I wanted to at least talk about what we're doing. What's the point of Logos, what are we setting out to do, and how is it made up?

[00:42:57]

Jarrad Hope

Yeah, it's a good question. I think Logos is an attempt to establish a new polity. There are multiple markers showing that the way we currently socially organize is not working particularly well. The Freedom House index, say, shows that freedom in almost all nation states has been steadily declining, certainly over the past 20 years. At the same time, there are issues with the way that we do democracy, and with nation states in general in terms of their legitimacy. There's supposed to be this notion of consent of the governed, yet when you actually dig into consent, most people haven't expressed their consent to be governed. Take the US Constitution: you haven't actually signed it; you're just kind of born into it. And then there are notions of sovereign immunity or governmental immunity, which exempt the actual government from civil liability. In the case of the United States, I think it was 1821, Cohens v. Virginia, where the Supreme Court ended up implying that the government could not be sued without its consent. From then, it has expanded to over 20 million public servants.

[00:44:37]

Jarrad Hope

And different bodies of government are effectively exempt from civil liabilities. So it really comes back to this: you want to protect civil liberties, and political neutrality is a prerequisite for that. But at the same time, in order to maintain that political neutrality, you also need sovereignty in the system against any bias or coercion. And this brings you to the latent cypherpunk dream, which is being popularized as the network state. It's an older idea that has been around for a while in different incarnations, certainly in the 2000s, with Jerry Everard's Virtual States, or the various essays of crypto-anarchists in Crypto Anarchy, Cyberstates, and Pirate Utopias. It's this idea of effectively creating independence in cyberspace, or a territorialisation of cyberspace. That might sound really far-fetched at first glance, but, I forget exactly when, the Department of Defense officially recognized cyberspace as a domain of conflict, up there with land, air, sea, and space. The UN has recognized sovereignty over partitions of cyberspace, and many nations, certainly China, want to uphold their sovereignty in cyberspace.

[00:46:23]

Jarrad Hope

Yet it's probably one of the only frontiers left that also allows you to deploy governing services globally, around the entire planet, and cheaply. So if you can create a system that upholds its own notion of sovereignty in the technology stack itself, or in its infrastructure, then you are effectively challenging the sovereignty of nations over their partition of cyberspace, and you're protecting the citizens who participate in it. And if you can do that against one nation state, then you most likely can do it against all of them. That becomes really interesting on the world stage, which is arguably the only real anarchical society, between sovereigns, the various nations that are capable of defending themselves, and that becomes super fascinating for an ethical rules-based order. I'm going on a bit of a tangent here, but in order to realize some of these ideas, there needs to be a technology stack that upholds this set of principles or ideals, understands why they matter, and encodes them into the technology.

[00:47:39]

Corey Petty

I think that's a great way of framing it: what are the requirements for that to ever exist, and how do we build technology that upholds and meets those requirements? That's ultimately what we're setting out to do, build something that allows that to happen emergently.

[00:48:00]

Corey Petty

We have our own motivations for leveraging this technology to build things, but ultimately speaking, it's free and open for everyone to use. So if we fail, someone else can do it, or if someone does it better, they can. And I think that's the emergence happening: how do you build technology that allows this to happen emergently, such that it's not dependent upon our specific success?

[00:48:27]

Jarrad Hope

I mean, that's why the free culture movement matters, why open source matters, and why documenting everything and making it public matters. If we fail, then if someone sees the value in it and wants to carry on the experiment, they can as well.

[00:48:44]

Corey Petty

Or if they have some alternative idea on what those requirements are, they can take it and fork off and make their own version of it. Right?

[00:48:52]

Jarrad Hope

Absolutely. Yeah.

[00:48:54]

Corey Petty

So I guess for us, that instantiation is, at least for now, three projects, namely Waku, Codex, and Nomos. Can you walk us through those?

[00:49:06]

Jarrad Hope

Yeah.

[00:49:06]

Jarrad Hope

I mean, Nomos, I guess, is our quote-unquote layer one, an agreements layer that intends to uphold privacy at the network level, as well as at the validator level and at the transactional level, so, private smart contracts. Codex is a decentralized file storage system; this is necessary for holding larger cultural archive records, such as blockchain state, for example. And Waku is a decentralized peer-to-peer private messaging framework that allows people to communicate without worrying about a global adversary.
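As a rough mental model of how the three pieces fit together, here is a purely illustrative sketch. None of these class or method names are real Waku, Codex, or Nomos APIs; they are hypothetical stand-ins that mirror the roles described above (ephemeral messaging, durable storage, and on-chain agreement).

```python
# Purely illustrative sketch of how an application might compose the three
# layers described above. These interfaces are hypothetical, not real APIs.

class Waku:
    """Stand-in for the ephemeral peer-to-peer messaging layer."""
    def publish(self, topic: str, payload: bytes) -> None: ...
    def subscribe(self, topic: str, handler) -> None: ...

class Codex:
    """Stand-in for the durable decentralized storage layer."""
    def store(self, data: bytes) -> str: ...      # returns a content identifier
    def retrieve(self, cid: str) -> bytes: ...

class Nomos:
    """Stand-in for the agreements / settlement layer (the 'layer one')."""
    def submit(self, private_tx: bytes) -> str: ...  # returns a transaction id

class VotingApp:
    """Toy institution: gossip ballots, archive them, settle the final tally."""
    def __init__(self, waku: Waku, codex: Codex, nomos: Nomos):
        self.waku, self.codex, self.nomos = waku, codex, nomos

    def cast_ballot(self, ballot: bytes) -> None:
        self.waku.publish("ballots", ballot)       # ephemeral distribution

    def archive_ballot(self, ballot: bytes) -> str:
        return self.codex.store(ballot)            # long-lived public record

    def settle(self, tally_proof: bytes) -> str:
        return self.nomos.submit(tally_proof)      # final on-chain agreement
```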

[00:49:58]

Corey Petty

And ostensibly those are the same three pillars that Ethereum came out with: you need storage, execution and agreement, and then ephemeral messaging, in order to build anything that is reasonably censorship resistant and decentralized. I hate the term "decentralized"; it's a means to an end. But censorship resistance and privacy are the main focus for me, as someone more focused on the technology side of things, and you do whatever is appropriate to get to those things while allowing for an efficient and usable experience.

[00:50:33]

Jarrad Hope

Yeah.

[00:50:34]

Jarrad Hope

Those three protocols give you what's necessary for a decentralized technology stack, so you can create a whole wide range of institutions or decentralized applications, which in turn builds out the economy and the governance of our instantiation of a network state.

[00:50:58]

Corey Petty

We'll probably be coming out with a lot more in terms of things you can dig your hands into, but...

[00:51:05]

Corey Petty

I think.

[00:51:06]

Corey Petty

I guess it stands that I cover a good portion of the technology-building side of this, and I'd say you're more on the political motivation and ethics side: the "this is why we're doing this, this is the problem that we see" part. Would you say that's reasonable?

[00:51:25]

Jarrad Hope

Yeah, definitely. I mean, there's definitely some overlap there, but yeah.

[00:51:30]

Corey Petty

Awesome. That was fun. Anything else you wanted to talk about before we wrap up?

[00:51:35]

Jarrad Hope

Not without going down a rabbit hole. So I think it's good.

Episode host - Jarrad Hope

Produced by - Christian Noguera 

Edited by - Christian Noguera

Logos Press Engine ©2024
All rights reserved.