Hashing It Out

Episode 86 · 2 years ago

Hashing It Out #86 - Ethereum Founder Vitalik Buterin

ABOUT THIS EPISODE

Today we bring on the mind behind Ethereum, Vitalik Buterin. In this episode, Corey Petty and Dean Eigenmann discuss a wide landscape of issues: potential regrets from the initial Ethereum launch, where we are today, how the Eth 2.0 project is learning from the past, and what blockchain is even good for. This is sure to be a good one worth listening to multiple times, so strap in with a pen and pad, and enjoy!

Links:
- Vitalik's blog
- Vitalik's "The Dawn of Hybrid Layer 2 Protocols"

Sponsorships:
- Avalanche Discord
- Avalanche info
- Panvala Staking Pool Instructions

The Bitcoin Podcast Network Social Media

Join Our Slack: Here 

Hey guys, this week's episode is brought to you by Avalanche. Avalanche solves the biggest challenges facing Ethereum's developer and decentralized finance (DeFi) community: velocity, security, and time to finality, under three seconds, on the first decentralized network resistant to fifty-one percent attacks, with complete support for the Ethereum Virtual Machine and all of the tools that have fueled DeFi's growth to date, including MetaMask, web3.js, MyEtherWallet, Remix, and many more. Soon, Avalanche will be at parity with Ethereum for DeFi developers who want a much faster network without the scaling issues holding them back. Get started today, building without limits on Avalanche, by going to chat.avax.network; that is, chat.avax.network. Next, we'd like to take this time to tell you how Hashing It Out is teaming up with Panvala to help fund the Ethereum ecosystem. How are we doing that? Well, the Hashing It Out podcast, with support from Status.im and ConsenSys Diligence, has organized a staking cluster with Panvala, giving us access to a large share of the PAN tokens allocated to this round of Gitcoin Grants. In total, the Hashing It Out community has twenty-nine percent of the pool of about 1.5 million PAN tokens, which is worth approximately fifty thousand dollars. That fifty thousand dollars has been allocated toward the Gitcoin Grants of people's choosing, on top of the large grant allocation by Gitcoin themselves, so you get additional matching by donating with PAN. So how does this work? When you donate PAN tokens to the Gitcoin Grants we've selected as Hashing It Out, your donation receives matching funds from both the Panvala issuance and the Gitcoin issuance. The current multiplier is roughly 5x what you contribute. And these are the grants we've curated for you, the ones we think are important for the ecosystem.
So you don't have to go shuffling through the tons and tons of potential grants out there on Gitcoin Grants; we've selected some we think are important, to help curate your selection process. Those are: the Nimbus Eth2 project, the Blockchain Security Database, Fuel Labs, the Ethereum 2.0 annotated specification, white hat hacking, the Lodestar Ethereum client, the DeFi Pulse Registry, standardized legal goods on Ethereum, and the Solidity Visualizer extension. If you don't remember all that, that's fine: the link in the description has links to all of these things, as well as a link to a cart that lets you immediately add them all and fund them. So how do you donate? First, you need to get some PAN tokens. You can do that using Uniswap; currently, version one has more liquidity than version two. As a note, 28 PAN is worth about one US dollar, and 6,000 PAN is worth about one ETH. Next, go to the Gitcoin link in the description and add the grants you like to your cart, or just follow the link provided in the description that adds them all, and allocate your PAN to the projects you like the most. From there, all the magic happens in the background: you get additional matching from both our Panvala pool and the Gitcoin Grants pool, making your money stretch a little further and helping the Ethereum ecosystem grow faster. Also, a note: at the time of this recording, you've got a little under two days to get those donations in for this grant round. I know that's a little late, but we'll be doing this again going forward, continuing this process and continuing to curate projects we think are valuable to Ethereum, to help you decide how to make your money stretch. Stay in touch; we'll keep this partnership with Panvala rolling to help support the Ethereum ecosystem. Thanks, and enjoy the show.
Welcome to Hashing It Out, a podcast where we talk to the tech innovators behind blockchain infrastructure and decentralized networks. We dive into the weeds to get at why and how people build this technology, and the problems they face along the way. Come listen and learn from the best in the business so you can join their ranks. Welcome back to Hashing It Out. As always, I'm your host, Dr. Corey Petty, with my cohost today, Dean Eigenmann, and today we have a special treat for you: we've got Vitalik Buterin on the show to talk about all kinds of things. We're not quite sure what yet, but we'll dive in in a second. I don't think you really need much of an introduction, but I'll allow you to introduce yourself...

...Vitalik, I want you to tell us who you are, what you do, and why you're here. Yeah, I mean, so I'm Vitalik Buterin, and I'm the co-founder of Bitcoin Magazine. I wrote some articles for that for about two years, I guess. Yeah, I did some other things, you know, like this stint on a blockchain called Ethereum since then, and I do lots of things, you know: thinking, protocol research, all sorts of different topics. So I'm happy to talk about any and all of the things I've been up to. Right on, where to start? So, I guess as a heads up, this is a relatively technical audience, more focused on decentralized systems. We talk a lot about the different protocols and technologies used in Ethereum and other popular blockchains, so you can speak as deeply as you'd like to. If we feel like you've gone too deep, I might bring you back to the surface, or restate what you've said in a different framework to see if everyone else grasped it. So to start off, this is something Dean and I were talking about a little bit before: do you have any, I don't know how to put this appropriately, regrets, or things you would have done differently about Ethereum if you were to start now? What is a decision that you made back then that got locked into the system and maybe constrained you a little too much? Yeah, I definitely have lots of examples of that, some large, some small. So, just to give a sample of a really small, in-the-weeds thing: the fact that we used hexary trees instead of binary trees to store the state. We originally did that so that we could access a storage key with roughly four times fewer database accesses, but it later turned out to be completely unnecessary, and it made witnesses about four times bigger than they had to be.
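The witness-size arithmetic behind that regret can be sketched with a toy calculation (my own illustration, not actual Ethereum code; the key count is illustrative). In a Merkle trie with branching factor b, a proof must reveal the b − 1 sibling hashes at each level, so the hexary trie has a quarter of the levels but fifteen siblings per level instead of one:

```python
import math

HASH_BYTES = 32  # size of one sibling hash in the proof

def witness_bytes(num_keys: int, branching: int) -> int:
    """Approximate Merkle-proof size for one key in a balanced trie:
    depth levels, each revealing (branching - 1) sibling hashes."""
    depth = math.ceil(math.log2(num_keys) / math.log2(branching))
    return depth * (branching - 1) * HASH_BYTES

keys = 2**24  # ~16 million keys, purely illustrative

hexary = witness_bytes(keys, 16)  # 6 levels * 15 siblings * 32 B = 2880 B
binary = witness_bytes(keys, 2)   # 24 levels * 1 sibling * 32 B = 768 B
print(hexary, binary, hexary / binary)  # 2880 768 3.75
```

The ratio works out to about 3.75x, matching the "four times bigger" figure in the conversation.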
And so now that's something that we basically have to completely re-architect and replace with binary trees. There are a lot of little decisions we made like that. Another one, for example, would be gas costs: we didn't really do a good job of setting those at the beginning. So, lots of little technical choices. In terms of zooming out to, I guess, the bigger things behind those decisions: we definitely underestimated how much time it would take to finish a lot of the things that we started back in 2015, with proof of stake and sharding probably being the two biggest. Had we known it would end up taking us as long as it has, we probably would have designed the roadmap differently, thought differently about what the different teams would be working on, and made a lot of decisions in that regard in a very different way from how we did. So there's definitely a lot of regret and wasted time in that regard. Also, pretty much every technical feature that we're trying to implement on the Ethereum side now, whether it's account abstraction or fee market reform: how much easier would it all be if transactions had version numbers, and all of those things? Another example, on the less technical and, I guess, more political-economic front, would be the multi-client strategy. We approached it fairly halfheartedly: we said that, yes, we really want to have multiple clients, we don't want this kind of single-client developer aristocracy...
...that I perceived to be one of the major flaws of Bitcoin. But at the same time, we didn't really do a very good job of it. When we started off, it was basically almost a Geth monopoly right from the first month, and then Parity came in for a bit and things became a bit better, and then, you know, the team stopped working on Parity, and then it went back to one client, and now there are a couple of other clients, but they're still catching up. Also, just another example of something, once again more on this kind of...

...more economic side, and the community side, that also has to do with us underestimating how long things would take: we approached a lot of early decisions with the mindset that we'd be building for four or five years, and after four or five years the thing would have stabilized. Now it's looking like we'll be building for something like ten to fifteen years, and maybe after ten to fifteen years altogether, starting from 2013, we will start to see a reprieve. And so, as a result, one example of this would be the ether distribution. We had this premine of twelve million coins: six million to early contributors, six million to the foundation. Out of the six million to the foundation, three million went to another set of early contributors. So what happened is that something like seventy-five percent of the premine went to people who did work in the first year of Ethereum's existence, well, the first fifteen or sixteen months, but basically. But the reality is that, ideally, the people who really deserve that funding would be spread out across the entire span of years. Now, of course, it would be much more difficult to try to add more premines, or do Zcash-style dev funding or whatever. But had we had more foresight around those issues at the beginning, I think we probably could have avoided a lot of the issues that we have right now. And so, speaking to that foresight: hindsight is 20/20, right? It's easy to see those lessons learned from the friction we have today. And I guess, thinking about that: how could you have seen these things back then? Where were you looking?
Were you optimizing for a particular thing and not seeing these issues at the time? Or were you not thinking about how each individual decision was going to scale? Like, did you not consider when state bloat would actually become a problem because Ethereum was wildly successful; you didn't quite foresee it happening that fast? There are a lot of small things, I think, that are difficult to really blame yourself for not thinking about early on, which turned out to be issues today, and there are also some things that maybe you probably should have been thinking about. Have you thought about that perspective, what you could or should have been thinking about, versus things where you just didn't know better back then? We had Bitcoin as the model, and then Ethereum changed a lot, and now we know. I was definitely guilty of what psychologists call the planning fallacy in a lot of cases, just underestimating how long things would take, and if I were wiser, I think I could have avoided falling into that trap even then. A lot of things, even the state size issues, for example, stem from that. Part of the reason I wasn't thinking about that so much might even have been: oh, we'll have two years of state size growth and then after that we'll move to sharding. The reality is that we'll move to sharding, or even stateless clients, after maybe seven years of state size growth, seven years at a twelve-million gas limit instead of the three million that we had originally planned for. So that's one thing. Another thing would be that when I started Ethereum, I knew very little about even just basic things about starting a project: figuring out who the team should be, and things like this.
So a lot of the initial people who ended up getting very high positions early on in the Ethereum community were basically just the first fifteen people to respond to my email announcing the white paper, or people that I just randomly got to know on my travels in the half year before that. So I do feel like the Ethereum ecosystem has had to wipe the slate clean multiple times on a lot of occasions, and I feel like things have improved a lot in the last two years, but there was still a lot of time that we ended up losing as a result of that. So yeah, I'd say a lot of the technical problems are really consequences of people problems. So what are you doing with Eth2 to avoid making those same mistakes, so that in three years we don't run into the same issues?...

What makes the process more rigorous? Is it the fact that you have all these other client developers now helping you out, or, hmm? So one very deliberate decision that we made with Eth2 very early on was basically saying that the Ethereum Foundation will not be running its own "real" client team. The Ethereum Foundation does have Trinity, but the reason I say "real" in quotes is that Trinity runs on Python, and Python is a way slower programming language than the others, so it's kind of a reference. Yeah, I mean, exactly: it's a reference implementation, and it's not meant to be a performance node that you would run as a staker, and that helps all of these other Eth2 teams be on a significantly more even footing. Also, I would say that we have a much more rigorous research and spec development process this time around. Every single one of the decisions in the spec is the result of quite a lot of thinking, and quite a lot of arguing between, you know, myself, Danny, Justin, Hsiao-Wei, and some other people, and there's more thought going into it. There is more of an explicit goal of simplification going into a lot of the decisions. So we're definitely trying to do a lot in that regard, and there's also just the conscious feeling that, you know, this could be the last chance that Ethereum ever has to make a serious re-architecture, so we just need to do a really good job of it. So you mentioned earlier that one of the main things you wanted in the original Ethereum was a multi-client implementation, and you kind of regret that not happening early. Do you feel like we've almost gone full swing the other way, and we have too many implementations attempting to be part of Eth2? Um, it's definitely possible, though I think that we're not doing that bad a job, right?
So, the Altona testnet launched with four implementations, and they seem to all be doing pretty well. And I do remember that at the beginning the number of implementations people were talking about was huge, something like seven or eight, and some of the ones that didn't end up making it into that set have naturally evolved into something that would be useful in a different context, I suppose. And even Trinity itself: we knew from the beginning it would not be one of the major clients that people would use to validate, but it's a Python implementation and it's still providing value; lots of people use Python and want to access and, you know, poke at the chain with Python. And some of the other implementations, the ones that aren't Prysmatic, Lighthouse, Nimbus, and Teku, are also trying to do some other things. I know the Harmony team, I think, ended up merging with the ConsenSys Java team, so that also collapsed two into one, which helped. So I definitely remember having this fear that we would end up splitting all our efforts among eight different clients, but I think four strong ones is pretty close to the right balance. I'd say so too, in my personal opinion, because they're written in different languages, which opens up the developer pool for building whatever thing they want using their language of choice, all using the same underlying data, which I think is quite important, because it opens things up a little more broadly rather than being specific to Ethereum or Eth2. So: after the amount of experience that you've had in the space, the research that you've done, and the problems you've faced, how do you feel about the blockchain space as a whole?
What are some barriers that you didn't quite understand previously that you see now? Where is this technology actually useful? Because when Bitcoin and Ethereum kind of exploded, we went on this full-scale tilt of "decentralize all the things": let's just revamp everything that we've ever thought of and put it on a dapp or its own blockchain or whatever, and a lot of those use cases were stupid and were never going to work. Where do you see this thing fitting now, and why do you see it actually being useful at the larger social scale? Hmm. So I can briefly go into one of the challenges that I didn't appreciate in terms of this "should we, can we blockchain-ify everything"...

...question, I guess. So, looking at decentralized governance: the thing that I was hoping for five years ago was that we would be able to create optimal decentralized governance algorithms, the same way that we could create, say, optimal proof-of-stake algorithms, and that we could actually create some kind of mathematical-economics construction that would direct money toward, you know, what we call public goods, including protocol development, including these other things; that we could even create a DAO that would be part of the Ethereum protocol, and that this construction would actually manage to not have any really uncomfortable trade-offs in it. And the thing that I discovered as part of my economics research, this was the topic of that article I wrote on collusion about one year ago, is basically this: if you want to make a mechanism for a particular set of functions, particularly anything having to do with public goods, anything that's vulnerable to things like the tragedy of the commons, the problem is that if you give people leverage, the ability to say "I am only sacrificing one dollar in order to give ten dollars to this development team", which is something you have to do if you want to fund anything better than you can with plain donations, which are just totally insufficient, then you run into the problem of: well, what if the developer bribes people to vote for them?
Or what if people create many accounts and collude with each other and pretend to be a crowd of people? And it turns out that these economic constraints are pretty fundamental, and to get around them you pretty much have to have two things: one is some kind of identity, some unique-human verification, and the other is some kind of anti-collusion, the equivalent of what in voting is called coercion resistance, basically not giving people the ability to prove how they participated. And there are a lot of technical trade-offs in there, and in the end, in the design. So basically the problem this leaves us with is that if you want to create one of these public-goods funding gadgets with economics that are anywhere remotely close to optimal, then it comes with these uncomfortable technical trade-offs. And the uncomfortable technical trade-offs are, I think, okay for some things. They're okay for something like Gitcoin Grants, for example, which gets matching funding from different sources, but if it ends up being broken, those sources do have the ability to switch to another mechanism. But it would not be okay at the base layer of a protocol, and it would not be okay for a construction that tries to maintain a pretense of being fully autonomous. So I guess that's one of the unfortunate negative results that we've seen, which definitely makes me a little sad, but also illustrates, I think, the key reason behind some of the failures of trying to create decentralized protocols that decentralize much more powerful functions in the space: the reason why things like delegated proof of stake end up collapsing into being really plutocratic if you try to use them for funding.
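The "leverage" problem described here can be made concrete with a toy quadratic-funding calculation (my own illustrative sketch of the standard subsidy formula, not code from any project mentioned in the episode). Without unique-human verification, splitting one contribution across many fake accounts massively inflates the match:

```python
import math

def qf_total(contributions):
    """Quadratic-funding subsidy rule: a project's total funding is
    the square of the sum of the square roots of its contributions."""
    return sum(math.sqrt(c) for c in contributions) ** 2

honest = qf_total([100.0])      # one person gives $100 -> total stays $100
sybil  = qf_total([1.0] * 100)  # same $100 split across 100 fake accounts
print(honest, sybil)            # 100.0 10000.0
```

The same $100, disguised as a crowd, attracts a hundred times the total funding, which is exactly why the mechanism needs identity and anti-collusion to work.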
You know, the reason why you have these bribing scandals in, like, Steem and Tron or whatever. So yeah, that's one example of a concrete thing that I guess I had hoped would be possible, but now it turns out it's only possible with significant trade-offs, which means we can still do it, and we've done it and it has great results, but you can only do it with caveats, and you can't really do it at the multi-billion-dollar scale by putting it right into the layer-one logic of a blockchain. Aside from that, I would even say that most of the things it turns out you can't do on a blockchain without serious trade-offs end up having to do with either these collusion issues, basically where, if you have an anonymous system, you have no way of proving that the participants aren't colluding with each other, and the economics only works if people aren't colluding, or with the fact that the blockchain just doesn't know things about the real world, and to know things about the real world you need oracles, and oracles become subjective, and then that requires, you know, one of these systems like Augur, and that limits the scale...

...at which they can operate. So in a lot of cases of things we wish we could do, where actually it's not that simple, I think it ends up being one of those two. Because it sounds like, with the digital scarcity that we're making as money, money can't talk by itself; that seems to be the main issue. There has to be something else attached to it from a human perspective, or an identity perspective, that makes the actual human systems we're trying to build on top of this work without these side cases of collusion or malfeasance in whatever you're trying to do. And I think another way of thinking about this is that if you're creating systems that try to compensate for complicated flaws of humanity, then your system has to contain built-in models of what the flaws of its participants are, and those assumptions can easily turn out to be very false. And so trying to make something incredibly general almost makes it so you can never actually build in these more specific systems at the base layer, and they're delegated to something built on top? Right, exactly: you can make very general things for some use cases. You can make, say, Uniswap and it works great, but it serves a very concrete, narrow class of applications; it's not everything. A couple of things: are you happy with the pace of development across the board, or has it been too slow over the years? As I mentioned, it's definitely slower than I had hoped, though I guess my hopes probably were unrealistic.
One thing that we probably could have done, I suppose, and I wonder, this is me thinking out loud, whether this would have been net better or net worse: for the move to proof of stake, switching to some crappy proof-of-stake protocol, like just cloning whatever was in Peercoin or NXT at the beginning, and then upgrading to Casper FFG over time. If we had done that, maybe we would have had proof of stake of some kind sooner, potentially even by now, and then we would have had better proof of stake, like, still a year from now. Basically, I wonder: had we made those kinds of pragmatic choices earlier on, make a simpler thing first, then do the more complicated thing later, might we have had a better system, or generally a better path of progress, than we have today? I don't know. You mentioned proof of stake, and I think it's been argued quite a bit, but I'm in the camp that proof of stake and proof of work are, for the most part, Sybil mechanisms, and that the underlying consensus hasn't changed too much; it just depends on how you put the whole network together and actually reach consensus. And we're seeing, in my opinion, novel consensus come through with Avalanche. Is that something that could reasonably be used to come to agreement on specific types of things? It changes drastically the architecture of the entire network, the way the data is put together, and what you're actually agreeing to, but it seems, from my perspective, like a really good way of coming to consensus, and then you use something else to deal with the Sybil mechanism. I mean, ultimately, I think the Sybil mechanism matters more than the specifics of the consensus mechanism, right?
I think more than half, probably more than two-thirds, of why people are excited about proof of stake is specifically the benefits of switching from miners as the base to stakers as the base, and the things that inherently come with miners being the base versus stakers being the base. A relatively small proportion has to do with things like the concrete specifics of, you know, finality, whether it's two hundred confirmations or six seconds, FFG versus LMD GHOST, any of these details. I would argue that the main point I think a lot of people miss is the separation of consensus from the data that's coming to consensus. With proof of work, the consensus mechanism is very dependent upon what you're agreeing upon, what transactions make it into a specific block, and proof of stake is somewhat similar, yeah, because you can't come to agreement on something until you make the block, which gives you an inherent heartbeat of the entire network. Oh, I see, you're referring to batch-by-batch consensus as...

...opposed to separate consensus on every individual thing. Yeah, I think it pigeonholes you into a specific architecture by making that decision. Yeah, I agree. I mean, though, I do think that, all things considered, batch-by-batch consensus is superior and ultimately it is going to win. I think there are a lot of benefits of the batch-by-batch approach that people don't realize. So, for example, light clients: the possibility of light clients comes from the fact that you have a block header as the stand-in for a block, which is itself only possible if you have batching, right? If instead you just have transactions as your base unit, and you have n transactions and your DAG structure has complexity O(n), that makes light-client techniques fundamentally much harder, whereas in a batch-by-batch based system a light client has a fundamental complexity of O(1) per batch. So that's one example. Another example is the economics: in a system that's not batch-by-batch, what actually makes one transaction win relative to a competing transaction is much less clear, and so the economic analysis of, say, how do you make sure that someone with ten percent of the stake can't do crazy things to give themselves an unfair advantage in getting transactions included, that sort of economic analysis becomes much more murky. And I actually think that the approach of block-by-block consensus, where you basically say, every ten seconds we elect a dictator who chooses the block, and the dictator just has the ability to choose whatever, and the dictator is what economists call the full marginal claimant of transaction fees, so if they get more fees, they get all the revenue...
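The light-client argument here can be sketched as a toy header chain (my own illustrative Python, assuming a simplified header that is just a parent hash plus a body root; real headers carry much more). Verifying the chain touches only headers, so the work per block is constant no matter how many transactions each batch commits to:

```python
import hashlib
from dataclasses import dataclass

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

@dataclass
class Header:
    parent: bytes     # hash of the previous header
    body_root: bytes  # Merkle root committing to all txs in the batch

def header_hash(hdr: Header) -> bytes:
    return h(hdr.parent, hdr.body_root)

def verify_chain(headers, genesis_hash: bytes) -> bool:
    """Light-client check: O(1) work per block, independent of how
    many transactions each body_root commits to."""
    prev = genesis_hash
    for hdr in headers:
        if hdr.parent != prev:
            return False
        prev = header_hash(hdr)
    return True

# Build a three-block toy chain; each body_root could stand in for
# any number of transactions without changing the verifier's work.
genesis = h(b"genesis")
chain, prev = [], genesis
for i in range(3):
    hdr = Header(parent=prev, body_root=h(f"txs-{i}".encode()))
    chain.append(hdr)
    prev = header_hash(hdr)

print(verify_chain(chain, genesis))  # True
```

Without batching there is no header to stand in for the block, and the verifier's work scales with the number of transactions instead.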
...there's a lot of economic simplicity in that kind of model that I think a lot of people are missing, and it removes a lot of nasty edge cases. Then what's more fundamental here when you're talking about these systems? Because, I guess, the innovation in a lot of ways, especially for permissionless public blockchains, is a digital scarcity that people can rely upon, or have more reliance upon. Does the economics become the most fundamental thing to that system, or is it the underlying technology that facilitates the economics? The technology is kind of both: it facilitates the economics, and it is economic, right? A lot of the applications that happen on top of these platforms are clearly economic in nature, but the reason why, you know, miners or validators are honest in the first place, instead of just shutting down or trying to attack the network or whatever else, is ultimately because of economic arguments, right? It's because you have these rewards and incentives and all this stuff. So you can't really evaluate the technology in isolation from the economic aspects of what makes the technology work. Sure. So, if we move back to the idea that we all kind of agree development in this space is a little slow: I think from your standpoint, Vitalik, that came mainly from layer-one development, specifically on Ethereum. How do you feel about development on top of Ethereum, though? Do you think that's at a good pace? Because you also made a good point about DAOs, where I feel like everyone kind of realized that those may not work that efficiently either, because last year everyone was talking about DAOs, and now the thing we're talking about is no longer DAOs, it's DeFi. Um, I think we can take a case-by-case approach to this.
Like, I think it really does differ depending on which vertical you focus on. So, for example, DeFi — obviously, you know, way more action than I had expected there would be. Before you continue there, I think it's just because we gave DeFi a name. I mean, we've been talking about it the entire time; now we just have a more concrete, understandable name for it. Yeah, and there are different aspects of DeFi, though, and, you know, we have the aspects of DeFi that we did predict back then: we have stablecoins, we have Dai, we have fairly good decentralized exchanges, and we have a lot of that kind of simple stuff. And to me, like, over eighty percent of the value of DeFi is...

...basically, you know, stablecoins — or synthetics more generally — and decentralized exchange. I think those two things combined are by far the most important part of that entire space, and then everything else is much less important relatively, even though in the short term it can give you, like, a hundred and forty percent interest rates or whatever. You know, people get excited about that for a week before they forget about it, because the interest rates disappear. Subsidies. Yeah, some subsidies are great. So, okay, outside of DeFi, let's just walk through it thing by thing. Privacy tech: at the beginning, slower than I hoped, but I think it's massively picked up over the last year. So we have Tornado Cash, and that's a simple, dumb thing you can actually use, which I think is really important. Umbra, which is stealth addresses on Ethereum — that got done at a hackathon and they're continuing the work now, and it's going to be releasing soon. There's also the Aztec team, which has been doing a lot of work on creating a much fancier kind of privacy-preserving layer two on Ethereum. There's Zexe, and there are at least a couple of Tornado Cash alternatives — I even forget their names now. So a lot of things, large and small. I mean, zero-knowledge proofs in general have made huge progress in the last year — what I call the polynomial commitment revolution. This is an academic shift that happened in September last year, where people realized that you can just use polynomial commitments as a base ingredient and then describe everything as computations over polynomials, and you can make these really powerful, really generic constructions. That sped up development in the space by, like, a factor of ten.
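The "describe everything as polynomials" trick rests on a simple fact: two distinct low-degree polynomials can agree at only a few points, so checking an identity at one random field point is almost as good as checking it everywhere. Here's a toy sketch of that random-evaluation check over a prime field — this is the Schwartz-Zippel idea underlying the real schemes, not an actual polynomial commitment, which additionally needs pairings or Merkle-based openings so the prover can't change the polynomial after the fact.

```python
import random

P = 2**61 - 1                       # a Mersenne prime; real systems use curve-friendly fields

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients low-to-high) at x, mod P, via Horner."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def poly_mul(a, b):
    """Schoolbook polynomial multiplication mod P."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

# Prover claims the identity C(x) = A(x) * B(x) holds as polynomials.
A = [3, 1, 4]                       # 4x^2 + x + 3
B = [1, 5, 9]                       # 9x^2 + 5x + 1
C = poly_mul(A, B)

# Verifier samples one random point and checks the identity only there.
# If C != A*B, the check passes with probability at most deg/|field| (tiny).
r = random.randrange(P)
assert poly_eval(C, r) == poly_eval(A, r) * poly_eval(B, r) % P
```

A commitment scheme lets the verifier get those three evaluations without seeing the full polynomials, which is what makes the constructions both succinct and generic.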
And now we're talking about, you know, Plookup and these zero-knowledge VMs and a lot of stuff. So huge progress there. Smart contract wallets — I keep wanting them. Less progress than I hoped for, I think. Yeah, I worried about those a lot too. I think there are probably two reasons behind that. One is, of course, that smart contract code in general has proved to be harder to secure than we had expected — though there are smart contract wallets that seem solidly secure by now, right: there's Gnosis Safe, there's even the original multisig that held the Ethereum Foundation's five hundred eighty thousand ether, and there are a couple of others. But the other big part behind it, I think, is that smart contract wallets are just fundamentally more difficult to work with, because instead of sending a transaction directly from your account, you're sending a transaction from one account into the contract and then into another thing, and it starts getting complicated. Even the privacy technology, I think, was really held back by the lack of account abstraction for quite a bit of time, until we figured out how to work around that with the gas station network and so forth. So yeah, smart contract wallets: definitely less than I had hoped. DAOs — and there are definitely some people doing them — probably a bit less than I had hoped, but maybe about as much as I had hoped, given that people are less interested in, you know, launching tokens and coins in general than they were two or three years ago. Looking at layer two scaling protocols — I definitely admit to having huge egg on my face for not pushing the idea and the importance of rollups two years sooner than I actually did. Like, the idea was there.
I mean, there was even that thing called shadow chains on my blog in two thousand fourteen, but somehow I just did not realize how important it would be. But now, you know, ZK rollups: huge progress — we have two of them on mainnet. Optimistic rollups are making huge progress, and those can support full smart contracts. So lots of, I think, good work on those. Plasma — I think there was a big hope at the beginning that plasma could eventually be made generic into a whole VM-type thing, and the reality ended up being that... I wrote about this in a blog post — I think it was from August, it's on vitalik.ca — called "The Dawn of...

...Hybrid Layer 2 Protocols." The idea there is basically that the problem is that plasma chains fundamentally rely on certain assumptions about the thing that they're securing, and on the responsibilities that people have with regard to the things being secured, and so plasma just can't fundamentally be made generic in the way that we had hoped. But at the same time, plasma can be really useful and valuable for specific applications, right. And OMG, for example — you know, it started off with a lot of excitement, then a lot of people kind of gave up on it, and now it's back, it's actually going to launch soon, and it seems like it's going to do its thing, which is great. I just think there was maybe a little bit of naivety about how general it could be, and then, through the research of actually trying to implement it and make it work, we came to the realization of, oh, it can do this thing and just that — so if we make a system that only needs to do this thing, then we can use it for this particular behavior. Same thing with state channels and that kind of evolution, and so on and so forth. And I think those were all precursors to understanding the relationship you're trying to engage in with whatever product you're trying to build, and using the appropriate technology for it, as opposed to saying: we have this technology, it's supposed to do everything. Exactly. I think that's definitely very fair. Hmm. Are you sad about plasma's fate? Because it was kind of one of your brainchildren, along with Joseph Poon, right, and you guys invested a lot of time in it. Given that it died — I mean, it's practically dead now — do you think we got enough useful research out of it to justify the amount of time and the number of people that were working on it?
How do you feel about that? I think the thing that I'm happy about is that I feel like we've explored and mapped out the layer two design space to the extent that we now have a comprehensive picture of it in a way that we didn't before — a comprehensive view of what the fundamental categories of designs are. So, for example, I feel confident in saying that plasma and state channels are basically the two categories of full layer two systems, and there isn't some missing third category. And the mathematical characterization there is: if you have O(n) stuff off-chain, then if you also have O(n) stuff on-chain, it's not a full layer two — that's a rollup; if you have zero stuff on-chain, it's a channel; and if you have O(1), or really O(log n), stuff on-chain, then it's a plasma. And you can actually concretely say what specific properties and what specific limitations those three things have, and within rollups we understand basically what the two flavors are at this point as well. So I think that's been good. And plasma is definitely not completely dead — as we said, OMG is actually being a thing, and it's definitely likely to be the correct architecture for at least some applications. I do think that the per-application complexity of building a plasma system means that plasma will be used for a couple of very specific individual applications that are high-value, and the long tail is all just going to jump onto an optimistic rollup that copies the EVM.
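That taxonomy — keyed purely on how much data each off-chain operation puts on chain — can be summed up as a lookup (the naming here is a paraphrase of the discussion, not an official classification):

```python
def layer2_category(onchain_data_per_op: str) -> str:
    """Classify a scaling design by its per-operation on-chain footprint.
    'O(n)'  -> rollup: all data goes on chain, execution moves off chain
               (not a 'full' layer two in the data sense)
    'O(1)'  -> plasma: only periodic commitments land on chain
    'zero'  -> state channel: nothing on chain in the happy path
    """
    return {
        "O(n)": "rollup (ZK or optimistic)",
        "O(1)": "plasma",
        "zero": "state channel",
    }[onchain_data_per_op]

assert layer2_category("zero") == "state channel"
```

The argument that there's no missing third category follows from this framing: per-operation on-chain data is either zero, sublinear, or linear, and each regime is already named.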
So one of the issues with layer two that was pointed out from the very beginning of working on state channels and plasma — I remember discussing it in, I think, two thousand eighteen with the guys at SpankChain when they were starting their state channels — is: how is interoperability going to work? Do you see people now working on wallets that support generic state channels and such? Because for me, it feels like I've never used any layer two solution yet, and probably one of the reasons is that it's just not accessible in any way. Yeah, I think this is a very valid criticism, and I think it has not been adequately resolved yet. And even outside the context of state channels — even if you look at rollups, for example — I've been doing a lot of pinging of specific users and saying, hey guys, you know, you're four percent of Ethereum gas usage and you're just doing a bunch of ERC-20 transfers; can you please move onto a rollup? What would it take to move you onto a rollup? It would save yourselves and us a lot of gas money. And the lack of wallet...

...support is often a major issue. Right. Okay, so realistically, I do think we're going to have to just kind of bludgeon our way through that problem somehow, and we are starting to. For example, I know Loopring is coming out with their own wallet that will, of course, natively support their own rollup. And on the application side, there are a couple of applications that just do payments that are interested in moving to rollups, and so whatever wallet they work with will just have to come up with some kind of solution in the next couple of months. I do think what this means in the short term is that the thing that's easier to scale is operations that happen within the context of a particular application, because then the UI for the application itself can just contain the scaling logic, right. It's like, you know, when you're on Loopring and you trade on the Loopring DEX — that's all happening inside the rollup, and the whole point of a DEX is that if you have a lot of activity, the bulk of the activity is inside the thing and not on the edges. But a lot of other applications have activity that's on the edges, and that's more annoying, because it means you have to upgrade the application at the same time as you're pushing the users onto rollups. In general, on the state channel side, I expect state channels are primarily going to be valuable within the context of specific applications.
Like, I saw there was that state channels demo that got released recently by one of the state channels teams for using channels to pay for streaming, and that's the sort of thing you totally can do just with MetaMask within the context of a single application. So that's another example of something I'm less worried about. So yeah, I'm definitely more bullish about channels for streaming payments than I am about, say, a Lightning Network-type thing being used as a primary way for people to pay each other. Payments as such are easy to scale technically, but harder to scale socially, precisely because the users of payments are not conveniently concentrated within one particular application community. It's something that we'll just have to figure out. Yeah. I think with optimistic rollups, for example, which are a pretty simple construct to get scalability going and, as you said, would allow a lot of these things that are using a lot of gas to move off-chain — I feel like one of the problems as well is that there's no developer incentive, because right now it doesn't seem like there's a framework that I can spin up in half a day where my users just get all this. It seems like there's still a lot of legwork I have to do as a developer. Do you think that's one of the issues? Very true, yeah, I think very true. Today, optimistic rollups are definitely working in the direction of trying to resolve that: they're trying to make themselves maximally EVM-compatible, to make it a kind of one-click compile to push your application onto a rollup. I mean, one of the big reasons I'm bullish about optimistic rollups is precisely that they're the solution that requires the least per-application developer effort out of pretty much anything we've seen.
But that's also part of the burden of being a developer: understanding what you're trying to build and the appropriate technology for building it, right. I guess if you're comparing it to the model of web development and Node.js today, that's not very fair, because there we've had decades of framework building and layers of abstraction, where you can npm install whatever the hell you want and it just kind of works out of the box, relatively speaking. This is such a new technology that we haven't had the time to build those frameworks, or the knowledge of which piece of tech fits where depending on what you're trying to do. So like you said, yeah, it's a problem now, but it's a growing pain — something that inevitably happens once you have a piece of technology that works, an understanding of it, and then the development experience of making it accessible to everyone around it. Thinking back to something you said earlier — wallet functionality is maybe lacking, or wallet development is lacking, in terms of bringing the user into an application — I would argue, and maybe this is from my perspective...

...of working at Status and having a wallet inside our application, that infrastructure as a whole across the ecosystem is relatively poor. The ability to pull information from a decentralized standpoint is relatively difficult — much more difficult than I would hope it should be, given the ethos of any peer-to-peer system. So what do you mean by pulling information here — getting information from the blockchain efficiently? We have very few ways, especially for historical information, for that matter. Right, this is very true, and it's definitely one of the things I just keep hoping we can put more resources into. The thing with fetching historical information is that it should be trivial to make even just a crypto-economic protocol for it, or even just a very dumb state channel payment solution: you know, I ask for one receipt of something that happened in the past, you send it to me, I pay you some really tiny amount. There are a lot of ways to do it, and it's vastly less complex than, you know, full-on sharding or anything consensus-layer — and it's still just not something that we've done. Yeah. If I were to think about why, it's because we kind of have a single implementation on eth1 and not a tremendous amount of resources going into expanding what that information can do, beyond maintaining it. This is true. And another thing is definitely that eth1 is, in a lot of ways, more complicated than it needs to be — even just simple things, like the Merkle branches being these really complicated Patricia tree branches with RLP and hexary nodes and other things. Yeah, there's fairly little functionality there, and there's also fairly little work being done on a lot of the secondary components that would be needed for these markets to operate.
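That "very dumb state channel payment solution" — one signed IOU per historical receipt served — can be sketched in a few lines. All names here are hypothetical, and a real version would use ECDSA signatures plus an on-chain channel contract where the server cashes the latest voucher; HMAC with a shared key just stands in for signing.

```python
import hashlib
import hmac

PRICE_PER_RECEIPT = 1               # tiny amount owed per historical record served

class Client:
    """Pays for history by signing a monotonically increasing total owed."""
    def __init__(self, key: bytes):
        self.key = key
        self.total_owed = 0

    def voucher_for_next(self):
        self.total_owed += PRICE_PER_RECEIPT
        sig = hmac.new(self.key, str(self.total_owed).encode(), hashlib.sha256).digest()
        return self.total_owed, sig  # server only ever needs the latest voucher

class HistoryServer:
    """Serves old receipts, one per valid, strictly increasing voucher."""
    def __init__(self, client_key: bytes, archive: dict):
        self.client_key = client_key
        self.archive = archive
        self.best_total = 0

    def fetch(self, block_number: int, voucher):
        total, sig = voucher
        expected = hmac.new(self.client_key, str(total).encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(sig, expected) or total <= self.best_total:
            raise ValueError("invalid or stale voucher")
        self.best_total = total      # redeemable for `total` when the channel closes
        return self.archive[block_number]

key = b"shared-channel-key"          # stand-in for a real signature keypair
server = HistoryServer(key, {100: "receipt for block 100"})
client = Client(key)
assert server.fetch(100, client.voucher_for_next()) == "receipt for block 100"
```

The point is how little machinery is needed: no consensus changes, just a signed running total exchanged per request.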
So, for example, even something like peer-to-peer networks that are just dedicated to passing information around, right — something that could make state channels work much more easily. If, you know, I'm online and I have one address, and you're online and you have another address, how do we pass information between each other — information other than Ethereum transactions? Being at Status, I know you've been working on Whisper, and your chat is just increasingly good — great job on that. But standardizing that kind of network across the ecosystem, and also just solving some of the problems, like figuring out a good spam prevention strategy for it, would unlock so much in terms of what we can do. I don't know — we can politically push for more funding going to teams that are trying to solve this problem, we can work on standardizing what exists, we can work on improving it technically. Hmm. I mean, one question to ask is: what's an example of a success story — something that used to suck in the ecosystem and now no longer sucks? What would your answer to that question be? No one has asked me that. Go ahead, Dean. From a developer standpoint, I'd probably say something like Truffle — but that's developer-to-developer. Developer tooling has grown tremendously. And that's exactly because of the number of people using it. So that might beg the question: are there just not enough people using this other stuff to justify the financial investment? It's us complaining about it, but is it really that much of a problem? Let me give you a perspective here that maybe helps explain why this is the issue.
When Ethereum was created, Solidity was created in such a way that it brought in a very large pool of people by looking like JavaScript, right. That was a very ubiquitous language, it's very easy to grasp, people had a very good intuition for how it worked, and it brought a tremendous number of people into creating smart contracts and developing applications on top of Ethereum. But what that does is drive a lot of people to one specific part of Ethereum, and none of that part is the actual infrastructure — it's just building things on top of Ethereum. So in no way, as we scaled Ethereum and brought it to popularity, did we attract people into building out the infrastructure that serves the majority of the people...

...coming in. Because none of that stuff is friendly to developers — something like JSON-RPC caching and building out infrastructure is not fun in any way, shape, or form. Yeah, this is true. And the Ethereum protocol itself is definitely opaque in a lot of ways that I don't like and that we're trying to improve upon with eth2. Has that been brought up in the process of developing eth2 — how do we make infrastructure better, more widespread, and more easily accessible? Because when you think about the early days, the ideal is that everyone has their own node, or at least a lot of people have their own node, and as this thing scales you're slowly moving towards a more and more centralized set of people who serve data to those who need it and are building on top of it. You don't want that to happen. How are you making decisions now that don't lead to the same conclusion? Hmm. I feel like there is a general ethos of trying to make the different parts of the protocol more legible to people that are trying to interact with it in different ways, but coming up with very specific instances of that is a bit trickier. I can jump in there: at least with eth2, there was a very long discussion on whether JSON-RPC should continue to be used for the APIs versus REST APIs, and the conclusion was, you know, we should probably switch to REST, exactly because it addresses these issues — it's easier to load balance, it's easier to cache, and all these things.
So in that aspect — and this probably comes from the benefit of having more client teams working on eth2, and having people who are more developer-oriented working on eth2 — everyone working on eth2 right now has probably had to work with JSON-RPC and has probably also had to work with a REST API, and knows the latter is that much better. That's definitely true. Cool. One other example: I feel like with SSZ, the number of lines of code you need to write to understand one specific thing about one specific part of the protocol is lower than it was with RLP. There's a bunch of little things like that that we're trying to do. And then there are also layers of the protocol other than the consensus layer, and those are a bit further away from me personally, but I do feel like in the community in general — I mean, JSON-RPC versus REST is definitely one example — there are probably other examples too. That's something I wanted to bring up now that you mentioned it. We're running out of time — I'm not sure how long you'd like to go — but in the early days, both in the microcosm of Ethereum and in blockchain in general, it was much easier to stay abreast of all the things that were happening. As time goes on, that has become utterly impossible. There's no way to be a domain expert in the entirety of blockchain, or the entirety of Ethereum for that matter, as hard as you might try — at least not while having a social life and doing all the other things you'd like to do as a human. So what advice would you give to someone who'd like to jump into the space, and how do you approach this yourself?
Where do you spend your time, especially as kind of the lead scientist of Ethereum and the deified entity you are in the space? I think it's the same thing that happens to any other academic field that starts small and grows over time, right. At the beginning you can kind of wrap your head around everything, and then there are always, like, five or ten people that are in the right place at the right time who end up inventing fifty-five different things, and after that the low-hanging fruit gets picked and you have to either specialize and focus on a few areas, or understand the big picture — or, realistically, specialize in a few areas and understand the big picture of all the other parts at the same time, which I think is still something that can be done. Though the amount of cross-communication — the load of listening to what people are working on across different channels — has definitely increased a lot over the last two years, and mostly in a good way.

And I think the other good thing is that we've made progress on making the different parts of the ecosystem more legible to people — updates on the Ethereum blog are one example, and the threads on the various corners of the internet, you know, ethresear.ch and Ethereum Magicians and so on, are another. So there are definitely places you can look to understand a lot of the technical things that are moving forward. But at the same time, there definitely is this big load of information that you just have to get to know over the course of one or two years. I remember even when I started getting into Bitcoin, back in two thousand eleven through two thousand thirteen, I started off not really understanding much, and then I joined, you know, Bitcoin Weekly and then Bitcoin Magazine and started writing about the things that I did understand — often at the beginning more the social things than the technical things — and then you just get more experience with the system and learn more over time. You definitely should not expect it to be fast, but it definitely does end up happening. So what do you do in your free time? Like, all that vast free time. That's true — that's a funny word for you. What interests you outside of this? What do you like doing that has nothing to do with this entire space? I read a lot of things. There's a bunch of books — the most recent one I started is The Enigma of Reason, which someone recommended to me, so I'll see if there's anything interesting in it. I'm about ten percent of the way through so far.
It feels like it's just The Elephant in the Brain with less cynicism, but I guess we'll see if that holds up for the rest of the book. Yeah, I guess a lot of just reading, thinking, and writing about a lot of these Ethereum-adjacent things. Dean, anything else? No, I think we've almost crossed everything off. I could do this for hours, so I could keep going, but I don't know how much longer Vitalik wants to. No, I mean, I still have some time. Okay, then I'll jump back into something I had noted down before, when we were talking about how getting data from the blockchain is close to impossible: it seems like constructions like optimistic rollups will only make that harder. Yeah, I would probably say that. At the same time, there are efforts trying to make things easier — even things like Incubed, for example, these kinds of crypto-economic light clients — people are trying to whack at the problem and get to the point where there is a solution of some kind. I do think that to properly solve this get-data-from-the-blockchain problem, ultimately the solution will have to be either centralized and subsidized, or it will have to be paid — and paid is more sustainable in the long run. So paid, and basically, I think, state channel payments. But it does mean that we need to actually have the infrastructure, and it does mean that people need to be prepared to basically pay fees not just for sending transactions, but also for reading certain kinds of information. I do think there should be a free tier — I think reading the information needed to just sync up with the chain and verify it should be free, because how do you even talk to the chain to make your payments until you've verified it?
It basically but accessing history. That's obvious way like not something that we should stop considering as being core functionality in the context of an optimistic role of like a lot of these things end up boiling down to some form of like basically pay for mercle paths and pay for mercle paths like they you know, you have the Hash on chain and you have and like you want a Marcle path with some index. That's a primitive that we can make fairly generic protocols for. And so if we kind of build those protocols once, then we could end up making a lot of progress in and just like making them re usable. I actually have some pretty confident...

...that optimistic rollups will end up being a catalyst for making all of this stuff work well, right, because one of the properties of optimistic rollups — and of what I think people want to do with them — is that they don't want everyone to run one of these full nodes that actually verifies every one of these transactions. I think they expect the sequencer and a few other nodes to have full state access, and the rest of the participants to be basically light clients. And if we can get that structure figured out, then I feel like we can make a lot of progress. So this state access you're talking about is a general problem across computation, right — where does the data live, and how do you access it efficiently with strong proofs — and that's not necessarily only a part of blockchain. So you feel that optimistic rollups are a good generic way to solve that problem, thus giving us a lot of the scalability we're looking for? Optimistic rollups are one case of the problem, and they could be a good catalyst for making good solutions to the problem, I would say. Okay, so that may end up becoming something like plasma did, where it was a great idea, but in the process of implementing it we learned a lot of things which then led us to a more general solution, and eventually we'll have: this is how you do the specific thing of data access, with this amount of validity guarantee. I think so, and I think optimistic rollups will be the thing we use at the consensus layer, but the infrastructure around them — the infrastructure around how people do data access, for example — is definitely something that just needs to go through a couple of rounds of evolution.
Sharding is, I think, also going to be another catalyst, because sharding is fundamentally about almost everyone being a light client with respect to almost the entire system, and so it will just force that infrastructure to be developed. Those are doozies. Do you see issues coming up with sharding in terms of cross-shard communication and the difficulties involved with that that we may not foresee as it stands today? How do you feel about that part? Because as you break up the state of the underlying blockchain, and that state is spread across multiple shards, that adds a tremendous amount of complexity to something that's supposed to be — I guess I would call it — the simplest part of a system like this. That's true, I think. Well, a lot of it depends on the application. There are a lot of applications that will be able to mostly keep doing the same thing they do today, and there are a lot of applications that would have to change. Another thing is that the challenge is relatively much smaller if we're talking about creating a massively scaled ecosystem where each individual application only has fifteen transactions a second, versus creating individual applications that scale past fifteen transactions a second. Because within applications, a lot of the time the simplest way to do things is to just make everything serial, right, and if you want to improve on that, you have to parallelize, and parallelizing is hard. But across applications, in a lot of cases — not in all cases — there's less of an assumption of serial communication being required.
There are these cases where people expect to be able to do these weird, fancy flash loan things where they do like seven different things at the same time, and they collect like four dollars and twenty-five cents of arbitrage profits and pay three dollars and sixty cents of transaction fees for it. And I definitely think that a lot of those markets are going to become somewhat less efficient as we move into sharding. But there are a lot of applications that I think would be able to survive. Even just token transfers, you could do those. Even, I think, Uniswap: you can transfer into the Uniswap shard, swap, and transfer back, and a lot of other DEXes would work the same way. Even things like stablecoins, you can move either the tokens or the CDPs between shards as you want. So there are a lot of applications where I think there's actually not that much complexity increase at the application level. But at the same time, that's definitely not true in literally all cases. I want to make an analogy that's kind of near and dear to my heart to what you just said, and I...

...want you to tell me if you think it's an appropriate analogy, and that's computation. As hardware has scaled to multiple cores, the software built on top of it pretty much remained serial applications for the longest time, because parallel computing is hard. For the longest time, as we built better and better processors for computers, it was just a single thread or single core, and you built software that utilized how fast that core could go. And then we hit a physical limit on how fast that could go, and we started keeping up with Moore's law by just adding more and more cores onto a single processor and spreading the workload across them. But for the longest time, even though the hardware was capable of doing much more processing, because it had many more flops or clock cycles on the die, the software couldn't keep up with it, because people didn't understand the paradigm of splitting up workloads appropriately, and so you had this lag of actual benefit in the software, because people didn't know how to utilize the hardware appropriately. And I'm seeing a similar thing with sharding: as you shard, applications may lag behind, because simple applications still work on a given shard serially, because that's kind of how we think about things, and over time we'll learn how to build applications appropriately, to utilize shards and break up that work appropriately, and that's just not here yet because we haven't really had the opportunity to do it. That's... yeah, that's possible.
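The hardware analogy maps onto a few lines of code: the same aggregate computation done serially, then partitioned into independent chunks and distributed to workers. This is just an illustrative sketch (names invented, and in CPython threads won't actually speed up CPU-bound math because of the GIL; the point is the structure of the split, not the speedup):

```python
# Serial vs. parallel: only independent chunks parallelize cleanly,
# which is the same reason sharding favors applications with little
# cross-shard (cross-chunk) communication.
from concurrent.futures import ThreadPoolExecutor

def work(chunk):
    return sum(x * x for x in chunk)

data = list(range(100_000))

# Serial: one "core" does everything, in order.
serial_result = work(data)

# Parallel: partition into independent chunks, fan out, then combine.
chunks = [data[i::4] for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_result = sum(pool.map(work, chunks))

assert serial_result == parallel_result
```

The hard part, in both multicore software and sharded applications, is finding a partition where the chunks genuinely don't need to talk to each other mid-computation.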
Like, I think what might end up realistically happening is that, say, every application that needs to talk to other applications will just be on shard one, and then all the activity that's not of that sort will just move to random other shards. And that kind of architecture might be able to get us all the way up to, say, a hundred TPS, and then it'll be a very gradual process. The gas prices on the shard where everyone talks to everyone will start off low and go up really slowly, and so individual developers will just start facing the incentive over time, and we'll just see the work being done over time to try to make more individual applications capable of running in an asynchronous context. So I don't necessarily even think that this is something that's going to suddenly hit everyone all at once; it's a transition that should be relatively easy to ease into. So, looking at sharding and all this stuff which is coming in eth2, it feels like eth2 is going to be a far more complex beast to understand in comparison to eth1. Does that matter? Is that a problem? Like, now, learning about eth1 and how it works probably takes a week, and once eth2 is completely launched, it might take a developer a month to properly understand how everything works. Is that a problem? Because it feels to me like the set of people who understand eth2 is quite small. That's... yeah, that's fair, and I think this is a big part of why, you know, we value simplicity so much in the eth2 design process.
Basically, you know, it's one of the design principles right at the top of the document, because we know that creating a scalable blockchain that does not depend on everyone processing everything is already pushing complexity up to a pretty high level, and we can't afford any more. And so we're really trying to push everything that we can down to being as simple as possible. And there are things that are being simplified, right? So, for example, the Merkle trees are being simplified. I guess the serialization is simpler than RLP in a lot of ways. We don't really have uncles, and most of the complexity of eth1 is at the VM level, and eth2 itself is not adding an entirely new VM, which is good, I feel. And we've even abandoned some of our more expansive attempts to revamp the VM at the same time as revamping other things. And at the same time, look, we've done a lot of things to try to make the protocol more understandable to people. So I guess the answer is: yes, it is a problem, and I think on net it's probably worth the tradeoffs, given that the tradeoff is basically freeing people from having to pay seven dollars in transaction fees...

...just to do a couple of simple things that they do today. But given that we are making the tradeoff, we do have a responsibility to try to make it as easy as possible within the context of sharding happening. If it works, does it matter? And this is, I think, more of a philosophical argument: if eth2 works appropriately, and people are able to do the things we'd like them to do with the guarantees and ideals that we set out to deliver, does it matter if people understand it? Like, how accessible do we need to make it? Because if the people who are working on it understand it, are able to discuss it, move forward, and do the things that matter, do we, should we, have to care if it's accessible to everyone? And of course, on the far end, no, not everyone's going to get it. But where's the threshold? How far do we have to go, and why? Definitely as far as we can. Like, I think a big part of Bitcoin's appeal, for example, is the fact that you can understand how the protocol works even as a high school student, and that's something that not a lot of modern technology has. I think if you deviate too much from that spirit, then you start creating a system where it's just this spooky magic thing and only thirty-nine people get it, and then you start getting into: why should I even trust the thing? Right? A protocol only thirty-nine people understand is really just another kind of trusted setup. So that's something that we want to avoid.
And also, you need people to understand it enough so that, for example, people who are sufficiently disconnected from the original development process, so they're not going to have the same biases and might have different biases, would be able to just look at it with fresh eyes and see if they find any bugs, or even any properties of the protocol that have bad consequences that could be fixed. I think that's a really important point, because for the longest time we lacked the threshold of legitimacy, across the entire blockchain ecosystem, for professionals in various fields and with various degrees to come in and weigh in. And if you have this complexity problem where weighing in is very, very difficult, they're never going to do it. That's... yeah, I think that's very true, and making the protocol simple is definitely really important to analyzability. Though, in addition to the protocol itself, there's also the presentation of the protocol. I think protocols can be presented in ways that are more simple versus more complex, and we can talk about different parts of the protocol in isolation. So you can split it up into: okay, here's the proof of stake, and here is the LMD GHOST and the finality, and then here's sharding and the committees, and here's data availability proofs. And if a protocol can be factored out into mostly separate components that each have a specific function, then that tends to be simpler to grok than if you have really complicated cross-dependencies everywhere. So that's another thing that we try to keep in mind. Great. I think that's a great way to wrap this up. Dean, anything else? Nope. All right, so, Vitalik, is there anything that you would have liked me to ask you that I didn't get around to asking you? I can't think of anything.
All right. Well, definitely thank you for coming on. We appreciate your time and hope to have you back. Keep up the good work. Thank you. Goodbye.
