Hashing It Out

Episode 92

Hashing It Out #92 - DefiSafety - Rex Hygate

ABOUT THIS EPISODE

Today, Corey and John talk with Rex Hygate, head of DefiSafety, a company seeking to perform process quality audits on smart contracts in the Ethereum community. This novel process is an attempt to bridge the security gap between the community and the projects they interact with.

Links: DefiSafety

Sponsor Links

The Hashing It Out Social Media

Slack

Donate to Hashing It Out

Discuss

New RSS Feed

Hey everybody, got an awesome episode for you today, but first a word from our sponsors, as well as a little of the administrative stuff going on in the back end of Hashing It Out and things you can become a part of. So first off I'd like to thank our sponsor, AVA Labs. Avalanche, the highly scalable open-source platform for launching decentralized financial applications, recently raised about forty-two million dollars through a public sale and is now gearing up for its next milestone: the launch of its mainnet on September twenty-first. That's right, they're launching their mainnet on September twenty-first, so get prepared. Also, to bootstrap their ecosystem, Avalanche opened up a bunch of new grants for developers who want to build high-performance DeFi (that's decentralized finance) applications and infrastructure. They have open calls for projects like a decentralized exchange, lending dapps, and stablecoins, with more added every week. They also accept applications for other decentralized projects to join the Avalanche ecosystem. So go build on Avalanche, build without limits, and learn more at avalabs.org. That's a-v-a-labs dot org.

As for what we're doing at Hashing It Out, there are two things I want to talk about. First, Hashing It Out is part of the Panvala League. If you don't know what Panvala is, look back a few episodes; we did an episode with Niran about what Panvala is and how it works, so on and so forth. It's a really awesome project that we're happy to be a part of. This round, Panvala is donating, I think at current PAN prices, about a hundred and seventy thousand dollars to the Ethereum community. How does it donate those funds? How does it figure out where to donate them? Well, the Panvala League works through Gitcoin grants, and Hashing It Out is part of the Panvala League. So we have a Gitcoin grant that is basically a multisig, and if you donate to that Gitcoin grant with PAN, it will get matched not only by the CLR matching of typical Gitcoin grants but additionally by the hundred and seventy thousand that Panvala is giving out. We're going to use the money raised through that grant, with advice and decisions from the Ethereum security community, to fund security and infrastructure projects. We at Hashing It Out believe that security and infrastructure is a very underfunded but vitally important part of the ecosystem that needs more funds. So we're going to try and do that, and you can help by donating PAN, or whatever, to the Gitcoin grant. That'll be in the description of this episode. So get your PAN donated to us; we'll find a good place for it to help the security and infrastructure of the Ethereum ecosystem.

The other big news, which I don't think I've mentioned on the podcast yet, is that Hashing It Out is leaving The Bitcoin Podcast Network, because The Bitcoin Podcast Network is no longer a network; it's just The Bitcoin Podcast. So over the next maybe ten or so episodes we're going to be continuing on this feed that you're subscribed to now, but in the process there's going to be a new feed that's only Hashing It Out. You'll need to resubscribe to it, because eventually you're not going to be able to get the show on the feed you're on now, The Bitcoin Podcast feed. We're going to have our own thing: new branding, hopefully some more resources for the show so that it can be a little more stable. I don't know, we'll see, but we're going to have our own feed. Check it out, listen for it, and check the Twitter to see when we've published it. At least you'll get to just listen to us and no one else; it's going to be great. The Bitcoin Podcast isn't going anywhere, and I'm still doing that. It's just two different feeds now. And on to the show.

Welcome to Hashing It Out, a podcast where we talk to the tech innovators behind blockchain infrastructure and decentralized networks. We dive into the weeds to get at why and how people build this technology and the problems they face along the way. Come listen and learn from the best in the business so you can join their ranks.

All right, welcome back to Hashing It Out. Today I'm your host, Dr. Corey Petty, with John Mardlin. Say hello, John. "Hello, John." Now you know John's voice. On today's episode we're going to be talking with Rex Hygate, the progenitor and founder of DeFiSafety.com. We're going to get into what that is, why it works, and how it came about. But first off, Rex, give everybody a little introduction as to who you are and how you joined the space. How are you doing?

So I actually started looking at Ethereum right after The DAO, which was really cool, but I just lurked. Then in early two thousand and eighteen some people from ConsenSys reached out to me and we were chatting about unrelated things, but they said if I wanted to do something I should go to a hackathon. So I went to ETHDenver in two thousand and eighteen, and there I met Bryant, and we started SecurEth.org, which was focused on software process and documentation for making blockchain software. I'm an aerospace guy, and we saw a lot of similarities between aerospace and this space, so we started focusing on that. That went through two thousand and eighteen and then kind of fizzled in the crypto winter. DeFiSafety.com is a COVID-inspired business: I was finished with my day job and suddenly had a whole bunch of time, and I wanted to see how I could contribute to blockchain as it was coming up. I'm not really a financial guy, and I'm not a coder. So I took the roots of the concept from SecurEth, looking at the testing process, the coding process, and the documentation process, and from that I came up with the idea of a DeFi safety score produced through a DeFi safety audit. I invented the process, in July we went live, and I think we've got twenty-three audits completed now.

So what's in these audits? Can you tell us?

The audits are a sequence of questions in four sections. I look at the executing code, the code on the blockchain, and it's things like: is the code there? Can people see the addresses? And I only look at publicly available audits and publicly available information: information that's on the website, on the GitHub, in Medium articles, stuff that a user can easily find. I try not to look at private stuff. Then: is the code being used? Is it verified? Does it match what's in the software repository? Is the software repository healthy? That's one section.
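One of those checks, whether the deployed bytecode matches the public repository, can be approximated mechanically. Here is a minimal sketch in TypeScript using ethers.js; the RPC URL, contract address, and artifact path are all hypothetical placeholders, and this is an illustration of the idea rather than DeFiSafety's actual tooling.

```typescript
import { ethers } from "ethers";
import { readFileSync } from "fs";

// Hypothetical inputs: a public RPC endpoint, the deployed address from the
// project's docs, and a compiled artifact from its public GitHub repository.
const RPC_URL = "https://mainnet.example-rpc.org"; // placeholder
const CONTRACT_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder
const ARTIFACT_PATH = "./artifacts/Token.json"; // placeholder

async function checkDeployedCodeMatchesRepo(): Promise<void> {
  const provider = new ethers.providers.JsonRpcProvider(RPC_URL);

  // Bytecode actually running on chain.
  const onChain = await provider.getCode(CONTRACT_ADDRESS);

  // "deployedBytecode" as produced by compiling the repository's source.
  const artifact = JSON.parse(readFileSync(ARTIFACT_PATH, "utf8"));
  const fromRepo: string = artifact.deployedBytecode;

  // A naive comparison; real verification (e.g. Etherscan's) also accounts
  // for compiler settings and metadata/immutable sections of the bytecode.
  const matches = onChain.toLowerCase() === fromRepo.toLowerCase();
  console.log(matches ? "repository matches chain" : "MISMATCH: investigate");
}

checkDeployedCodeMatchesRepo().catch(console.error);
```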

And then in documentation: is there a white paper? Are the requirements documented? Like, have they said "this is what the thing does" in a separate document other than the code? Does that documentation fully relate to the code? Is the code sufficiently detailed in its comments? And, here's a real aerospace-like thing: can you trace from the requirements to the code and to the tests? Did they put in traceability? That's something I would say most people don't do yet. And then I have a set of questions for testing and a set of questions for audits.

And who's the main audience, both for consuming these things and for going out and purchasing them?

The main audience is users of DeFi products. It gives a level of trustworthiness of the code, boiled down to one number, a percentage, and a color: red, yellow, green. That's my main target market, so that people have an idea of whether or not they should trust a particular application. Are the developers being public? Are they showing all the information that a normal Ethereum product would show?

And the obvious question here, which I would ask if I were listening to this or thinking about getting one: what's the turnaround and price of something like this? As a user, if I care about a project and I don't understand how to do this due diligence myself, but I'd like someone to help me do it because I maybe want to invest, I would like a much stronger confidence interval on something like that. How soon am I going to be able to get that if I come to you, and how much is it going to cost me? Is it worth it for me to do that if I'm going to invest in something?

We've only been doing this for two months, so I wouldn't say we've got a real market price. It will be under K, and generally results can be done in about a week, depending on how many people ask and what I'm working on at the time. So it's a relatively quick process, and it's not an awful lot of money to go through the whole process. What comes out is a detailed report looking at each question. For a developer who wants to improve, you can sit back and talk about it, and often there are very simple ways to improve. Sometimes, like if you're looking at adding documentation to your code, it will take a real investment.

On the other side of that: as a developer, say I get a score from you and I'm like, "oh no, that's bad," or "I don't agree with that." How do I argue against something? And if I decide that, yeah, he's right about that, we don't have that, we should do it, and I improve it, does that immediately increase my score? How would that be updated?

Okay, so we're all about improvement. Improving things is immensely important, so I'm very much in favor of the developers coming back to me and saying, "what did we do wrong? How can we fix it?" In fact, underneath every question in the report is how to improve the score; unless you get a hundred percent on that question, in which case we delete that note. And a lot of the time I give references to the SecurEth documentation to give an idea of what I want. And you know, the other thing you can do is, let's say I want really good documentation: if you went on the website and looked at who got the highest score on documentation, well, Synthetix, their documentation is to die for. You go and look at that and say, okay, that's the gold standard, and you also see how much time and effort went into it. So there are a lot of resources the developer has to improve with.
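To make the shape of that output concrete, here is a tiny TypeScript sketch of how a questionnaire like this could roll up into one percentage and a color. The four section names come from the interview; the equal weighting, the cutoffs, and all identifiers are made-up placeholders, not DeFiSafety's published rubric.

```typescript
// A question scored 0-100, grouped into the four sections Rex describes:
// executing code, documentation, testing, and audits.
type Section = "code" | "documentation" | "testing" | "audits";

interface Answer {
  section: Section;
  question: string;
  score: number; // 0-100
}

// Hypothetical equal weighting; the real process may weight questions differently.
function overallScore(answers: Answer[]): number {
  const total = answers.reduce((sum, a) => sum + a.score, 0);
  return Math.round(total / answers.length);
}

// Map the single percentage to the red/yellow/green signal users see.
// These cutoffs are invented for illustration.
function trafficLight(score: number): "red" | "yellow" | "green" {
  if (score >= 70) return "green";
  if (score >= 40) return "yellow";
  return "red";
}

const example: Answer[] = [
  { section: "code", question: "Are contract addresses public?", score: 100 },
  { section: "documentation", question: "Is there a white paper?", score: 80 },
  { section: "testing", question: "Is there a test suite?", score: 60 },
  { section: "audits", question: "Was an audit done before deployment?", score: 90 },
];

console.log(overallScore(example), trafficLight(overallScore(example)));
```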

Now, on arguing: I try to make it as quantitative as possible. I'm looking at something public and judging it, and on a lot of the questions I give guidance: a hundred percent means this, zero percent means this, forty percent means this, sixty percent means this, so that they can look at it and have an idea. For example, if I look at a blank audit sheet, for the audits section, a hundred percent is two or more audits that were done before deployment with the improvements put in. One audit is ninety percent. The funny thing with audits is that no audit at all is twenty percent; zero percent is saying you have an audit when you don't really, because, and I have found at least one case, thankfully not common, that claim was not valid at all.

What do you mean? That they claim an audit and there's nothing you can see? Or have you even gone to the supposed auditor and found out they'd never seen an audit from them?

Well, if the audit is hidden, then it's as if it's not there. I audit public things. If it's not public, I'm not going to give scores on it.
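That rubric is concrete enough to state as code. Here is a minimal TypeScript sketch of it; the enum and function names are our own invention, but the percentages (two or more pre-deployment audits = 100, one = 90, none = 20, a false claim = 0) are the ones given in the interview.

```typescript
// Possible findings when checking a project's public audit claims.
enum AuditEvidence {
  TwoOrMorePreDeployment, // audits done before deployment, fixes applied
  OnePreDeployment,
  NoAudit,                // honest absence of an audit
  ClaimedButNotVisible,   // claims an audit that can't be verified publicly
}

// Rubric as described in the interview: no audit scores 20, but *claiming*
// an audit you cannot show scores 0, because the false claim is worse.
function scoreAuditQuestion(evidence: AuditEvidence): number {
  switch (evidence) {
    case AuditEvidence.TwoOrMorePreDeployment: return 100;
    case AuditEvidence.OnePreDeployment:       return 90;
    case AuditEvidence.NoAudit:                return 20;
    case AuditEvidence.ClaimedButNotVisible:   return 0;
  }
}
```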

And actually, I'm at version zero point five of my process, and I've committed that I'm going to do a zero point six. A big thing that's happening in DeFi now is a lot of people are having private repositories because they're afraid of getting forked. One project did that, but that one was pretty easy because they simply made a single public GitHub repository which had their final release product, so it had the tests, the documentation, everything was all there. So it was easy to audit, even though it wasn't a healthy GitHub; well, they have a healthy GitHub, but it's behind closed doors.

That doesn't really protect them from forking, though.

It helps. It protects them from copiers before release, I suppose. Right. A lot of teams do their development in the background, in a private GitLab or whatever, and then they publish when they release. The purpose of that is so that people can't take the code before release, deploy the contracts as they would appear in the release form, and then attract the majority of the audience before the original team is able to do appropriate marketing to get their own community behind it.

Yeah, but even post-release we're seeing very, very frequent forks of YAMs and the like. Maybe it's worth having a conversation about that in a little bit, because I have quite a few feelings that I haven't fleshed out that would be interesting from your perspective too. But first, full disclosure: I've talked to Rex about this a lot, and I'm fully behind it. I'm now, I think, the sole owner of both the SecurEth and ETHSecurity.org domains, and I have a strong desire to help rejuvenate life into those projects somehow or another. Process quality audits will more than likely be a strong part of that, because this is something that I feel has been lacking in the ecosystem for a very long time, and I think it's very important. Giving people the ability to make decisions, to do their own due diligence, or even to know if a project is capable of having due diligence done on it, is incredibly important, and what you're providing here is exactly that.

Yeah, that's the idea. I mean, I'm getting one or two requests a day from users, and a lot of them are repeats. I would say most of my audience are smaller users that are looking to invest their money, and an awful lot of their focus is on the new hot stuff, which is these forks, you know: Sushi, YAM, Pylon. So I would say most of my audit requests are for the smaller projects. At the moment I'm splitting my time between the forty projects on DeFi Pulse and these requests, and at this point I'm not really charging, but I think I'm getting to the point where I will be offering paid audits for people who want to, you know, sort of skip the line. I publish on my Telegram channel and the website, "these are the next eight audits on my list," and then you just work your way through the list. Next was Nexus Mutual; I just finished them yesterday. And when I finish an audit, because I don't ask permission when I do one (I'm looking at public things), I go to the developers using the best method I can: Discord, Telegram, or whatever. Often it's the Discord technical channel, and I'll give them the audit before I publish and say, "I did this, here are the results. What do you think? A big thing is: did I miss anything?" I remember one of my early audits where I never found their smart contract addresses. That was a huge thing, and it was just, "oh, you didn't look there," and it was rather fun.

That's not on you, though. The structure in which people do things across the board is incredibly inconsistent. Even within Status we don't do things consistently across all our products, because there's such a decentralized suite of teams, right? So it's not obvious where to find these things, and that causes problems, so you doing that is a really good idea.

Yeah, and there would be more value in consistency, but at this point that's not something I'm trying to grade. I simply ask: is it public information, and can I find it? But the main point is I always give the developers a chance to read it, and sometimes we have discussions, sometimes very extensive discussions with a couple of them, on "why did you give this score" and things like that. In the beginning most people just kind of didn't react, but now the process and the name are a little better known in the space, and the website is a bit better fleshed out, so when somebody sees a product and a score they have a better idea. Obviously when there's a good score you don't get so much feedback. Sometimes when there's a low score you still don't get that much feedback, because a lot of these smaller ones are not interested in the process; they're interested in making the money early, and you really get that impression when you start communicating with them.

Do you feel like, by even putting those projects through the process, you're maybe just risking validating them or amplifying them? I feel like there are enough reasonably high-quality projects that you could just spend all your time on those and be providing them with some signal where they can improve.

Yes. I mean, certainly, if you go through the audit you can see areas to improve, and I think I'm reasonably clear on how you improve your score, so I hope to enable that. And then for the lower ones, it's a red flag: these guys are not very public. When you're dealing with programs where the developers are anonymous, the smart contracts are not published, and there's no GitHub, it's a pure leap of faith, because you basically have no clue. You're putting your money on a leap of faith.

And you're probably getting into something bad, because if you're capable of creating smart contracts that do the complex things that keep up with the other programs, then you know what's going on, you know how to do this stuff, and you're purposefully choosing not to publish. That's a giant red flag for me.

Yeah, and let's just say they get a red score; so it may not validate them at all.

Right, John, I think what you were getting at is that there are a lot of projects that will go out and get an audit and then just say they got an audit and not care about the results. It's a marketing tactic: get an audit, ignore the result. I think this process is a bit harder to game that way, because it's all based on public material, and it's visually visceral when something is bad, as opposed to an audit report, which you can link to and no one's going to read, or if they do read it, they're not likely to understand it or see whether things were fixed. This just says: this is bad, and they haven't improved it.

So when I look at the audits, I actually do read the audit reports. I am not a smart contract auditor, but I'm pretty good on quality, and you tend to be able to understand what you're reading. There have been cases where bugs were found and not indicated as fixed, and I'll jump into the code and go, "did they change that? Oh yeah, they did, okay." I would say I tend to give the benefit of the doubt, but I do read. So if you've got somebody who got an audit, and maybe they make it public, but the audit was all bad, and say it was done after deployment and there were no changes, that would be a giant red flag for me, and the score would reflect that. Even though my guidance on audits is relatively straightforward, if there are red flags when you read the report, I will make sure the score reflects that, and then I make a note where I've gone against my guidance, and why. I also think about who did the audit, because I've had audit reports where it's like: I don't know who these guys are, they don't mention their company, there isn't even an email or anything. And you look at some of the things that were found, and the audit gave a whole bunch of red flags. So yes, they can say they were audited, but I won't give them a great score on that.

Yeah, okay, that's great to hear. One thing I've thought about a lot: we've seen these incidents where somebody gets hacked and they're like, "oh, but we had an audit," and inevitably, when people actually look at the audit, it turns out the auditor didn't even say very good things about it.

But because nobody reads the audit report, the developers were able to represent it like, "oh, we got an audit from ConsenSys Diligence, we're covered." And we're like, well, no, did you read it? We were fairly concerned about this. However, our job is to inform the developer, and we don't have the time to play PR people or to field a lot of requests from the community to help them understand things. So I think there's a need for somebody who can read between the lines to just read the audit, because, frankly, we get paid by the developer, right? And if I'm really brutal in my language... it's not my job to say "do or do not put money in this," and if we start doing that, we're going to get in trouble as soon as we say "do put money in this" and we were wrong about it. I'm just trying to inform the developer about the risk, the quality level that we're seeing. So there's a need for somebody to read the audit report between the lines and say, "this is roughly what I get from this." I think that's something you're starting to do, which is very valuable. And also people need to understand that audits are not a binary: safe, yes or no. The budgets just aren't there for us to do enough work to say that.

That's not even a budget issue; that's a "what is an audit for" issue.

Yeah, and a limits-of-technological-and-human-capabilities issue as well.

One thing that I'm aiming for with this, and we're very young and I'm not sure it will work, is I want it to be funded by the users, independent of the developers: funded by the users, giving information for the users. Whether or not that will be sustainable is to be determined. Go back and look at the rating agencies in two thousand and eight: you had a perfectly good system where the ratings were paid for by the people who were getting the ratings, and after a very long time the system became completely corrupt and the ratings were not valuable. We're in a super young time in this space, where we can develop a system that isn't like that. But we're not there yet. Exactly how this will be funded, I still have a lot of questions about, and it's just plain too early for me to be able to answer, but my goal is that we're funded by the users, for the users, not the way smart contract auditors are, where they're funded by the developers.

To that point, real quick before we move on: part of that funding problem is something I hope that we, as Hashing It Out, and some of the fundraising efforts we're doing with Panvala, can help alleviate.

Yeah, we have a Gitcoin grant and that's working. At this point that's our primary funding mechanism, though we have had some people in the space who've said "you're doing really good work" and have offered some money independently as well.

So first I just want to make a comment, and then I have a question. The comment is that it would be really interesting to see someone like yourself auditing the auditors, because, as you touched on with the rating agencies, the incentives are becoming somewhat stronger for us to gloss over things.

We get pushed back from developers on what we want to put in our report, and we see other firms whose audits are starting to look like certificates, with this approved/not-approved thing. The more that audits are used as marketing, the more pressure is on us to give someone a stamp, and pushing back on that continuously is difficult when other people are willing to go there. So, and I'm just straight up pushing my own business considerations here, I would love to see somebody keeping us all honest, to level that playing field and reduce that pressure. That was the comment; I'll let you respond before I go to my question, if you'd like.

I am starting to understand the various audit firms. Through SecurEth I got into the ETHSecurity Telegram channel, which is a collection of the security auditors and security developers, so that gives me a bit of a feel for the space and the organizations within it. I'm not sure about being able to rate the auditors, but at any point, when I see an audit report that is very, very glossy and looks more like a marketing element, or like a stamp of approval, which I haven't seen yet, to be honest...

Well, actually, the bZx v2 audit seemingly had that.

That's number two on my list; I haven't gotten there yet. I actually audited bZx v1 after their first hack, just to see whether or not my process would bring up any red flags, and it did. Then I audited Opyn, which had a hack, and they got a really good score. So the process is certainly not a hundred percent valid, and it'll be interesting when I audit bZx v2 whether or not they get a good score, because I know they put a lot of security focus on the development; at least that's what I heard. I haven't had a chance to look at their docs again since.

Right, so that leads to my question quite nicely, actually, which is: what are the limitations of your process, and what can and cannot be concluded based on a score?

Well, the biggest thing it gives, and Harvest Finance has a nice term for it, is "honest work." What you'll end up with is somebody who's done honest work and is relatively honest in public. But the developer's financial incentives could be completely biased. They could have something that says "next Saturday I take all the money and put it into this," and I won't see that, because I don't read the code; I don't go over what the code does. So that's a huge hole, and this is why you get a lot of points if there's an audit, because then a third party actually has gone and looked at that. If not, it's on the faith of the developers. So if somebody's done honest work but has dishonest incentives, I won't catch that. But what I've found in general is that the people who have suspicious intent, shall we say, often don't put the honest work in.

So it tends to give a good indication, but it's definitely not perfect.

Teams doing money grabs tend not to put in the peripheral work to engage a community, because most of the time they're not relying on community buy-in for the long term. What you're seeing when you have a high score on a process quality audit is typically an investment in long-term engagement, in my opinion, an attempt to allow someone from the public to contribute, understand, and make good judgments. And that's the opposite of what anyone with malicious intent is going to do, unless they're, well, I don't want to give anyone ideas here, but incredibly sophisticated.

I've seen some things that were pretty clear money grabs, but they were very, very careful about their process, and they got a good score. So it does happen. They were very public about all of the stuff, but it's just really complicated, and people probably aren't going to read it, and therefore they'll end up making their money back and more. So it does happen. As I said, I'm really trying to make it a quantitative thing on a very clear, published rubric: here's the test, and a lot of the time I'll do a screen capture and put it in the appendix, "this is what I saw, therefore you get this score." I try to keep it as quantitative as possible, so you get a consistent score that can be judged amongst other things. That has limitations.

Another thing that comes to mind here, and this is speaking from my own personal experience of the ecosystem, as well as working in an organization that develops these things and gets audits: sometimes, when I scope things for an audit, I only scope a portion of the project, depending upon risk, time, resources, et cetera. When you're looking at these things, are you judging how much of the codebase is audited as a percentage of the total project? Because only some parts of a particular codebase get audited; others are left for later, or they're deemed not risky enough to audit.

So I've seen that the big projects will often have had a big audit quite a while ago, and then their recent release has a delta audit. That would generally get a really good score. In aerospace you would do an impact analysis to say how much of the code has to go through re-verification; there is a rigorous process for that. We're not there yet, and I'm not doing an impact analysis of the delta change in order to make an independent determination of "this is the codebase that should have been audited." I'm just not in a position to do that at this point. And of course you start to think about how sophisticated this will become as it evolves: will it become a sophisticated look into the code, or will it stay high-level the way it is right now? A big thing that I'm also looking at is testing.

At the moment, and I did this deliberately because I was just starting out and had no idea if anybody would care about this, I only look at tests superficially: oh look, he has lots of tests. I can do a test-to-code ratio, and once again you have an indication that they've done quite a bit of work. In probably ten percent of cases you have some indication in the GitHub that says "here's the code coverage, here's the build and the test." In most cases, surprisingly, and you guys are developers, so you would just say this is natural, the stuff that gets left in the GitHub repository gives you no clue whether any of the tests pass, or what the code coverage of the unit tests is. They don't give a report saying "look, they all passed," even on the code that's deployed. That's just the way the space works. I don't think that's good. When you're deploying the code, please go run the tests and publish the results. Now, a lot of people, where they're public, will say "here's how to fork the repository, here's how you run the tests, these are all the dependencies," and you can go through it. So one of the things that I'm thinking of now, now that I've got some traction, is: should I be running the tests myself? Should I be running Slither and Mythril and a linter on the code and using the results? This is a question I have in my own head, because I'm technical enough to do that, but will that be a valuable use of my time?

I'll tell you, one thing I have trouble getting developers to do is just make it easy to run the tests. Often, if you go down that road, you're going to start battling with Truffle configurations and find out that there's some npm package that's globally installed on their machine that you don't have. It's always really hard. It should always just be git clone, npm install, and npm run test, and it never works like that. And I want to clarify something you said, because I want people to hear this. You're saying not just make the tests easy to run, but check in the coverage output: whatever your coverage report is, that should be an artifact in the release, right? I don't think anyone does that; I don't know that I've seen it.

Well, sometimes on the GitHub, when they use Coveralls, you actually have it. And the nice thing is when they do a release. Something else that only some people do is actually have a named, preferably a release, but at least a named GitHub repository or tag that says: this is the stuff that actually went on mainnet. Frequently one of my biggest technical challenges is just trying to find the GitHub code that matches the code on mainnet, and it would be nice if that was labeled. And then with that, if they said, "okay, because this is going public, I ran the tests and here are the test results," that would be wonderful, and, as you said, hardly anybody does that.

No, and it's not that hard.
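To illustrate the kind of check being described, treating the coverage report as a release artifact a reviewer can inspect, here is a small TypeScript sketch that reads an Istanbul-style coverage-summary.json (the format produced by tools like solidity-coverage and nyc) and flags low coverage. The file path and the threshold are hypothetical, and the exact JSON shape is an assumption about those tools' output.

```typescript
import { readFileSync } from "fs";

// Istanbul-style summary, e.g.:
// { "total": { "lines": { "pct": 87.5 }, "branches": { "pct": 61.2 }, ... } }
interface CoverageSummary {
  total: { lines: { pct: number }; branches: { pct: number } };
}

// Hypothetical threshold for a "healthy" signal; not a DeFiSafety number.
const MIN_LINE_COVERAGE = 80;

function checkCoverageArtifact(path: string): void {
  const summary: CoverageSummary = JSON.parse(readFileSync(path, "utf8"));
  const { lines, branches } = summary.total;

  console.log(`line coverage: ${lines.pct}%, branch coverage: ${branches.pct}%`);
  if (lines.pct < MIN_LINE_COVERAGE) {
    console.warn("coverage below threshold: tests exist but exercise little code");
  }
}

// Assumes the project shipped its coverage report with the tagged release.
checkCoverageArtifact("./release/coverage/coverage-summary.json");
```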

A standard for publicly displaying information, so that people can get access to it quickly, easily, and in a quality way: that alone is incredibly important. And what you mentioned at the very start of this podcast is that designing, developing, and putting something into production in this space is akin to aerospace. You need to put a lot of effort in on the front end, and there need to be a lot of checks and balances making sure you're doing things appropriately, because things can't fail. Standards like this help people do that efficiently and quickly, and then broadcast that they did it to the public, because you have to do that for people to trust you. That's why what you're doing, in my opinion, is almost a keystone in the development process, and in the understanding of the development process by both parties: the developers and the people consuming the products.

Absolutely. The idea is to get a virtuous circle where people do the right things because they have an incentive to do the right things, because they get feedback. One thing that I was playing with: it would be ideal, once you end up with a good score, and I'm not saying the present DeFi safety score is the end result, but it's definitely moving there, if you could actually display that in the wallet when you're connecting to an app. When you're about to hit confirm, you would have a green or a red, and if you're going onto a very spotty application, you would see a red button. If some kind of very strong feedback like that is added, then the developers have a very strong incentive to make that red look like a green, and if that means they go through their process in a good way, then you've really got yourself a virtuous circle.

I love that. I will probably fight to put that inside Status. But a consequence of that is you being overwhelmed.
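The wallet integration Rex describes, surfacing a red or green signal before the user confirms a transaction, could look something like this sketch. The API endpoint and response shape are entirely hypothetical; no such API was specified in this conversation.

```typescript
// Hypothetical score lookup a wallet could run before showing the confirm
// button for a transaction to a known protocol address.
interface SafetyScore {
  protocol: string;
  score: number; // 0-100 percentage
  color: "red" | "yellow" | "green";
}

const SCORE_API = "https://api.example-defisafety.org/score"; // placeholder URL

async function lookupScore(contractAddress: string): Promise<SafetyScore | null> {
  const res = await fetch(`${SCORE_API}?address=${contractAddress}`);
  if (!res.ok) return null; // unrated contract: show "unknown", never a default green
  return (await res.json()) as SafetyScore;
}

// Annotate the confirm screen; inform the user rather than silently blocking.
async function confirmScreenLabel(to: string): Promise<string> {
  const rating = await lookupScore(to);
  if (!rating) return "No process quality score available for this contract.";
  return rating.color === "green"
    ? `GREEN: ${rating.protocol} scored ${rating.score}% on process quality.`
    : `CAUTION (${rating.color.toUpperCase()}): ${rating.protocol} scored ${rating.score}%.`;
}
```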

Yes. Well, then either the company grows, or others use the same process. I still haven't gotten around to putting the proper licensing text in all of our reports, but the process is public. My process for developing a DeFi safety score is public, and I want people to go out and be able to use it. So that will perhaps help people in their own development.

I mean, there are like twenty questions here. It would not be that hard for somebody to create a page. I think every project should have a /security page. If they have a bounty, that's where all the information about their bounty and their contact information should be. They could just straight up answer all these questions on that page, present it there, make your job super easy, and then say, "hey Rex, here you go." They could self-score. It's not rocket science by any means.

Indeed. We have actually suggested that. Especially on the Telegram channel, when people have been clamoring "do this one, do this one, do this one," we would say: here's the process, do it yourself, send it to us.

And you'd get a TL;DR. You know, I'm maybe not super concise in the report; it's a lot of pages, and the eyes go glossy and people just don't read it, unfortunately. So yeah, self-audits will be a definite part of the process. The idea is that this will become a quality industry at a certain point as it grows, with lots of people involved in it: standard processes and meetings and all of that, the way it exists in aerospace and in finance now. That's what I hope will take care of the overwhelm aspect. At this point I'm just trying to get it to grow and become something that will be steady and survive a long time.

I'm curious, so the process you run is all looking at public documentation, what's already available. What would be different if you were working directly with the development teams, and maybe if it was reframed so that you were trying to help them be better? Do you have thoughts about how you would approach that, what you would ask them, and what you would look at?

So a lot of it is: here's the process, fill it out; or go to the SecurEth guidelines, which haven't been updated, and that's something I'd like to contribute to, so that there would be a clear process: just do this, fill out these blanks, and you'll do fine. Then, from a consulting angle, which could be another revenue stream, I could come in and we talk, and we look at the various aspects, the budgeting and the staffing: do you have anybody working on documentation? One thing I found interesting: a lot of people are using NatSpec parameters in their commenting, but not many people are using them to generate documents. You're sitting there seeing this code, and it's got all the NatSpec parameters, but you could use that to auto-generate seventy-five percent of good documentation that traces to your code, and then just write a blurb for each piece, and you've really done most of it. They do NatSpec, but they don't make the documents that go with it. But at least they're doing NatSpec; that's a step.
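A rough way to measure the gap Rex describes, NatSpec comments present but unused for documentation, is to scan the Solidity sources and compare function counts against NatSpec tag counts. This TypeScript sketch uses naive regexes as a heuristic rather than a real Solidity parser, and the directory path is a placeholder.

```typescript
import { readFileSync, readdirSync } from "fs";
import { join } from "path";

// Crude heuristic: count function declarations vs. NatSpec tags.
// A real tool would walk a Solidity AST; this is only an illustration.
function natspecCoverage(dir: string): void {
  let functions = 0;
  let natspecTags = 0;

  for (const file of readdirSync(dir)) {
    if (!file.endsWith(".sol")) continue;
    const src = readFileSync(join(dir, file), "utf8");
    functions += (src.match(/\bfunction\s+\w+/g) ?? []).length;
    natspecTags += (src.match(/@notice|@dev|@param|@return/g) ?? []).length;
  }

  console.log(`${functions} functions, ${natspecTags} NatSpec tags`);
  console.log(natspecTags === 0
    ? "No NatSpec at all: documentation will score poorly"
    : "NatSpec present: consider generating reference docs from it");
}

natspecCoverage("./contracts"); // placeholder path
```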

So I do want to have a process for pre-development at some point; it's on my to-do list, but it's not at the top right now. As you can imagine, I'm just one guy, and I am stacked up just doing the DeFi audits, especially right now when new things are popping up literally daily. I think I read that there were something like thirty thousand deployments a month now, which is insane, where it used to be like fifty. I mean, there's money to be made in DeFi: you throw something together, you fork something, you change something else, you create a token, and a bunch of money comes in. That's a pretty powerful incentive.

It's kind of nuts, right? Talk about that for a second: how much of that do you think is legit?

Well, if you look at Pickle, they actually found a new part of the problem space and are creating something positive. I'm not an expert at it, but it's helping stablecoins stay stable by making pools that support that. So I would say that new things are coming up. It's tough to say how much, and I'm not a finance guy, so I would say there definitely is value, but probably a lot of them aren't adding an awful lot of value. And one of the things that I really want to consider when I do zero point six, and I haven't figured out how I'm going to answer this, is when somebody says, "I've taken SushiSwap and I've added a different token and I've changed this, this, and this," and it's true, and it's explained in two paragraphs in a Medium article. How do I audit that? There's a lot of copied code used in a lot of these things; they're pulling in contracts from good programs. How do I score that? I try to focus on the new code, but I'm still scratching my head, and I want to be a little more formal about how I deal with it, because that's a big part of the space right now. So in version zero point six of my process I will have some words that specifically address that; I just don't know what they're going to say yet.

That brings up an immediate question for me. You talked about the transition to zero point six: when you move from one version of the audit process to another, what does that say about the projects that were audited under a previous version?

So sometimes, like when I went to zero point four, the changes were relatively trivial for the previous audits, so a lot of scores didn't change. With zero point six, I think everybody who has their own public GitHub repository won't see any changes, but I'll have a better way to judge the scores for projects with private repositories. A negative of private repositories is that the audit is done on code you can't see, so that should count against the score, and there's nothing in there for it right now. Sometimes I'll go back and change a report, but the report says "this is on zero point five," and at a certain point I'll have to confront that. Right now I try not to break too much as I move forward.

Have we talked about what you're going to add to the process in the future?

Kind of. For zero point six my list is: anonymous team, yes or no (I figure I might add something on that), better consideration for private repos, and a better process for copies with minor changes. And then there's the whole question of what I do for tests; I think there are maybe some tools that I could run, though I totally get what you said about going down a rabbit hole trying to run the tests, and that's not work I want to get stuck in. And then the other point is: do I start adding other metrics, such as how mature the protocol is? You know, you look at total value locked over time and you take the area under the curve, and the bigger that is, the safer it is: the time without hacks, or whatever. DeFi Score has that; do I start adding things like that? That's something I'm scratching my head about. Adding more complex risk metrics is something I'm still working out how to do. Like pool risks: is there a lot in this pool? Can I get my money back? What is the risk of getting out just the asset that I put in? And then there's the risk of the token it's paired with tanking, which is totally different from the pool risk of just getting it out.
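The area-under-the-TVL-curve idea is easy to state precisely. Here is a minimal TypeScript sketch using the trapezoidal rule, with made-up numbers; this is our reading of the maturity metric described, not DeFi Score's actual formula.

```typescript
// "Maturity" as the integral of total value locked over time: more value
// surviving longer without incident yields a larger area.
interface TvlPoint {
  day: number; // days since launch
  tvl: number; // total value locked, USD
}

// Trapezoidal rule over an ordered time series; result is in USD-days.
function tvlAreaUnderCurve(series: TvlPoint[]): number {
  let area = 0;
  for (let i = 1; i < series.length; i++) {
    const dt = series[i].day - series[i - 1].day;
    area += ((series[i].tvl + series[i - 1].tvl) / 2) * dt;
  }
  return area;
}

// Hypothetical protocol: 0 -> $5M over 30 days, then $5M -> $20M by day 60.
const example: TvlPoint[] = [
  { day: 0, tvl: 0 },
  { day: 30, tvl: 5_000_000 },
  { day: 60, tvl: 20_000_000 },
];
console.log(tvlAreaUnderCurve(example), "USD-days"); // 450,000,000 USD-days
```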

So there's a bunch of things that I'm thinking about, but I'm not quite ready; I'm certainly not an expert on that, so I'm not ready to do it yet. It's a question of how I grow this into a larger, broader risk assessment, and whether I should. The other option is I just keep doing this same process, if I find it's financially viable: stay in my little sliver and do as much as I can within it. And I can do it for dapps also; there are a lot of dapps outside the DeFi space for which it would be very appropriate. So exactly where it's going to go is still a bit up in the air. I mean, literally, I think I did my first audit on July fourteenth; that was when my first three audits went out. So this is super new, and where it will go... I suppose I either go broader within DeFi, or I stay narrow and go deeper into the Ethereum ecosystem.

I think we'll all agree here: the breadth of risk analysis needed in the blockchain space in general goes far further than the eye can see. Figuring out a quantitative way to assess risk across the myriad of potential applications that blockchains enable is an incredibly difficult thing to do.

Yeah, and I want to add that I think there's a cool opportunity here: we need to start quantifying the complexity and, you know, the attack surface of these things. Is it one contract? Like, is it WETH, which is just wrapped ether, where every other contract in the world is the same to it, just an external entity that it does not trust? Or is it an exchange, like Uniswap, that just lets you add anything to it, something that may or may not be a token and can merely claim to be a token? So there's composability, or just how big the thing is. I don't think line count is the ideal metric, but it is an interesting input. There are a lot of things that I think would go into a kind of complexity risk score.

I do one thing on the code analysis side: I try to grab all the code, leaving aside the libraries and other peripherals, just the core code, and I run a tool called SCC on it, which gives the number of lines, the number of comments, and a generic code complexity figure. It's using the JavaScript rules, since it's not a Solidity-aware tool yet, but I get numbers from that. And I was wondering about code complexity, because with some of these things you end up with big systems that have small, elegant code, and others that are trust monsters, and that's in the report. Like Nexus Mutual: that's a lot of code. I haven't done Aragon; I've been told that's a huge monster. Also MakerDAO.

Yeah, Maker is everybody's canonical example of a lot of code.

So that's another metric that I'm generating but not yet scoring, and I'm not sure about it.
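The measurement Rex describes with scc can be scripted. This TypeScript sketch shells out to scc's JSON output and pulls line, comment, and complexity counts; it assumes scc is installed and that its --format json flag and field names behave as in recent releases, and the contracts path is a placeholder.

```typescript
import { execFileSync } from "child_process";

// Shape of one entry in `scc --format json` output (a per-language summary);
// field names here match recent scc releases, but treat them as an assumption.
interface SccSummary {
  Name: string;
  Lines: number;
  Code: number;
  Comment: number;
  Complexity: number;
}

function measureCore(path: string): void {
  // Run scc over the core contracts only, excluding vendored libraries.
  const out = execFileSync("scc", ["--format", "json", path], { encoding: "utf8" });
  const summaries: SccSummary[] = JSON.parse(out);

  for (const s of summaries) {
    const commentRatio = (s.Comment / Math.max(s.Code, 1)).toFixed(2);
    console.log(
      `${s.Name}: ${s.Code} code lines, comment ratio ${commentRatio}, complexity ${s.Complexity}`
    );
  }
}

measureCore("./contracts/core"); // placeholder path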

And obviously one thing that I hope to mature toward is to really get people around a table and look at these questions, so that there's a bigger consensus on how the scoring and the judging should be done. Hopefully, in the course of creating a whole quality industry, meetings like that will come up, so that the process I've developed so far grows, gets more buy-in, and improves in quality. That's something else I hope will happen. And actually, with zero point six, at least within the group on the Telegram, I intend to let people into the conversation, which will be a bit different from previously, because with the other versions nobody cared.

The more of that that happens, though, the more it can become a bureaucratic nightmare of everyone talking and nothing getting done.

Perhaps, yes. There are always ups and downs.

I think what's different here, Rex, because, yeah, I was around when that earlier effort started, is the difficulty of creating a standard where everybody already has a bit of a standard of their own. There's a classic xkcd comic where it's like, oh, there are eleven disjoint standards, we need to unify them into one dominant standard, and then, oh no, we actually just have twelve standards now. I think that was the failing there. What's different here is there's just more action: you're doing something, you're putting it out, there's a number that gets people's attention. It's just a bit more proactive, and it's happening.

What's also promising is that the incentives seem aligned appropriately, right? It's a community-driven thing. You don't have this weird, almost secret relationship between the developers and the auditors, with the underlying marketing angle coming from the developer's side. There isn't an impetus to persuade the audience from your side in any way, shape, or form, which gives it a larger lean of credibility, in my opinion. That's what I was getting at.

I mean, as I said, it's by the users, for the users. That's the incentive.

What do you do when a user says, "go find the problems with this shit project, and I'll pay you more if you give it a worse score"?

Well, I haven't had that yet, but on something like this, credibility is the only thing you've got going for you. So if somebody wants to pay me to drop my credibility, it's not going to be inexpensive, because that would just ruin everything. And the other thing is, the way I've done the process, you don't really have room for that, because if somebody reads it, they can look at the actual results, and there's a grade, and almost everything is public. So you can't just make up different scores and make stuff up as much. That's the hope: anybody can go take a look at a report and take a look at what's public. With websites, I try to do a screen capture and put it in there with the link, so if it changes a week later, I can at least say: this is what I saw when I did it.

So it's more difficult for me to fudge a score dramatically, and I'm slowly refining, putting in more guidance question after question, so that there's as little of my opinion in there as possible. The audit reports are going to be one area where there will probably still be a bit of opinion: when you end up with an audit report that, when you read the detail, just doesn't look like the score reflects it, I'll try to put that into the score and I'll make a note. At least then it's a discussion on a public document that people can have.

All right, that's a great way to wrap up the episode. Where do people go to find out more, see what you're doing, read the process, and get in touch?

Everything is on defisafety.com, and the process is at docs.defisafety.com. If you're a developer, just pull up the process, pull up a blank audit, and fill in the blanks. If somebody fills in the blanks and sends it to me, I'll put it up on the website without any trouble; that's another way. And then jump on the Telegram channel and just help us out. And don't forget the Gitcoin grant; that is deeply helpful, for the people that do find this useful.

Thanks so much for doing this work. I think we've reiterated how much we find it helpful and how much we care about it. So keep it up.

Thank you very much. We'll see where it goes. Thanks. Cheers.
