Episode #1 - Intentions, Incentives, Crypto & Death
Available on Apple Podcasts, Spotify and Google Podcasts
I launched a podcast, and you can listen now wherever you get your podcasts. In the first episode, I chat with Angie Malltezi of Shipyard Software. We talk about how she uses mental mapping to break down complex problems, what it means to spend time well, and how to align your goals with the way reward mechanisms are structured in your life.
Kerman: Today we have Angie with us. Angie's from Shipyard, and I'll pass it over to her to give a quick intro and a brief summary of what we might be talking about. I think it's gonna be quite vague, but a lot of very interesting topics.
Angie: Hey, hey everyone. My name's Angie. Formally, I work at Shipyard Software and do all things operations and strategy, so managing all parts of the business. Informally, for fun, what I really love is philosophy. I'm a big reader, and I love to think about abstract ideas, which is what brought me to this space. I think Kerman's very similar, so I'm super excited to chat about all things abstract concepts.
Kerman: Sweet. Well, we were actually just speaking before around the topic of death, funnily enough, and how our time is so limited that we should really learn to cherish it: thinking at least in terms of some sort of timeframe that you expect to live till, so that you don't take the present for granted. Was that roughly it?
Angie: Yeah, I think that's a pretty good recap. Maybe you can tell us where the idea for the app you just showed me came from? Because I very much think about that internally, but you seem to have this physical reminder, which I think is really interesting.
Kerman: Yeah. So I've been thinking a lot about the iPhone and how it's this really powerful dashboard device. The widgets on iOS are these interesting pieces of information you can surface to your consciousness every single day, because you check your phone at least a hundred times a day.

There's one app that lets you create widgets for your screen. So I created a widget, a simple countdown timer. You take a simple assumption that you're going to die on some day, and you put your birth date in. Then it shows you a progress bar of how much time you have left. For me, at the start of this week I had roughly 2,917.6 weeks left, and now I'm at 2,917.3 weeks left.

And I'm like, ah, shit, that's 0.3 weeks gone, and there's no way I can get those back. I'd better make sure the next 0.3 weeks, I'm living the most that I can with that time. You can change it to weeks, months, years, whatever you prefer, but it has to feel real. It has to feel like you're burning this quantity as if it's a video game.
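The arithmetic behind a countdown widget like the one Kerman describes is simple to sketch. Here is a minimal, hypothetical Python version; the 80-year default lifespan and the function name are illustrative assumptions, since the actual app lets you pick your own expected end date.

```python
from datetime import date

def weeks_remaining(birth: date, today: date, lifespan_years: int = 80) -> float:
    """Rough 'life countdown': weeks left under an assumed lifespan.

    The lifespan default is an assumption for illustration; the widget
    app described above lets you choose your own end date.
    """
    # Approximate the assumed end date by adding whole years to the birth date.
    death = birth.replace(year=birth.year + lifespan_years)
    return round((death - today).days / 7, 1)

# Example: someone born mid-1995, checking mid-2022, assuming 80 years.
print(weeks_remaining(date(1995, 6, 1), date(2022, 7, 1)))
```

Running this weekly and watching the number drop by 1.0 (or 0.3 by mid-week) is exactly the "burning quantity" effect described above.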
Angie: That is so fascinating, because I do something similar as well, where I tell myself, and even my friends and family, whenever I sense they get slightly upset or short with me because they're frustrated with other stuff: listen, this might be the last time we actually have this conversation, so pay attention, be in the present.

You described something interesting that makes me think about how people value their own time, or the concept of what is deemed worthy. So for instance, when you saw that 0.3 of a week gone and said, "I better make sure I spend the rest of my time better," what does that mean to you?
Kerman: Yeah. So this comes into another system I've set up, which is getting religious about time tracking. That sounds nerdy, and it probably is.
Angie: Sounds like a lot of work.
Kerman: Well, you can actually build automated systems that make it less work, which is really nice. Whenever my phone recognizes that I'm in the car, because it connects to the Bluetooth, it will start a different timer. Or if I walk into the house and do a certain action, it will recognize I'm here. So technology augments a lot of these systems.

And I want to write an article about this. I call it the trifecta of living, which is intent, time, and energy. Everyone has intents they're trying to express in the world. But in order to express your intent, you need enough time, and you need to have the right energy in that time to express that intent. So when people say they're gonna die with regrets, it's because they have unfulfilled intents. Really, what you want to make sure of when utilizing your time is that your time is aligned with your intents, which ultimately creates purpose and a sense of fulfillment every day.

So it's more just a wake-up call: when that 0.3 reduces, you ask, have I gotten closer to my intents, or have I made no progress? Because if I've made no progress and I've spent time, then something's going terribly wrong.
Angie: That resonates so much with me, especially the word intent, because the three aspects you articulated, intent, time, and energy, are also some of the factors I consider important. And I'm curious to hear how you perceive what intention means to you. For me, I try to set an intention with every point of interaction I have with another human being, because I really value human relationships. So I set my intention before I enter a meeting, a room, a discussion, or before I perform some sort of action, just to center my awareness. What do you do? What's intention for you?
Kerman: Yeah, well, I think that's really cool. Those are the micro intentions that ultimately cascade up, and I'm trying to get better at setting those micro intentions, with meeting agendas and being really clinical, which is something that, as we said previously, we're trying to do. But I think a lot more about macro intentions. For example: I am trying to get more fit.

And then there's this huge tree, because it comes down to your diet, which comes down to your shopping, which comes down to your social calendar, which comes down to how often you exercise, the kind of time you exercise, what you're exercising, the nutrients your body needs for that exercise, your sleep pattern.

So that one macro intent touches so many complex systems that it's like: am I sure this macro intent is the one I actually want to proceed with? Because the earth will shake if I pursue this macro intention. And then I'd better make sure my time is aligned with making this huge macro intention actually successful.
Angie: Yeah, that makes a lot of sense, rationally speaking, especially because you're a data person, so I imagine that's a multi-variable, complicated problem you've articulated. That's so interesting. And when you came up with your intentions, where did that really come from? Because I perceive intentions as tied to value systems.
Kerman: Yeah, so this is one where I don't have a really tight framework yet, but my rough thinking is that it comes from values, as you said, plus experiences, insecurities, beliefs, and truths you've learned about your life or the world around you.

It's almost like the conscious mind takes all these inputs, and these inputs somehow get distilled into the intents that people have. Sometimes people don't know their intents, and that's the most dangerous thing, because not knowing your intents creates harm in the world. But once you're clear about your intent, it becomes clear to other people, which allows you to operate effectively.

And it also just makes you happier, because you're like, here's what I intend on doing. You can tell yourself that, you can tell everyone that, and you have nothing to hide.
Angie: Oh my gosh. Yeah, there are just so many things to unpack there.
Kerman: One thing I'd love to touch more on is what you said about meetings, about intent in a more professional sense: okay, I'm meeting with this person, here's the thing. Something I really admired when we first spoke was you saying you want to be intentional around, say, who our users fundamentally are. An issue we see in crypto is: how much value does a user bring, and what is the cost of getting that user?

It's something the space isn't intentional about, because it's "I intend on getting users, so I will pour a hundred million dollars into giving away free money." And it's like, wait, are you really being intentional? Are you really asking the right questions? It seems like you've done a lot of thinking around that, so I'd love to learn a bit more about how that thought process kicked off.
Angie: Yeah, that's interesting. We do something kind of unrelated but similar at work, called personal user manuals. I think I talked to you about it and walked you through our little framework to understand ourselves and help others at the company understand each other.

When I was doing mine, one of the things I've always said about myself, and about how I interact with others, is that I like to think very deeply about a particular problem. What I mean by that is I create a little diagram in my head of all the different levers and the interactions.

So very much: what is the ecosystem? For instance, in crypto, that could be the user, the product, the different interactions. And then I think about the different implications of the actions we take. So, you brought up the point around how we think about users and that alignment with intentions.
Prior to this call, I was actually asked by somebody on the team what it was like coming from web2 to web3, and that is very much a delta I see missing: sitting down with a pen and paper and writing down, you know, what do you hope might happen?

And if the intention is "I just wanna get rich," fine, that's a sufficient answer. There are mechanisms and processes and levers you can pull and invest in if that's essentially the intention. So I think one of the ways people can honestly do that, and think more deeply about whether it's users they want to increase, is to really work backwards.

For instance, with a business, it's very similar to setting a business strategy. You really think about what hypothesis you want to test, and you break it down very much the way you broke down "my goal is to get fit" with that little tree. It's a framework of asking yourself "why?" five different times to get to the root of an issue: why are we really doing this? What might this mean? What is the information we're getting?

That's how I process information. I tend to write everything down on a piece of paper and try to draw a diagram. And then one thing I do that takes it a step further, which I don't see a lot of people do, is the feedback loop. If I move a step forward, I ask myself: did we accomplish what we intended? That step seems to get missed, I think just because people are excited and want to keep moving forward, and that's fine. But maybe it's the scientific background that was drilled into me for ten-plus years of my life that makes me go back and assess the loop.
Kerman: That is amazing. There are so many bits I wanna bounce off. So before I bounce into a different area: what are some of those tools and systems you use to drill deeper? Of course one is a pen and paper, but do you draw mind maps? And then you keep asking questions, and that chain of questions is mapped out in a visual format? How do you explore your thinking in a way that manifests itself in your physical domain, or digital, even?
Angie: Yeah. Maybe I'll give some context and pick a specific business problem, which might help, though I apply this thinking to all aspects of my life.

For instance, I imagine most people set OKRs. They set a specific goal; let's say you want to grow your community, and you have three or four different social media channels. I feel that, yes, it's sufficient to say "I want to grow my community," but then you have to ask yourself a little bit further.

Why do you want to grow your Twitter specifically? What data do you have that validates the rationale of investing time and effort into growing Twitter specifically? Are your users there? Do people seem to relate? Is that specific to your industry?

So I just keep asking myself those questions, over and over again, until I find what the answer seems to be. And you can do this with your own team. You might get to the end and people say, "Well, I don't know, other people are doing it." And I'm like, okay, then that's not a good path; let's move to a different one. That's always my gut check for how I know we should stop going down a specific path, because at least for me, the answer "just trust me" or "other people are doing it" is just not sufficient. You have to ask yourself why you specifically want to invest resources, time, and team into, let's say, growing your Discord community. Does 5,000, 10,000, a hundred thousand members matter? At what point do you see diminishing returns, and does it matter to keep going down that path?
Kerman: Sure. So let's say the intention is "I want to grow my community," and the next question is, well, why? Because we want more users. Okay, why do we want more users? Because we want more revenue. So what is that chain of questions you run through? Let's say the intent is "I want to grow my community," a simple intent. What is the chain of reasoning that then proceeds to answer or question that intent, and what needs to be true for that intent to be fulfilled?
Angie: Yeah, that's a good exercise, and we can definitely do it together. But I think the other caveat we should add as a constraint is: what type of business are we in? Are we an NFT project where literally the only goal is to sell, where it's just gonna be a rug pull and that's the objective? Or are we, I don't know, a data-based business? Are we a project that's trying to experiment with different lines of business?
Kerman: I feel like the viewers of this won't be people who are into rug pulls.

Angie: Yeah, and we don't wanna give them more frameworks to support the MLM.
So: what are the steps you can take to really optimize getting that user? I should caveat that I have a neuroscience degree, so I've spent about fifteen-plus years studying the human brain and behavior. I feel like I have a slight advantage, because in my head I can go through every type of internal framework of how people learn and reason. So that exercise might be a little bit easier for me with my tree diagram.
Kerman: So, okay, let's choose one that's actually relevant to us, something I've been thinking about. Let's say it's a borrowing product: you lend and borrow money in some sort of DeFi product on some chain, and we wanna grow our community.
Angie: Yeah. So for the borrowing product, what are the levers that are most important to that specific product? Do you need liquidity? Do you need capital? Who do you need to attract? There are specific thresholds you need to meet, and that's how we can work backwards to optimize.

Kerman: Yeah. So in a borrowing product, or any credit facility, typically you find it's a marketplace, and one side of the marketplace needs to be stronger. That's typically borrowers, because those are the people who are risking money to do something with their money. So you want to focus on the demand side rather than the supply side, because it's easy to find a lot of people who are willing to lend $10 at 10%.
Angie: Interesting. Well, you introduced a new variable, the yield, and that is interesting as well. Let's say you wanna find people; the yield is another lever that attracts people.

Kerman: I think maybe it's, uh...

Angie: This kind of thinking is important.
This is part of the exercise, because it's not gonna be linear. Honestly, it's just an opportunity to question and really understand your business so well that you can say: if I pull this lever, let's say I specifically grow my usership, meaning people using the product. Then I do a little bit of market research, and I find out where the majority of people in the market cap of crypto are, where the users actually are. Let's say most of the users happen to be in India or in Africa. Then you can tailor literally your whole community strategy: the hours you operate, the hours you host events, the hours you post your tweets. You just shift your whole product from a North American view to the audience it relates with. And I find that's pretty sufficient to at least get you to the next step.
There's this really good book I used to read quite a bit called Just Enough Research. The author is a user researcher, and her life's work is around understanding people and products. She talks about getting just enough data: you don't need thousands or millions of data points; ten to twenty people is sufficient to get you a little step further.

So for instance, you can run mini experiments for your borrowing community to assess your users. Maybe for two weeks you only post tweets, or run your marketing plan, within the time zones or timeframes that seem optimal, like from 5 to 9 PM Asia time. Then you look at engagement, and if engagement is high... I don't think I'm suggesting anything relatively crazy.
Kerman: Yeah, I think it's a philosophy of thinking of everything in systems, fundamentally. First, you need to be able to see the system itself; most people don't see the system. Once you see the system: how do you collect information from it? How do you systematically organize that information? How do you build feedback loops to optimize the system toward where you want it to go? You can use this framework for quite literally anything, but it takes a very particular kind of thinking.

Angie: Yeah. Let's talk about the first bit you mentioned, seeing the system. There are tools, whether design-thinking tools or other methodologies, for seeing the system. I use a stakeholder map, and I can do it in my head now. I basically draw all the people on a piece of paper, though I can also do it visually, mentally, and I draw the relationships between all the people. That's what I do when I first try to understand a business, or even when I join a new company: who are the people? What are you all responsible for? And that's good enough for me to know how to do my job and who to interact with for what.
Kerman: Well, for me, seeing the system comes down to a more qualitative approach: going to people, or books, or articles that can see the system, and taking the view that I know nothing about this. Let me learn from their collective wisdom and experience of how they see the system, then find commonalities in how a multitude of different people view the system, to create the most factually accurate representation of it.

Angie: Oh, so you're in the data-gatherer bucket. Like, "let me gather all the data; the assumption is I know nothing." I'm very much, "I know 10%, let's just move forward."

Kerman: I assume I'm pretty dumb and I don't really know much, and I somehow just make it through life sometimes.
Angie: That is such a great philosophy. Honestly, I think that's so great because it's the perfect way to always stay humble and learning. I find that the moment you presume you understand people very well, or understand a system very well, you stop learning. And honestly, that's what you're taught to do in consulting: research, talk to a bunch of experts in a very short burst of time. You cap your research gathering and fine-tune your data sources, so you can do primary, secondary, et cetera.
Kerman: Yeah. There's someone who I'll probably bring on here, and the two values he lives by are empathy and humility. He's the co-founder of Wire. We hang out and chat a lot, and I have some phenomenal conversations with him.

I really wanna bring him on air. In fact, I bought two lapel microphones so that the next time we go on a walk, we can record that conversation.

Angie: Oh, that's so cool. That's a great idea.

Kerman: Yeah, it's so simple: just get two wireless microphones, and if you feel like you're having a great conversation, it's like, "Hey, we're going into a vortex, here's the mic." But yeah, it's humility and empathy. Humility is the acceptance to hear what other people are saying, and empathy is to actually feel what comes into your ears, fundamentally. With that model, you create a better framework for understanding how the world actually works.
Because we think we know how the world works, and we have assumptions, but until you actually get the data on how it really works, you're just playing the wrong game. It's happened so many times, even to me: I was like, oh, I think reality looks this way, and then I get data points that completely contradict the way I viewed reality, and I'm like, holy shit, I'm so dumb, consistently. So I've just had this beaten out of me, and I've accepted I'm actually not that intelligent at all about a lot of things.
Angie: No, I think that's a very healthy way to assess life, and a healthy perspective. I always tell myself that the world is made up and nothing is really fixed; the rules are forever changing. So don't be too serious, you know, don't get too emotionally bought in to every type of situation, interaction, or outcome.

I really like that person's philosophy of identifying humility and empathy. Because, and maybe you've noticed this, something I see happen quite a bit in society is the delta between people's perception of how they want reality to be versus how it is. Even with, say, two people: sometimes I'm on calls where two people are having a conversation and they're sort of disagreeing, but they're both actually saying the same thing. Both people aren't really taking the time to feel, you know: let me just take a breath, let me just listen, what is happening?
Kerman: Yeah. And I think what's really dangerous is that technology fundamentally allows for hyper-scaled intentions. In crypto, there's a lot of money at stake, and money is just energy, and intentions are energy. So you have these hyper-scaled intentions in the form of money, a type of energy people can very viscerally see. Their intentions may be the same, but they're communicated in the wrong ways, and that creates destruction at scale. Because of that disharmony, you now have these two hyper-leveraged energy sources going against each other, when really they're agreeing with each other. They're actually on the same side, but because the tools of amplification are so large, that one-degree difference now puts them on different sides.
Angie: What's an example of that, so I can picture it?
Kerman: Yeah. So let's take the Terra example. I think there are two sides: people who believed it was clearly unsustainable and were calling it out, and then, well, there are many sides, but I'd say another faction of people who believed in what Terra was going for and thought it could actually work out, despite some problems in the system.

They both fundamentally agree that there's a problem, and they both have an intention to solve it. But because of that one-degree difference, plus hyper-scaled technology for those intentions, it becomes "I'm going to defend Terra at all costs and shut down all the haters" versus "all these people are stupid because they're brainwashed into thinking Terra's great." Their intentions actually come from the same place, because they're both trying to fix or call out a problem. But because there's so much money at stake, and because a medium like Twitter is hyper-scaled, that one-degree difference in intention becomes "we're on different sides and we're gonna kill each other."
Angie: Yeah, you've touched on such an important observation about humans in particular. It reminds me of the point I think Malcolm Gladwell made in the book Talking to Strangers: slight miscommunications lead to very drastic outcomes and results that people didn't really anticipate.

So, on the topic of intention and technology magnifying it: humans aren't perfect. We're these emotionally wound-up species where eating a lunch that disagreed with us, or stubbing a toe, or waking up on the wrong side of the bed, can put us in a bad mood and lead to drastic decisions based on something that trivial. Now introduce technology, and it gets magnified; now introduce money, which is another massive amplifier, because that's people's livelihood. So, I don't think there's a solution, but I wonder: how do you think about being more intentional in the business that you operate in?
Kerman: Yeah, I think this is where, collectively as a crypto space, there's this huge identity crisis. There have been misaligned intentions of "how do I get rich quick" versus building a product, and the intention of getting rich quick ultimately wins out, because it's actually easier to get rich quick than it is to build something good and useful. Everyone's playing this game of how to get rich quickly, and the problem is that we don't actually advance forward. I like to think of crypto as a frontier society: if you wanna understand how humans will operate in the future, basically look at crypto Twitter. It's all there.

But if we're operating as a frontier society and we're all just playing the same recursive game of how to get rich quick, which is "extract as much value as you can, as quickly as possible, before anyone else can," we're actually gonna be stuck at the frontier.

And what's even worse, we have a negative perception in the rest of society: "oh, these people just like to make money quickly and scam people." That's so far from the truth. Once you get into crypto, it's the best thing that ever happened to you, hands down. I appreciate what this industry has given me and how it's impacted my life, and being able to scale that to everyone else, to the rest of society, is a gift we can give. But if we're stuck playing these games, and we're not creating real value or trying to have the intention of creating something that matters, then there's a really large systemic problem here.
Angie: As you've been chatting, I've been writing things down. I have a little chart that says "incentives" and "intentions," and I'm wondering, since I told you about my stakeholder map: what are some of the incentives that we've seen? If we think about crypto as an ecosystem, with incentives driving behavior, some of the things I have written down are airdrops, tokens, FOMO, the desire to participate (that's an intrinsic, societal, behavioral need), and identity, the desire to be different. A lot of the people originally into Bitcoin early on felt on the outskirts of society, and that became part of their identity. So what are some other incentives that you think contribute to the complexity of this ecosystem?

Kerman: Liquidity, and who gets liquidity when, is the biggest one I have gripes with. Investors want liquidity in less than a year, teams are expected to have liquidity within four years, and users want liquidity today. And it's like, what the fuck? That does not make any sense. This is so fundamentally broken to anyone who's ever created anything of value.
The first year is just you trying to see what the hell is going on. The second year is like, oh, okay, now that we can actually see, maybe we can create something that solves this thing. The third year is, ah, now we can do more of the thing we learned in year two. And by year four, it's congratulations: you've now actually created value for society, and you're pulling in a reward in the form of monetary energy. That's typically how value gets created and distributed into the world.

But what happens right now with the incentives? You basically project where you're gonna be in four years, people price you where you'll be at four years, and investors have liquidity today at that four-year price, having invested at a point much lower than where it is today. They will basically sell the today in exchange for the future, and the price goes down. People get disillusioned because they think the value has gone down, but price and value are not the same thing. Most people think price equals value, so if price goes down, value goes down, morale goes down. All of this has gone down, but you still have to build value up here. So you just end up with these continual death spirals, where any form of innovation gets completely gutted, because you have these huge systemic issues with the incentives playing out in real time.
Angie: Like, A-plus. Cause I think, you know, the concept that you mentioned, and even this death spiral, is also very native to public companies. So, like, companies that have stocks. Correct. And one of the analogies that I use navigating the crypto world is I very much treat each project as if
it were operating as a public company, with token owners acting as the shareholders. And then I think about the same values and properties, because it's the same thing: valuation, what it's worth, the price of the stock decreasing because somebody shorted it, and then all of a sudden you lose trust.
It's fragile. We're like a fragile human ecosystem. Um, so kind of coming back to some of those things that you articulated, you know, investors want their money within one year, users today, and employees within four years. Mm-hmm. Just based off of that, I feel like there are probably, whether it's legal frameworks or even how people are thinking about airdrops, ways to tweak those three things that you had articulated, those little concepts.
Mm-hmm. For instance, in my mind, you know, with employees, if the liquidity that you're referring to is your equity vesting over four years, is that right? Yep. Correct. Then I think there are models where you don't have to make equity vest over four years, or maybe you actually compensate your employees an appropriate salary that's market rate, not startup rate,
knowing that the percentage of that equity might mean absolutely nothing. And, oh my gosh, we should do a whole episode on equity, cause I've been disillusioned with equity and I do not think people understand equity. I feel like it's the most fundamental concept; they should teach it in high school.
It literally shapes the decisions and the careers that you make, and people are forever chasing these pipe dreams where they'll never really get that payout. Um, separate rant, but for employees, like, you know, you can minimize that four-year gap. For users, I think, just speaking as kind of an outsider, cause I've been relatively new to the crypto space.
I feel that airdropping has had harmful implications for both projects and the user base. Agreed. Because it doesn't give projects sufficient data to understand product-market fit. Correct. Cause all of their users are using them for all the wrong motivations.
Kerman: Yeah, completely. And that's it: incentives misaligned with intentions. And what
just baffles me, what gets me so frustrated, is: we have the most powerful, expressive financial technology available in the past 2000 years of human civilization, and all people can think about is how to extract short-term value. Like, do you not understand? This is a multi-thousand-year...
Angie: That's like a human thing.
Oh my gosh. You're now trying to, like, literally solve for the way that our brains are designed.
Kerman: Every day I wake up, what I tell the teams every time is, here's a once-in-a-multi-thousand-year opportunity. And if your extent of viewing this is, how do I get the latest airdrop, how do I pump my token, it's like, I'm sorry, you're just stupid. There's no other way to put it. That's just...
Angie: Strong sentiment.
All right, I'm gonna maybe massage that sentiment a little bit. And the reason that I say that is cause, like, I try to go back to that empathy concept. Like, I have so much empathy for the user base. Right. And so we talked about the world as it logically should be, cause you're right:
the logical model that you're proposing is rational and objective. But the fundamental thing that you need to understand about people is that they are so irrational. Exactly. And as such, you have to design a business where the logical and the irrational can somehow
reach an equilibrium.
Angie: It goes back to incentives, right?
Kerman: Right. You know that we're irrational, and you know what should be true, and incentives are the glue that bridges the two.
Angie: Yeah, that's accurate. Um, one of the things that I did earlier on in our project, for community, when I wanted to figure out who potential master-program members were, was I created a survey.
Yeah. And I wanted to incentivize kind behavior. So one of the questions that I asked on the survey, and we didn't end up using it, but I was just astonished with how many people filled it in, was, you know, what is something nice that you've done for another person without expecting anything in return?
Huh. And 4,000 users filled out that survey. Wow. Which is crazy. And part of it was, like, so beautiful, the humanity of it. And the other part was like, oh no, I understand why the world is the way that it is. So it's interesting how people perceive things. And the main thing that I was trying to go after was, I wanted to incentivize the behavior of people who felt that they wanted to help others
for genuine reasons, because if you're kind to one person, that person will be kind to another; and if you go punch another person, then, like, cycles of trauma get repeated. Yeah. And so I thought about, how do you incentivize people helping other people? Well, you try to look at the behaviors that you value.
And so if the value system is, like, rational belief and thought, then you try to find mechanisms to give these people opportunities to amplify their specific message. And so when we talk about incentives for users, you know, what do you think projects should do? Is it in the context of, like, airdropping? Is it in the context of, you know, how do you incentivize what you perceive as positive behavior and detract from the other kind of stuff?
Kerman: Yeah. Well, I think, fundamentally, what is the intention of the project itself? Cause that's the first place: is this project trying to extract money as quickly as possible? And if that's the case, then you should yield farm, you should airdrop, and you should do exactly what you do today in crypto. Mm-hmm.
Um, so in that case it's kind of an answered question. Um, but it's like...
Angie: That's such a big ask for people, though. Like, self-awareness is, like, the minority of the world.
Kerman: And sometimes people know, or they're just, like, not sure. Maybe their intentions are guided by how hard they think it is.
Um, so if you can show, hey, if your intention is this, then here's the pathway to it, that makes people comfortable. And I think, having empathy for the current state, unfortunately, the way investors are thinking creates an incentive where it's easier to have your intention be to make money quickly than
to create long-term value. So rationally, as we said, a once-in-2000-years opportunity should focus on long-term value, but the short-term incentives, and what everyone else is doing, say: actually, do this instead. Mm-hmm. So if we can design better incentives, we can actually make it easier and more fun to play long-term games with each other.
But I think one problem is, the tools to play long-term games don't actually exist right now. Or people haven't recognized that they can exist, fundamentally. Like, just to make this more tangible, think about an airdrop. An airdrop is like, hey, we are distributing the cap table to users.
Um, but it's then thinking, well, which users do we want? Why do we want these users? How are these users going to add value? What are the expectations we wanna set? And as soon as your intentions are clearer, you can then start to craft and ask better questions.
Like, I love the Optimism airdrop for that very reason. You can tell their intentions are not to get rich quick, because they're like, we're doing this airdrop in a very thoughtful way, we're gonna have an actual team to take care of this. And you're like, okay, these guys are here for the long run.
Um, and I think projects are the ones that set the tone that the users will then follow, cause users, at the end of the day, have been bought into the promise of crypto. Um, and having empathy: here's this new magical, exciting place, and they're gonna learn the rules that give them the most fun, money, and status, fundamentally.
So projects are the incentive designers, and investors are the people that fuel the fire that the projects decide on.
Angie: You really highlight the being thoughtful, the being very intentional, you know, parsing through your data.
So for example, the Optimism team: I've met their team, and they're exceptionally thoughtful about the decisions that they make. They use a ton of data. And, you know, I'm not surprised that you referenced them. So I think, yeah, you're definitely right. And, like, where are more of those people?
Um, though I've seen, like, crypto Twitter... I was telling someone, people just always seem so angry, no matter what. Like, the OP drop, it wasn't perfect. Um, I was like, all right, well, you can't just please the world.
Kerman: Indeed. But anyway, thanks a lot, Angie.