“You say you want a revolution // Well, you know, we’d all love to change the world” - The Beatles, 1968
Finance isn’t everything. Even if fully performant, zero-knowledge, trustless (i.e., permissionless and censorship resistant), low-cost, low-latency and Turing-complete computer networks are assumed, there are many other pieces of the puzzle beyond the network-native financial stack needed to facilitate a full-fledged trustless economy (e.g., wallet recovery processes, lots of middleware, hosting).
However, as a part of the whole, DeFi’s role is clear: allow anyone, anywhere, anytime to conduct arbitrary financial activity, without the need for a centralized intermediary. Decentralized Finance is an apropos name. DeFi promises the ability to permissionlessly compose arbitrary trades and loans, and offers to trade and lend, with customizable counterparty risk. Everything else is outside the scope of DeFi.
Despite its promise and potential, DeFi has failed to facilitate real economic activity (no, yield farming doesn’t count). With the exception of counterpartyless leverage (e.g., LUSD, GMX), state of the art DeFi protocols are dramatically uncompetitive with centralized alternatives in terms of capabilities, cost to use and confidentiality. In order to compete, DeFi will need to offer alternatives to CeFi without compromising on trustlessness at any point in a transaction.
The lack of progress, collective organization, customizability and composability in DeFi to date is greatly concerning given the amount of investment in the space. The disunity of current DeFi development is primarily a result of a lack of consensus around how to fund open source development, particularly around the appropriate roles of tokens in funding the development and maintenance of protocols that create composable, censorship resistant networks.
Below are thoughts on where DeFi is, where it is going, why current token models have caused progress to be so slow and how to collectively move faster.
“I’ve never seen so many men wasted so badly.” - Man With No Name, The Good, the Bad and the Ugly, 1966
The Good: Over the past few years, scaling solutions and zero knowledge technology have made significant strides forward in the capabilities and capacity of trustless computer networks. Zero-knowledge, trustless, low-cost, low-latency and Turing-complete computer networks able to compete with centralized alternatives appear fast approaching. The potential effects on society of such networks remain as infinite and exciting as ever.
The Bad: The capabilities of smart contracts have failed to make using trustless computer networks for any real economic activity practical. Trustlessness is a least-common-denominator problem: a system is only as trustless as its least trustless component. Even if networks and all of their associated infrastructure are permissionless and censorship resistant, they will continue failing to attract any real economic activity until they can be used in ways that are competitively performant with centralized alternatives without compromising on trustlessness at any point in the process. To date, companies that build on trustless primitives cannot compete with those built on centralized ones.
The Ugly: The current development ethos of DeFi protocols is unlikely to create the utility necessary to make using trustless networks for real economic activity practical any time soon. For a community that is supposed to champion trustlessness and free market competition, with so much to do and so little accomplished, there are an awful lot of fences being put up. Uniswap and Curve are publishing code under restrictive licenses. Curve is making trustless interoperability harder, not easier. Leading borrowing and lending protocols are not permissionless. OpenSea hasn’t open sourced their off-chain order book nor indicated any intention to do so, and has in fact begun censoring it.
Lack of consensus (or even discussion) on the best way to develop composable financial primitives, particularly with regard to the roles of tokens in networks and in sustaining economic models for the development and maintenance of those networks, has greatly stagnated development and innovation and enabled scams and frauds.
“They are one person // they are two alone // they are three together // they are for each other” - Crosby, Stills & Nash, 1969
Every financial transaction between separate parties is an exchange. How and what to exchange are DeFi’s pieces of the puzzle to solve. Trades and loans are the primary exchanges currently happening off-chain that need to be competitively facilitated on-chain. While it is harder to imagine on-chain trading experiences directly competing with off-chain ones in a vacuum due to the cost of updating orders, it is easier to see how the transparency and composability of on-chain lending present a dramatically better alternative to centralized solutions. The integration of composable trading and lending protocols is a prerequisite to unlocking DeFi’s potential.
There are three interrelated pieces of the DeFi stack: (1) protocols to exchange things, (2) protocols to create arbitrary things to exchange and (3) protocols that integrate (1) and (2). Until each piece of the DeFi stack offers a minimum set of functionality competitive with off-chain financial systems, DeFi will fail to attract off-chain real economic activity to move to a trustless tech stack.
While each of the three components may never be finished, they can be architected with the other pieces in mind to maximize the composability and security of current implementations while minimizing friction and risk to implementing improvements.
The current state of decentralized trading and lending protocols is a microcosm of the problematic ethos of DeFi development to date. Smart contracts could be used to create new, flexible and composable trading technology that better facilitates arbitrary economic behavior for all users because of, not in spite of, the inherent benefits of a provably fair permissionless execution environment. Instead, developers are stuffing users into stiff, non-composable protocols that do not solve real problems and that prioritize rent collection by developers and investors at the expense of utility, composability and innovation.
“You can’t always get what you want // but if you try sometimes // you’ll find // you get what you need” - The Rolling Stones, 1969
In theory, DEXs can create a radically better UX than centralized alternatives: for takers, by allowing anyone to trade/borrow anything at any time against/from every other market participant at once, at the best price, for the cost of gas; and for makers, by allowing entire trading strategies to be encoded on-chain in a single order that reduces the toxic flow they receive to zero. In practice, modern DEXs are not profitable for market making, impose a misguided mandatory fee on takers, limit tradable/borrowable assets and asset types and lack composability with other network-native protocols.
Network effects on exchanges are incredibly powerful; liquidity begets more liquidity. All exchange flow is either price discovery (i.e., non-toxic flow) or arbitrage (i.e., toxic flow). Market makers are competing to maximize how much non-toxic flow they receive and toxic flow they send while minimizing how much non-toxic flow they send and toxic flow they receive. Exchanges are competing to attract market makers to create liquidity that attracts other users (i.e., non-toxic flow). More liquidity → more users → more non-toxic flow → more potential profit for market makers → more market makers → more liquidity → etc. Without the ability for market makers to profitably provide liquidity, DeFi cannot work.
The primary theoretical obstacle to decentralized exchanges competing with the centralized exchange experiences built for market makers is the inability to update orders on permissionless and censorship resistant networks quickly enough to avoid toxic flow, and no improvement to the networks themselves will solve it. In practice, the nature of the execution environment (i.e., MEV) forces market makers on DEXs to forgo the ability to update their orders to avoid toxic flow. Therefore, even if updates could be performed without cost (which they cannot), orders will never be updated in time to prevent them from receiving toxic flow. The result is that DEXs consistently receive the most toxic flow. Large losses for providing liquidity on DEXs disincentivize market making, which leads to low liquidity and worse prices for traders.
The inability to update DEX orders to avoid toxic flow makes facilitating profitable market making on-chain quite difficult. Current solutions to the problem are even less promising than they are composable, and can be summarized as accepting all toxic flow and attempting to cover it up with trading fees imposed on takers. Fees paid to makers don’t cover the losses from the toxic flow and create friction in price efficiency for takers and oracles. With the exception of Curve V2, there has not even been an attempt to reduce toxic order flow accepted by DEXs.
Over the past few months we have been working with other members of the Beanstalk DAO on the development of a new DEX that will facilitate (1) the creation of arbitrary orders that can encode sophisticated trading strategies on-chain, such that the only time an order must be updated is if the trading strategy itself is being changed and (2) use of the liquidity in orders in other network-native protocols in a composable and customizable fashion. You can read more about DEX developments here.
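As a rough illustration of (1), consider an order whose price is an arbitrary function of on-chain state rather than a fixed limit. The Python sketch below is purely illustrative; the names and interface are hypothetical, not the DEX's actual design.

```python
# Illustrative sketch only (hypothetical names, not an actual DEX interface):
# an order whose price is a function of on-chain state, so it only needs to be
# replaced when the strategy itself changes.
from dataclasses import dataclass
from typing import Callable

@dataclass
class StrategyOrder:
    sell_token: str
    buy_token: str
    max_size: float                         # maximum amount of sell_token offered
    price_fn: Callable[[dict], float]       # buy_token per sell_token, from on-chain state

def quote(order: StrategyOrder, state: dict, amount: float) -> float:
    """Amount of buy_token a taker pays for `amount` of sell_token right now."""
    assert amount <= order.max_size, "requested amount exceeds order size"
    return amount * order.price_fn(state)

# Example strategy: always quote 10 bps above an on-chain pool price, so the
# quote tracks the market without the maker ever updating the order.
spread_order = StrategyOrder(
    sell_token="ETH",
    buy_token="USD",
    max_size=100.0,
    price_fn=lambda state: state["pool_price"] * 1.001,
)

print(quote(spread_order, {"pool_price": 2000.0}, 1.0))  # ~2002.0
```

Because the quote is recomputed from on-chain state at fill time, the order tracks the market automatically and only needs to be replaced if the strategy itself changes.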
“If he should break his day, what should I gain // By the exaction of the forfeiture? // A pound of man's flesh” - Shylock, The Merchant of Venice, 1596-1598
Unlike the exchange, there are no theoretically unsolvable problems created by the network layer that network-native lending protocols must work around in order to create user experiences competitive with centralized alternatives. Permissionless borrowing and lending stands out as an area where DeFi should be competitive with CeFi, given the ability to codify every term of a loan in a provably enforceable smart contract. Moreover, the ability to codify counterparty risk and downside limits for each loan presents the potential to radically outcompete CeFi.
In some respect, every smart contract that holds value and mints tokens (or doesn't) that carry some claim to the value it holds is a lending contract, where the minted token (or account balance) is a representation of a loan to the contract. The terms of the loan and the accepted collateral are encoded in the contract that issues the token. This definition fits wrapper contracts and liquidity pool contracts just as it does borrowing and lending or synthetic asset contracts. In every case, a token that has some value according to the contract that issues it is redeemable according to the rules of that contract. Loan terms and acceptable collateral are two somewhat independent parts of a loan.
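As a toy illustration of this definition (Python, hypothetical and not modeled on any existing protocol), a simple wrapper contract is a loan to the contract whose terms are "redeem pro rata at any time":

```python
# Toy illustration of the definition above (not any existing protocol): a wrapper
# contract that holds value and mints tokens redeemable against it is a loan to
# the contract whose terms are "redeem pro rata at any time."
class WrapperContract:
    def __init__(self):
        self.reserves = 0.0    # value held by the contract
        self.supply = 0.0      # tokens minted against that value
        self.balances = {}     # each holder's claim (the "loan" to the contract)

    def deposit(self, account: str, amount: float) -> float:
        minted = amount if self.supply == 0 else amount * self.supply / self.reserves
        self.reserves += amount
        self.supply += minted
        self.balances[account] = self.balances.get(account, 0.0) + minted
        return minted

    def redeem(self, account: str, tokens: float) -> float:
        assert self.balances.get(account, 0.0) >= tokens, "insufficient balance"
        payout = tokens * self.reserves / self.supply   # the encoded redemption terms
        self.balances[account] -= tokens
        self.supply -= tokens
        self.reserves -= payout
        return payout

w = WrapperContract()
w.deposit("alice", 10.0)        # alice lends 10 units of value to the contract
print(w.redeem("alice", 4.0))   # 4.0 redeemed according to the contract's rules
```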
Despite their seemingly infinite applications, the flexibility and utility of current smart contracts that facilitate the creation of loans are as lacking and non-composable as the DEXs that support trading the loans. To some extent, this seems downstream of the fact that there are no composable exchange primitives that make it easy for loan creation protocols to exist on their own. Instead, current DeFi lending protocols have had to implement their own versions of popular exchange protocols with no additional functionality. However, the total lack of customizability and composability in loan terms and acceptable collateral is evidence of the same ethos plaguing decentralized exchange development.
Loan terms can be arbitrary and modular sets of fee structures (e.g., fixed rates, variable rates, 2/20 fund structure), settlement structures (e.g., periodically, conditionally, distributions, redistributions), redemption policies, acceptable uses of capital in lending contracts, acceptable downside limits (e.g., loss of value by the lending contract, duration mismatch) and liquidation guarantees. Similarly, collateral terms can codify sets of acceptable collateral to use to borrow from the contract. A sufficiently composable architecture means that neither the implementation of a loan term nor that of a collateral term (or type of term) needs to be repeated.
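One way to picture such an architecture is as a library of small, reusable term objects that loans compose rather than reimplement. The sketch below is illustrative only; every type name is hypothetical.

```python
# Illustrative sketch of modular, reusable loan and collateral terms
# (hypothetical names, not any existing protocol's data model).
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class FeeTerm:
    kind: str                           # e.g., "fixed", "variable", "2/20"
    rate_fn: Callable[[dict], float]    # rate as a function of on-chain state

@dataclass
class SettlementTerm:
    kind: str                           # e.g., "periodic", "conditional"
    due_fn: Callable[[dict], bool]      # whether settlement is currently due

@dataclass
class CollateralTerm:
    token: str
    max_loan_to_value: float            # downside limit per unit of collateral

@dataclass
class LoanTerms:
    fees: List[FeeTerm] = field(default_factory=list)
    settlements: List[SettlementTerm] = field(default_factory=list)
    collateral: List[CollateralTerm] = field(default_factory=list)

# Because terms are modular, a new loan can reuse an existing FeeTerm or
# CollateralTerm without reimplementing it.
fixed_5pct = FeeTerm(kind="fixed", rate_fn=lambda state: 0.05)
eth_collateral = CollateralTerm(token="ETH", max_loan_to_value=0.8)
loan = LoanTerms(fees=[fixed_5pct], collateral=[eth_collateral])
```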
We have also been working on an early implementation of a Loan Generation Factory (LGF) (v0 under audit) that will eventually be able to facilitate the creation of arbitrary smart contracts that issue arbitrary loans (tokens) against arbitrary collateral, such that (1) the collateral in the loan can be redeemed by the token holder according to the terms of the loan encoded in the contract that issues the token and (2) loan terms and collateral terms are bifurcated and composable, which allows for both to increase in sophistication independently and in parallel. You can read more about LGF developments here.
“There’s this huge space between us, and it just keeps filling up with everything that we don’t say to each other.” - Mr. and Mrs. Smith, 2005
Composing a liquidation, obligation and hedging engine can theoretically create a financial system with infinite capital efficiency (i.e., a 0% reserve ratio) without a lender of last resort.
In order for loans to be issued against collateral that is not denominated in the same unit as the obligation, the loan contract needs a way to evaluate the value of the collateral relative to the obligation. Capital efficiency in network-native loans issued against collateral that is not in the same unit as the obligations depends on the ability of the issuing contract to liquidate or hedge collateral in order to prevent it from taking on bad debt. Capital efficiency for loans is the ratio of the value of outstanding loans to the value of the collateral used to issue them, i.e., the reciprocal of the reserve ratio. Whereas modern banks, backed by a lender of last resort (i.e., central banks), run on a 0% reserve ratio, which theoretically enables infinite capital efficiency, trustless lending protocols are all highly capital inefficient because the loans they issue are significantly overcollateralized.
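Using the definition above, the relationship can be written compactly (our notation, not a standard accounting identity):

$$\text{capital efficiency} \;=\; \frac{\text{value of outstanding loans}}{\text{value of collateral backing them}} \;=\; \frac{1}{\text{reserve ratio}}$$

For example, a protocol that requires 150% collateralization runs at a 150% reserve ratio and a capital efficiency of roughly 0.67, while a 0% reserve ratio implies unbounded capital efficiency.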
The only limit on the capital efficiency of trustless lending protocols is the potential for liquidation conditions to change before a liquidation can take place. Because network-native DEXs do not lend themselves to integrations with lending protocols, lending protocols are unable to ensure that liquidation conditions will not change in ways that cause the contract to accrue bad debt, and therefore require massive liquidity buffers, which creates massive capital inefficiency.
The complete integration of trustless and composable DEXs and LGFs to guarantee that the contract that issued a loan cannot accrue bad debt has the potential to create a trustless financial system whose capital efficiency is competitive with systems backed by lenders of last resort. In theory, the value a given loan contract can ascribe to given collateral is the maximum of (1) its liquidation value on-chain, (2) its obligation value and (3) the maximum liquidation value of its obligations. (1) can be determined by a liquidation engine that integrates DEXs and LGFs through stop-loss orders. (2) can be determined by an obligation engine that integrates LGFs with themselves. (3) can be determined by a hedging engine that integrates (1) and (2).
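Restated compactly in our own notation (not a formula from any existing protocol), where $V_{\text{liq}}$ is on-chain liquidation value and $V_{\text{obl}}$ is obligation value, a loan contract can ascribe to collateral $c$ the value:

$$V(c) \;=\; \max\Big(\,V_{\text{liq}}(c),\; V_{\text{obl}}(c),\; \max_{o\,\in\,\text{obligations}(c)} V_{\text{liq}}(o)\,\Big)$$

The first term is supplied by the liquidation engine, the second by the obligation engine and the third by the hedging engine that integrates the two.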
Despite the potential for dramatically improved capital efficiency from the integration of trading and lending protocols, such integrations remain almost nonexistent. Curve’s LLAMMA is the first application of such an integration. However, in line with the MO for DeFi development, the LLAMMA is only usable on Curve’s AMMs to borrow a single synthetic asset.
The only thing necessary to implement a liquidation engine that integrates a DEX with LGFs is for the DEX to support stop-loss orders. Loan contracts can place stop-loss orders upon issuance of a loan that ensure the collateral can be liquidated without the contract accruing bad debt. Such an integration can increase capital efficiency by reducing the reserve ratio to 100%.
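A minimal sketch of that flow, assuming a hypothetical DEX that accepts stop-loss orders (all names and the safety buffer are illustrative):

```python
# Minimal sketch (hypothetical DEX interface and parameters): the loan contract
# places a stop-loss at issuance sized so that, if it fills at or above the
# trigger, the proceeds cover the debt and the contract never accrues bad debt.
from dataclasses import dataclass

@dataclass
class StopLossOrder:
    sell_token: str       # the collateral
    buy_token: str        # the unit the obligation is denominated in
    amount: float         # collateral to sell if triggered
    trigger_price: float  # obligation units per collateral unit

def issue_loan(collateral_amount: float, collateral_price: float,
               debt: float, safety_buffer: float = 0.05) -> StopLossOrder:
    """Size the stop-loss so liquidation at the trigger price repays the debt plus a buffer."""
    trigger_price = debt * (1 + safety_buffer) / collateral_amount
    assert trigger_price < collateral_price, "undercollateralized at issuance"
    return StopLossOrder("ETH", "USD", collateral_amount, trigger_price)

# Borrow 1,000 USD against 1 ETH priced at 2,000 USD: the stop triggers at ~1,050 USD/ETH.
order = issue_loan(collateral_amount=1.0, collateral_price=2000.0, debt=1000.0)
print(order.trigger_price)  # ~1050.0
```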
An obligation engine that allows loans guaranteed not to accrue bad debt to be used as collateral for other loans can reduce the practical reserve ratio of permissionless lending protocols below 100%. To do so, an interface is needed that allows loan contracts to communicate to one another the liquidation guarantees and obligations associated with the tokens they issue. Each loan contract can return the minimum return values and denominations over time guaranteed by ownership of the tokens, or sets of tokens, it issues. Because the calculation of minimum return values and denominations can be composed exclusively from loan terms, contracts that reuse loan terms can also reuse the associated logic for the obligation engine.
A most basic example of the capital efficiency created by such an obligation engine would be the use of an ETH future payable at block X as collateral to issue another ETH future payable sometime after block X.
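A sketch of what such an interface might look like, using the ETH future example (the interface, names and block numbers are hypothetical):

```python
# Sketch of an obligation-engine interface (hypothetical names and numbers):
# each loan contract reports the minimum amounts, denominations and times its
# tokens guarantee, so other contracts can accept those tokens as collateral.
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class GuaranteedReturn:
    denomination: str   # e.g., "ETH"
    amount: float       # minimum amount guaranteed per token
    by_block: int       # guaranteed to be available by this block

class EthFuture:
    """A token guaranteed to pay 1 ETH at its maturity block."""
    def __init__(self, maturity_block: int):
        self.maturity_block = maturity_block

    def guaranteed_returns(self) -> List[GuaranteedReturn]:
        return [GuaranteedReturn("ETH", 1.0, self.maturity_block)]

def can_collateralize(collateral_token, obligation: GuaranteedReturn) -> bool:
    """A token can back an obligation if it guarantees at least as much of the
    same denomination no later than the obligation comes due."""
    return any(
        g.denomination == obligation.denomination
        and g.amount >= obligation.amount
        and g.by_block <= obligation.by_block
        for g in collateral_token.guaranteed_returns()
    )

# An ETH future payable at block X can collateralize one payable after block X.
X = 1_000_000
print(can_collateralize(EthFuture(X), GuaranteedReturn("ETH", 1.0, X + 100)))  # True
```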
The most sophisticated piece of this puzzle is the development of a hedging engine that allows for loan contracts to communicate with one another about how to hedge the obligations of tokens they issue using other tokens. Instead of liquidating the collateral for the obligation itself, lending protocols can hedge their exposure using any of the other loans that have obligations that are greater than or equal to its obligations, with less than or equal duration. The arbitrary nature of loans adds significant complexity to the problem. However, a hedging engine that enables autonomous hedging in the theoretically most efficient fashion will facilitate a radically and iteratively more capital efficient, less risky financial system than ones built on centralized primitives.
An example would be liquidating the collateral used to issue an ETH future payable at block X into any combination of ETH and ETH futures payable before block X. As long as some combination of ETH and ETH futures payable before block X can be bought to hedge the exposure of the contract, the collateral need not be liquidated. Similar to the obligation engine, because acceptable hedges can be derived exclusively from loan terms, the composition of abstract return values from loan terms into acceptable hedges can be implemented once and used forever.
Instead of placing a single stop-loss, loan contracts can place sets of stop-losses that each hedge the loan exposure through a different set of assets. Liquidation then becomes a greatest common denominator problem in which the loan only needs to be liquidated if every set of hedges would fail. Furthermore, the iterative nature of such a system, whereby every piece of an obligation set can itself be hedged through sets of stop-losses, makes network costs the only theoretical limit to capital efficiency.
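A sketch of that liquidation rule under the same illustrative assumptions (the hedge sets and prices are invented):

```python
# Sketch of the liquidation rule described above (hedge sets and prices are
# invented): collateral is only liquidated if every set of hedges would fail.
from typing import Dict, List

def hedge_set_succeeds(hedge_set: List[dict], prices: Dict[str, float]) -> bool:
    """A hedge set succeeds if every leg can currently be bought within its budget."""
    return all(
        prices.get(leg["token"], float("inf")) * leg["amount"] <= leg["budget"]
        for leg in hedge_set
    )

def must_liquidate(hedge_sets: List[List[dict]], prices: Dict[str, float]) -> bool:
    """Liquidate only when no set of hedges can cover the exposure."""
    return not any(hedge_set_succeeds(s, prices) for s in hedge_sets)

# Two alternative hedges for 1 ETH of exposure due at block X: buy ETH spot,
# or buy an ETH future that pays out before block X.
hedge_sets = [
    [{"token": "ETH", "amount": 1.0, "budget": 2100.0}],
    [{"token": "ETH-FUT-preX", "amount": 1.0, "budget": 2050.0}],
]
prices = {"ETH": 2200.0, "ETH-FUT-preX": 2040.0}
print(must_liquidate(hedge_sets, prices))  # False: the futures hedge still works
```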
While its potential is exciting, almost none of the DeFi stack described above is ready for showtime. In the grand scheme of things, this is not so bad. Neither are scalability solutions, zero-knowledge proofs, decentralized identity systems, indexing protocols and other necessary pieces of an entirely censorship resistant tech stack that can facilitate user experiences competitive with current centralized ones. While there is massive potential for the DeFi stack to revolutionize finance, there is also a massive amount of work to be done. With a properly designed architecture that maximizes composability and minimizes the code that needs to be rewritten, DeFi can avoid being the LCD to adoption of a trustless tech stack.
“You are what your record says you are.” - Bill Parcells
Tokens can be almost anything, which makes accurate generalizations about them quite difficult. Despite (or, perhaps because of) their flexibility, there is little understanding of the proper role of tokens in the development and maintenance of trustless networks. There used to be much discussion about where value would accrue within the censorship resistant tech stack (i.e., fat/skinny protocols), how best to fund the development of censorship resistant tech, and the role of tokens in aligning value capture with the funding of development and maintenance. Today, there is much data from the past five years of experimentation to complement that theoretical discussion, but it has yet to be integrated into protocol designs and funding models. Even though the need for a token is questionable at best and the financial performance of almost every token to date has been horrendous, tokens remain the most popular way to align incentives between investors and developers.
The source of value of tokens can be classified (imperfectly) into monetary (i.e., store of value, medium of exchange, unit of account) and financial (e.g., equity, debt) components. Whereas the financial value of assets derives from some future expected payment valued in money, monetary value derives from the properties of the asset itself. Tokens are so flexible that their value may derive from a mix of monetary and financial properties. The fact that financial assets can be used as monetary assets (e.g., bank deposit slips) does not help to simplify the issue.
Another helpful, albeit imperfect, axis on which to classify tokens is collateralization: collateralized tokens derive their value exogenously (i.e., from demand for other tokens or off-chain assets), whereas non-collateralized tokens derive their value endogenously (i.e., from demand for them or what they entitle holders to). Every token that can exist must be some combination of exogenous and endogenous value. Because exogenous value is derived from other tokens that are themselves combinations of exogenous and endogenous value, value can be reduced entirely to endogenous monetary and financial value.
Despite the clear importance of endogenous value tokens, there remains little clarity on how protocols can, and whether they should, create endogenous value for a token. The lack of clarity on the topic is having a negative effect: the leading DeFi protocols almost all currently employ useless (or worse) tokens and some are even publishing code under restrictive licenses.
Developers thinking about designing protocols that issue endogenous value tokens should have a clear understanding of which sources of value will contribute to the value of the tokens.
“If it ain't about the money // Ain't no use in you ringin' my line // stop wastin' my time” - T.I., About The Money, 2014
At the risk of transgressing the limit of this piece to thoughts on DeFi, we comment briefly on decentralized money, only to emphasize that in almost every case protocols shouldn't issue their own money.
Money is the most obvious use for a censorship resistant tech stack. The utility of money is a function of its trustlessness, carrying costs, stability and liquidity. Each component of utility demonstrates strong network effects. Therefore, it is not surprising that monetary networks are winner-take-most in practice. This makes the issuance of tokens designed to function as money a highly competitive market.
To date, only two types of money have differentiated themselves on trustless computer networks: a money optimized to secure the base layer (e.g., BTC, ETH) and a money optimized to facilitate economic activity on top of the base layer (e.g., stablecoins). Currently, there is a fork in design optimization between network security and network utility monies around price stability. Whereas the former have thus far been designed to increase in price, the latter are designed to minimize price volatility. To date, the difference in design optimization makes sense: networks are secured by a single endogenous value token. The higher the value tokens used to secure the network, the more secure it is. However, because stability is one of the key value drivers of currency utility, there has been demand for another type of endogenous value trustless money with stable value.
A censorship resistant network is only secure if the endogenous value used to secure it is greater than the endogenous value it secures. Therefore, a sufficiently generalized network that allows staking any censorship resistant endogenous value token towards network security is likely to outcompete networks that do not. Censorship resistant computer networks will compete to attract economic activity exclusively on capabilities, costs and censorship resistance. In theory, the base layer protocols of censorship resistant networks need not be the issuers of the endogenous value token that ultimately creates the majority of value used in network security, but they have a clear first mover advantage and an edge over competitors in the form of demand to pay for use of the network.
It is theoretically feasible to combine network security and network utility monies into a single money. Given the strong network effects around monetary networks and the lack of technical friction to adopting a given trustless money, it is not unreasonable to assume they may be combined in the future. However, this boils down to whether one money can serve as the best store of value, best medium of exchange and best unit of account at once (which are monetary, not technical, requirements). Thoughts on that question do not belong in this piece. Nonetheless, there is at most room for two endogenous value monies in a given trustless network, making it exceptionally difficult for protocols to issue tokens intended to accrue endogenous value as money. Money issued by protocols that are not explicitly designed to issue money is unlikely to compete with money issued by protocols that are.
“We can do everything straight and still go broke.” - Porter Collins, The Big Short, 2015
One complaint frequently leveled at Ethereum (and other networks that didn't start as money-first networks) is that the security of the network depends on the value of Ether, which itself derives value from use of Ethereum. Allegedly, the circularity of that value cycle jeopardizes the security, and therefore the integrity, of the network. Such criticisms ignore two key facts: (1) any censorship resistant network that secures value must itself be secured by censorship resistant endogenous value greater than the value secured on it, and (2) endogenous value tokens can derive value from any combination of financial and monetary features, which can evolve over time.
In order to competitively facilitate economic activity with centralized solutions, censorship resistant computer networks need sources of censorship resistant endogenous value. Creating endogenous value is a chicken and egg problem, and there is no better source of eggs than payment for use of secure networks. Networks that do not create censorship resistant endogenous value tokens cannot solve the chicken and egg problem of bootstrapping network security because they can never secure other value.
However, value that comes from future expected payments is financial value, not monetary value. As John Pfeffer so eloquently put it in ‘An (Institutional) Investor’s Take on Cryptoassets’, “…at mature equilibrium, tokens will do no more than allocate computing resource, with the exception of the special case of a cryptoasset that serves as a monetary store of value.” This can be generalized to say that the endogenous financial value component of tokens will erode to 0, while monetary value is potentially sustainable. In the case of Ethereum, it means that demand for Ether to pay for gas will trend to 0 independent of demand for using Ethereum. It is therefore no surprise that there has been a major focus recently on the monetary policy of Ether.
In summary, competing networks are likely to allow any endogenous value to be staked toward network security. Endogenous value can come from any combination of monetary and financial value, but the financial value extractable by open source protocols trends towards zero over time. Networks that require consensus can, but need not, bootstrap security by issuing an endogenous value token of their own, but over time must be secured by endogenous value that is created separately from demand for use of the network layer. Endogenous value tokens that are not intended to be money will trend towards 0.
“Do. Or do not. There is no try.” - Yoda, The Empire Strikes Back, 1980
Nonetheless, the majority of DeFi (and other) protocols still issue endogenous value tokens. In doing so, they impose a requirement on the protocol to create monetary value (hard) or extract financial value (easy, but very misaligned with open source development). While there are infinite ways to create financial value, it cannot be captured by a token issued by an open source protocol in any sustainable capacity. Two popular attempts at doing so, via fees and governance rights, are particularly indicative of the misguided development ethos in DeFi and to a large extent permissionless development at large.
Imposing fees for using open source code is oxymoronic: fees at the protocol level in excess of costs equate to licensing fees. While objectively harming the relative utility of the protocol by imposing unnecessary costs on users, tokens that accrue some form of fee for use of the protocol that issues them (e.g., CRV, LQTY) have proven to be the most successful way to compensate developers for development. In practice, developers sell fee tokens, which are often not even liquid until the protocol launches, to investors to fund development. Unless the token is intended to become money, it is unlikely to retain value for any extended period of time. The early sale of fee tokens, which are often obfuscated behind layers of “tokenomics” and become liquid at cliffs, enables the dumping-on-retail phenomenon that has become common despite its ethical abyss. Fee tokens do not contribute to the utility of the protocol. The catch-22 is that the way DeFi will compete with CeFi is with lower fees, but if every part of the financial stack imposes fees, it will take a long time for on-chain costs to compete with off-chain costs. DeFi is supposed to remove the middleman from financial transactions, not replace it. Open source protocols cannot continue to be funded through the issuance of fee tokens.
Governance of permissionless technology is also almost an oxymoron: there appear to be very limited instances where permissionless protocols require governance at the protocol layer. Unless there is a need to maintain consensus, the only reason to attempt to prevent forks is to extract the maximum amount of fees from users. In instances where maintaining consensus is necessary, governance tokens can codify strong incentive alignment between misaligned parties due to the collective benefits to all token holders associated with good governance. In instances where networks require endogenous value to be staked to create security, the endogenous value need not be created by the network itself. Therefore, unless the network is attempting to issue endogenous value money, if a token is necessary to coordinate governance, it should be an exogenous value token that is collateralized by endogenous value. Because there can be no expectation of continuous financial value extraction by open source networks, governance tokens should be used exclusively to minimize network fragmentation.
Open source protocols should never be designed to extract financial value. Therefore, open source protocols that are not issuing tokens intended to be (or become) money should never issue an endogenous value token, leaving exogenous value tokens or no token as the acceptable options. However, there is no value extractable through the collateralization of other tokens: endogenous value is value created, whereas exogenous value is value wrapped. Therefore, unless the goal is to issue money, there is (1) no reason for open source protocols to issue a token and (2) no direct financial incentive to develop open source protocols that become widely used. Herein lies the crisis.
The results of this dilemma are painfully apparent. No one with a financial orientation wants to develop, or fund the development of, tech that they cannot extract rent from. Perhaps this is why DeFi in particular is lagging behind other pieces of the trustless tech stack. The solution currently popular amongst investors and developers, dumping fee tokens on retail (who are also paying the unnecessary fees), is ugly at best. A new path for the development of censorship resistant protocols that better aligns value creation and value capture than dumping on retail is needed.
“Roads?! Where we’re going we don’t need roads!” - Doc, Back to the Future, 1985
In order to best fund censorship resistant protocol development, a clear distinction must be made between the role of open source protocols and companies in a trustless economy. The financial value capture of companies that develop open source protocols cannot happen at the protocol layer. Whereas open source DeFi protocols should facilitate arbitrary P2P trades and loans as efficiently as possible, companies should be using them to create best-in-class user experiences for customers. Whereas open source DeFi protocols should never extract financial value, companies can create financial value without extracting it. The speed of development of the DeFi stack has been and continues to be greatly hampered by a lack of consensus on how to fund companies that adhere to these principles at the cost of the ability to collect rent.
A trustless execution environment with a tech stack that extracts no rent and is competitive with centralized solutions in the user experiences it can create must also make it possible for businesses building on that trustless stack to be run as competitively as companies building on centralized ones. This assumes a network-native trustless money competitive with centralized money, but given the radically competitive environment around the creation of trustless money, it is hard to imagine that will not come with time.
However, at the moment, there is not a single type of real economic activity that can be performed in a trustless fashion with competitive UX compared to centralized alternatives (not even transfers). The tragedy of the commons is that in order for it to become possible for any business building on a trustless tech stack to compete with centralized alternatives there is a lot of work to be done. However, once it is done everyone will be able to share equally in the fruits of the work, independent of contribution.
In practice, however, companies contributing to the open source, composable, trustless tech stack have a clear first mover advantage in using the technology to create amazing user experiences that attract customers away from centralized alternatives. Companies making money by creating value for customers through P2P (or B2B or B2P) activity in a highly competitive environment epitomize positive sum economic activity and the types of business models that experience sustainable network effects.
The use of open source technology, and the company's own contributions to the open source tech stack, will enable companies to offer a more scalable, reliable and cost effective product or service, which will enhance the network effects around company loyalty. Companies that demonstrate an ability to consistently create value for users on a trustless tech stack will dominate current competitors. The development costs of new features in a composable tech stack are a fraction of what is currently spent on closed source systems. The more a company contributes to the open source tech stack, the more opportunities to generate alpha it creates for itself.
The biggest problem is the chasm between the current functionality of trustless tech and the functionality necessary to facilitate an MVP: any real economic activity that can be competitively facilitated on a totally trustless tech stack. Once on the other side, there will be no doubt about the attractiveness of investing in businesses that have the potential to dominate centralized competitors by leveraging a trustless tech stack. However, the companies likely to succeed in that environment are the ones that employ the people who built, and continue to build, the open source protocols they use, because those people will be able to use the protocols to their full potential to outcompete others in alpha generation.
The time is ripe to invest in DeFi companies building open source protocols that demonstrate an understanding of (1) the role of DeFi in the bigger picture, (2) the technology necessary to facilitate trustless user experiences that dramatically outcompete trusted ones because of, not in spite of, the execution environment, (3) the pieces of the puzzle the company will contribute to the stack, and (4) the way the company will leverage the protocols they help develop to create great experiences for users and financial value for itself. (4) is the only question that should be answered in private. Everything else should be actively discussed in the open. Companies that fail to satisfactorily answer (1), (2) or (3) should not be taken seriously by the public. Contributions to this discussion are just as important, if not more, than development of the protocols themselves. Companies that fail to satisfactorily answer (4) should not receive funding.
“‘How did you go bankrupt?’ Bill asked. ‘Two ways,’ Mike said. ‘Gradually and then suddenly.’” - Ernest Hemingway, The Sun Also Rises, 1926