Weekly Spotlight: Bitcoin and the ordinals dilemma

Executive Summary

– Bitcoin network usage has increased dramatically in the last two months due to adoption of the ordinals protocol to create Bitcoin-native NFTs.

– This usage has increased fees and is leading to a backlash among Bitcoin developers and community members.

– Our view: excitement around ordinals is likely approaching a natural peak on its own, and we hope that no protocol-level action is taken against them.

Bitcoin is currently in the throes of a crisis that, while not the oddest in its litany, certainly ranks up there. The pithy summary is that the crisis is that the chain is finally being used; after years of bumping against its capacity limits, and finding that fee markets made those limits something of a self-solving problem, the advent of ordinal inscriptions has kickstarted a sudden new period of development on the network that has driven up block sizes, driven up demand, and therefore driven up fees dramatically. Fees per transaction on Monday averaged over $30, some 15 times the recent norm and not far off the peaks seen at the height of the 2017 and 2021 bull runs:

Via Blockchain.com.

To properly understand what is going on here, we need to revisit a few components that have likely faded from the memories of many, given how long it has been since the state of the Bitcoin chain was worth thinking about even for those deep in the industry.

Like all blockchains (at least those being honest about both their throughput and their decentralisation), Bitcoin has a fixed maximum throughput; the figure commonly cited is 7 TPS (transactions per second). This has historically been closer to a theoretical maximum than to real-world utilisation, because a more complex transaction takes up more space within a block (roughly 2 megabytes in practice since SegWit) than a simple one, and realised throughput has consistently sat between 2 and 5 TPS:

Via Blockchain.com.
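As a rough back-of-envelope check on where the commonly cited ~7 TPS ceiling comes from, the sketch below uses our own illustrative assumptions for block interval, transaction size and usable block space, not figures from the chart:

```python
# Back-of-envelope Bitcoin throughput estimate (illustrative assumptions only).

BLOCK_INTERVAL_SECONDS = 600        # blocks arrive roughly every 10 minutes
AVG_TX_SIZE_BYTES = 250             # rough size of a typical simple transaction
BASE_BLOCK_SPACE_BYTES = 1_000_000  # ~1 MB of base block space available to such transactions

txs_per_block = BASE_BLOCK_SPACE_BYTES / AVG_TX_SIZE_BYTES
tps = txs_per_block / BLOCK_INTERVAL_SECONDS

print(f"~{txs_per_block:.0f} transactions per block, ~{tps:.1f} TPS")
# -> ~4000 transactions per block, ~6.7 TPS, in line with the ~7 TPS figure
```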

Notice that since mid-2019, despite the price heading up and Bitcoin becoming an established asset, realised throughput has trended downwards. There are a few things going on here – smarter chain usage by exchanges and crypto-native companies (e.g. batching transactions), expansion of the Lightning Network, smaller-scale speculative activity shifting overwhelmingly onto Ethereum and then onto other chains beyond that – but it has all added up to Bitcoin shifting from the do-it-all blockchain of its early days to one with a far more focused identity around the digital-gold and currency-of-last-resort narratives.

The grand tonal shift from “you can buy coffee with this!” to “of course you’re not supposed to buy coffee with this!” has worked out quite well for the Bitcoin community. It has created this odd situation where the chain itself is the equivalent of a heritage railway – it’s there for advocates to pull up, point to its beautiful provenance and give you a little ride-along, but all with the full knowledge that it only really works out because it’s simultaneously revered and underutilised.

The problem now presenting itself to Bitcoin is two-fold. First, the technical developments. While the fact that Bitcoin has maintained its low block sizes and overall simplicity has – justifiably – been a point of pride for developers, the protocol has continued to evolve, and saw its biggest step forward since SegWit in 2017 with the implementation of the Taproot upgrade in late 2021, which, among other things, enhanced smart-contract functionality on the network.

Despite grand promises about how Taproot would change everything for Bitcoin, how DeFi on Bitcoin was the wave of the future, and so on, very little changed for a long time after Taproot came in. Ultimately, building sufficiently expressive scripting that interacts directly with the chain still isn’t really possible for most DeFi applications even after Taproot, so ‘Bitcoin DeFi’ was always something of a non-starter in that sense; remember that even Ethereum, with its far more frequent blocks and an effective TPS closer to 30, has consistently found itself wanting for capacity.

This is where ordinals come in. Ordinals, in essence, use individual satoshis (the lowest transferable unit of Bitcoin – one hundred-millionth of a bitcoin, or around three-hundredths of a cent at current prices) and Taproot scripting to create a metadata inscription of an asset on-chain; the inscribed satoshi then essentially functions as a non-fungible token from there on out.
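For the technically curious, a minimal sketch of what an inscription ‘envelope’ looks like, assuming the ord-style layout of an OP_FALSE/OP_IF block carried inside a Taproot script; the exact tag bytes and push encodings are simplified here for illustration:

```python
# Simplified sketch of an ordinals-style inscription envelope.
# The real protocol embeds this inside a Taproot script-path spend; tag and
# push encodings here are illustrative rather than byte-exact.

OP_FALSE, OP_IF, OP_ENDIF = bytes([0x00]), bytes([0x63]), bytes([0x68])

def push(data: bytes) -> bytes:
    """Minimal single-byte data push (payloads under 76 bytes only)."""
    assert len(data) < 76, "larger payloads need OP_PUSHDATA encodings"
    return bytes([len(data)]) + data

def inscription_envelope(content_type: bytes, body: bytes) -> bytes:
    return (
        OP_FALSE + OP_IF           # the branch never executes, so it only carries data
        + push(b"ord")             # protocol identifier
        + push(b"\x01")            # tag 1: content type follows
        + push(content_type)
        + OP_FALSE                 # empty push marks the start of the body
        + push(body)
        + OP_ENDIF
    )

script = inscription_envelope(b"text/plain;charset=utf-8", b"Hello, ordinals")
print(script.hex())
```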

The idea is, in some sense, not new. Prior to Ethereum’s inception, there were attempts to build protocols for new tokens and the like on-chain, typically through the OP_RETURN opcode. OP_RETURN allows one to store arbitrary data on-chain for the cost of a transaction; the limit for this data was 10,000 bytes prior to 2013, but it was effectively reduced to 40 bytes in March 2014 with Bitcoin Core 0.9.0 (and raised back up to 80 in 2016) to prevent perceived abuse. This hostility – driven by concerns over block size (large blocks making it harder to run nodes and therefore pushing the network towards centralisation) – was part of the reason that focus would shift to Ethereum as the smart-contract network in the years that followed.
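For reference, such a data-carrying output is just OP_RETURN followed by a data push, as in the minimal sketch below (the payload is made up; the ~80-byte cap is a default relay-policy limit rather than a consensus rule):

```python
# Minimal OP_RETURN output script: the OP_RETURN opcode followed by a data push.
# The ~80-byte limit is default relay policy (standardness), not consensus.

OP_RETURN = 0x6a

def op_return_script(payload: bytes) -> bytes:
    assert len(payload) <= 75, "this sketch only handles single-byte length pushes"
    return bytes([OP_RETURN, len(payload)]) + payload

script = op_return_script(b"hello world")  # arbitrary example payload
print(script.hex())                        # 6a0b68656c6c6f20776f726c64
```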

Ordinals, as it turns out, would create a perfect storm of attention. There are a few factors here. The first thing to reiterate is that much of this became possible because transaction costs on Bitcoin are – or at least were – so much lower than on Ethereum over the last couple of years (having been higher prior to 2020, and having maintained a pseudo-parity relationship through 2020 and 2021) that it became sufficiently attractive to at least experiment with. To be clear, we are not talking economically efficient amounts here – ordinal inscriptions cost anywhere between a few dozen dollars and a couple of hundred even in a low-fee environment, even if trading them afterwards was much cheaper – but low enough for crypto natives to play around with nonetheless.

The second, as we alluded to, is that ordinals are for all intents and purposes NFTs; and, as anyone at all familiar with the last few years of activity on Ethereum and alternative layer-1s will know, NFTs are the perfect way to kickstart activity, because they have few technical dependencies and just high enough a barrier to entry to rope in the typical early adopters of any crypto craze. Other factors include the rapid pace of technical development around wallets and marketplaces (aided by developers coming over from other layer-1s).

In any case, ordinals have gradually taken off, and we are beginning to see other developments come through. Most notably, the ordinals system has now been used to create BRC-20 tokens. The name is a direct nod to ERC-20s, the dominant standard for fungible tokens on Ethereum; in essence, BRC-20s use ordinal inscriptions to write token metadata on-chain, which compatible wallets and indexers then read to reconstruct token balances.
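As an illustration of how lightweight the standard is, BRC-20 operations are small JSON text inscriptions along the lines of the sketch below (the ticker and limits are made up for the example, and it is off-chain indexers, not the chain itself, that enforce the accounting):

```python
import json

# Hypothetical BRC-20 inscription payloads (illustrative values only).
# Each is inscribed as plain JSON text; balances are tracked by off-chain indexers.

deploy = {
    "p": "brc-20",      # protocol identifier
    "op": "deploy",     # operation: deploy / mint / transfer
    "tick": "test",     # four-character ticker (made up)
    "max": "21000000",  # maximum supply
    "lim": "1000",      # per-mint limit
}

mint = {"p": "brc-20", "op": "mint", "tick": "test", "amt": "1000"}

print(json.dumps(deploy))
print(json.dumps(mint))
```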

To be clear: BRC-20s are likely to end up as little more than a gimmick, though notably exchanges are starting to support them (e.g. Gate.io yesterday announced listings of five BRC-20s – PEPE (not to be confused with the ERC-20 version), BANK, MEME, VMPX, and PIZA). However, the result of all of this has been the greatest swell in on-chain activity on Bitcoin that we have seen in years.

What’s the problem? Part of it is block size, though inscriptions still have to adhere to strict enough limits that this has not been a huge problem in practice; the 1-day average block size has so far peaked at around 2.5MB, where it had previously tended to peak at around 1.5MB. It cannot be overstated how much of a sore spot block size remains for the Bitcoin community after the ‘block size wars’ leading up to 2017 – and the concern that higher block-size limits would lead to unlimited blocks and outright centralisation was largely proven right by the trajectory of the Bitcoin forks post-2017 – so that does need to be borne in mind here.

The bigger problem is that the demands being put on the network and on the mempool (i.e. the queue of transactions waiting to be confirmed on-chain) are economically catastrophic. The mempool is currently congested to a degree previously only seen at the tops of bull markets:

Via Jochen-hoenicke.de.

What this means in practice is that tens of thousands of transactions are stuck waiting for confirmation, and that anyone wanting a transaction to go through in a reasonable length of time has to pay a significant fee – as mentioned, in the range of $20-$50 at peak times and around $8 off-peak, after fees had sat consistently at a dollar or less over the last couple of years.
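For a sense of the arithmetic, a dollar fee is simply the fee rate bid in the mempool multiplied by the transaction’s size; the fee rates, transaction size and spot price in the sketch below are illustrative assumptions rather than measured figures:

```python
# Rough fee arithmetic for a simple Bitcoin payment (illustrative assumptions only).

BTC_USD = 28_000          # assumed spot price
TX_VSIZE_VBYTES = 140     # rough size of a simple one-input, two-output SegWit payment

def fee_usd(fee_rate_sat_per_vb: float) -> float:
    fee_sats = fee_rate_sat_per_vb * TX_VSIZE_VBYTES
    return fee_sats / 1e8 * BTC_USD

for label, rate in [("quiet market", 10), ("off-peak congestion", 150), ("peak congestion", 500)]:
    print(f"{label}: ~{rate} sat/vB -> ${fee_usd(rate):.2f}")
# quiet market: ~10 sat/vB -> $0.39
# off-peak congestion: ~150 sat/vB -> $5.88
# peak congestion: ~500 sat/vB -> $19.60
```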

While high fees may not be a disaster for ‘digital gold’ or a ‘currency of last resort’, they are for almost anything else, and the situation has become embarrassing for advocates, particularly because the narrative has swung some way back from ‘gold’ towards ‘coffee’ (i.e. payments) over the last year or two. One of the biggest PR wins for Bitcoin has been El Salvador under Nayib Bukele full-throatedly backing the asset; unsurprisingly, the current level of transaction fees makes the asset unusable in a country where the average salary is around $400. A particularly cynical observer might argue that this speaks more to a failure of the Lightning Network, and that high base-layer fees are in the long run unavoidable anyway, but it has ruffled feathers whatever the case.

This is all setting the stage for one thing – conflict. As in 2014, Bitcoin Core developers are overwhelmingly hostile to ordinals and the development around them, and public discussions have begun about taking action to essentially excise them from the chain on the grounds that they constitute a denial-of-service attack rather than legitimate usage.

In our view, this would be an absolute disaster for all involved, with unknowable repercussions, and it needs to be monitored carefully. We are far from having full faith in ordinal development and developers, and we concede that the transaction fees are ‘not a good look’, to borrow a phrase. At the same time, there has been increasing discussion of the long-term viability of Bitcoin without an overall increase in fee revenue to miners (the ‘security budget’ issue); viewed through that lens, the revival of chain-interactive development on Bitcoin is an opportunity, not a challenge.

For now, the show goes on. The fate of the BRC-20s – which nominally have a combined market cap of over $1bn, according to some accounts – will be important to watch over the coming weeks. We tend to think that their emergence is another indication that we are likely closer to the end of the recent mini-bull market than to its beginning; but even if this feels closer to 2019 than to 2021 or 2017 in terms of the upsurge in sentiment and genuine enthusiasm, the increase in certain usage metrics cannot be ignored entirely.
