Polkadot Roundup MMXXIV
Dive into Polkadot Roundup MMXXIV for an in-depth look at 2024 milestones, including high-performance rollups, decentralised governance, and exciting expansions. Discover how Polkadot is evolving into a next-level Web3 platform, plus a sneak peek at Polkadot 2, Polkadot 3, and the JAM protocol supercomputer.
It’s that time again… I put away my text editor for a day or five and crack open Medium. The nights are cold, the days are short, the fire is lit and it’s time for a retrospective on our ecosystem. Where are we in our efforts? What have we achieved over the last 12 months? And what’s happening next? Fasten your seatbelts and get ready for a long one this year as we’ve been busy and we’re hungry for more.
Hello… Christmas smoothie…
Before we begin the round-up, let’s put this year into perspective. Polkadot’s journey began as a white paper (2016) and crowd sale (2017). In quite a real sense this dictated much of its underlying direction and product ethos in the subsequent years; delivery on the white paper’s product vision commanded our focus. For several reasons and in several ways, 2023/2024 was a watershed: the period in which this very concretely shifted.
This year thus marks an important transition for Polkadot: from delivering on the narrow product scope expressed in a white paper, to optimising, stabilising and finessing the product in line with the needs of our market, helped along with data collection and analysis from Parity’s DOT Lake and Token Terminal; and finally to setting out a framework for developing a much broader and more relevant Web3 service platform.
I’ll do the buttons, you do the answers
Polkadot’s main value proposition is providing oodles of high-quality well-connected block-space, known in Polkadot as coretime. The main way this is used at present is by hosting high-performance blockchains, hitherto called parachains. And over the past year or so we have finally seen a first taste of such projects making use of Polkadot’s extreme levels of performance by importing and serving large non-crypto user-bases. One of the more visible examples of this is Mythical Games launching the Mythos chain providing, amongst other services, in-game assets as tradable on-chain NFTs to major mainstream games.
Mythos hosts the assets of several games, not least NFL Rivals, and brings with it nearly a million active wallets and a user-base of five million gamers. Even taken alone, it is second in the entire industry for NFT transactions and demonstrates the ability of Polkadot’s technology to rise effortlessly to meet demand. Expect more such usage to follow as Mythical releases the insanely popular FIFA Rivals next year and, true to the permissionless ethos, opens their chain for games industry-wide to utilise.
Mythical does not stand alone in its conviction that Polkadot is the Web3 platform of choice for high-performance, high-resilience gaming uses. DOTplay launched this year as a resource to help teams building gaming-related projects on Polkadot get ahead, and one of the first projects it has helped is Ajuna, who bring Polkadot integration into the most popular game framework, Unity.
Indeed, transaction volume across parachains is up around 300% over the course of the year, increasing from a little over 10M transactions a month to almost 40M in November. Event volume, an alternative metric of on-chain activity, paints a similar picture. Neuroweb (by Origin Trail) has led the pack of late with 200M events and 14M transactions last month alone, showcasing Polkadot’s ability to manage the volume needed for a global supply-chain tracker.
Similarly, Frequency (a key component for Frank McCourt’s Project Liberty initiative), went live with 100M users this year and now processes around 10M transactions/month. Phala, Litentry and Mythos all contributed substantial portions of Polkadot’s overall transaction volume as it has surged to 40M/month.
As a recent Electric Capital report pointed out, the Polkadot SDK still sits in the top three blockchain tech stacks measured by apparent open-source developer activity. Of course this doesn’t take into account all of the Polkadot ecosystem development, such as EVM-based coding on Moonbeam and Phala, or the closed-source JAM implementations, whose additional 100+ developers might reasonably be expected to raise Polkadot’s ranking somewhat.
You can’t trust people
In our mission to deliver on a truly Web3 vision, decentralisation has always been a key mantra for Polkadot as a means of delivering resilience. While we have had an impressive Nakamoto coefficient since launch, 2024 saw this improve from 93 at the start of the year to 132 now, making ours the first major network to achieve a rating over 100 and giving it four times the level of decentralisation of the next-ranked blockchain. For reference, most PoS blockchains do not break 20 (Solana’s actually fell from 32 to just 18), Polygon and Bitcoin can each be broken by just four collaborating entities and Ethereum sits at the bottom of the pack at a meagre two.
Perhaps it should not come as too much of a surprise that some crypto-projects, and especially those indebted to venture capital funds, care more about scaling up (not out), chasing hype and hawking coins and thus consider long-term resilience, governance and decentralisation as largely superfluous.
This last year brought with it some important changes for Parity: by late 2023 the Web3 Foundation had made a strategic pivot to reverse the accumulation of functions and, accordingly, personnel. Instead it would take a leaner, more focussed model emphasising the delivery of core technology, important tooling and metrics, leaving other functions such as content, business development and applications to the community. The "decentralisation" initiative was born in order to enact this strategy with minimal disruption to the project and to give those community members who could step up maximal opportunity. Web3 Foundation’s $60m Decentralised Futures programme began in earnest during the first half of 2024.
Numerous individuals stepped up within the ecosystem and we now see a growing number of independent, successful teams delivering valuable products and services within the Polkadot ecosystem. One notable example is WebZero, an events group who successfully took on the Sub0 developer conference brand and launched a new-look, new-feel and extremely compelling Reset conference in Bangkok during November.
Sub0 Reset was far from the only impressive event of the year; we began with another well-attended Sub0 (also in Bangkok). The flagship Decoded event returned this summer in Brussels and went off with a bang. 2024 was also the year when Web3 Summit returned after a five-year hiatus and it did not disappoint, retaining its former (somewhat rebellious) vibe, not to mention its venerable venue, Funkhaus of Berlin.
This was also the year of the first Gray Paper Tour, a university-based lecture series given by me on the JAM protocol and in particular its formal spec, the Gray Paper. Eight venues across four continents were covered including a marathon seven-hour lecture done in conjunction with the Singapore Polkadot Blockchain Academy at the NUS.
Indeed, another successful Parity spinout completed this year is the Polkadot Blockchain Academy, our academic-grade, in-person, ground-up course taught by some of the best and brightest of the Polkadot project. The PBA course has, over the year, been successfully refocussing somewhat away from the more specialised Polkadot implementation specifics and into the realm of smart contracts and governance, working increasingly with the BlockchainGov collective. Now wholly an independent project, it has launched PBA-X under its own wing in order to bring the excellent teachings of the PBA proper online and offer them to a much broader audience.
People like Coldplay…
Parity’s transformation was not the only demonstration of decentralisation: Polkadot’s governance system continued its status as the largest and most sophisticated DAO ever created. Rather than relying on unclear decision-making systems or, worse, defending centralised decision-making and development, we can take pride in being the only major network to make all of our protocol decisions in a transparent, accessible and resilient manner. (And consistently delivering major updates to boot.)
Indeed, beyond all of the day-to-day and month-to-month upgrade plans passed through governance, 2024 was the year in which DOT holders could steer core developers in a far more direct and proactive manner with the newly created Wish For Change governance track. One remarkable proposal was referendum #682, which officially named JAM as the technology to replace the Relay-chain in due course and which passed with 99.998% (and all but four accounts) voting in favour.
Managed by our homegrown OpenGov, around $120M in assets are under the control of the assembled DOT stakeholders, with about the same amount spent over the year. A staggering 1,350 referenda have been initiated under OpenGov, and with an emphasis on vote delegation we have seen an increase both in the average amount of DOT used in deciding a proposal and in the average conviction. More people are locking up more of their DOT to have their voice heard in our network’s decisions. You can find more information in the year-end report compiled by Parity’s DotLake team.
One of the biggest democratic events in the year on Polkadot was the DOT issuance vote. At launch, DOT had a 10% year-on-year token issuance rate. This funded rewards for stakers (who are the network’s backbone of security), with a variable amount also being sent to the treasury to fund activity in the ecosystem which token holders consider valuable. This year saw issuance reduce to a fixed rate of 120M DOT per year, down from the more than 150M DOT which 2025 would otherwise have issued. Around 100M DOT will be used for staking rewards, with the rest funding the annual treasury budget. With issuance now fixed in absolute terms, the effective inflation rate will fall over time as the total supply grows: DOT is now disinflationary.
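For the numerically inclined, here is a back-of-the-envelope sketch (not an official economic model) of why a fixed issuance is disinflationary; the ~1.5B DOT starting supply is simply inferred from the figures above and should be treated as an assumption.

```rust
// Back-of-the-envelope illustration: with a fixed 120M DOT/year issuance,
// the inflation *rate* falls as the supply grows. The ~1.5B DOT starting
// supply is an assumption inferred from the text (10% of it being the
// ~150M DOT which 2025 would otherwise have issued).
fn main() {
    let fixed_issuance = 120_000_000.0_f64; // DOT minted per year under the new rule
    let mut supply = 1_500_000_000.0_f64;   // assumed supply entering 2025

    for year in 2025..2030 {
        let inflation_pct = 100.0 * fixed_issuance / supply;
        println!(
            "{year}: supply ~{:.0}M DOT, effective inflation ~{:.2}%",
            supply / 1_000_000.0,
            inflation_pct
        );
        supply += fixed_issuance; // ignores burns and slashes for simplicity
    }
}
```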
The Polkadot (Core) Fellowship, the main body of expertise over the Polkadot protocol, is now pushing 100 members, having doubled since it was launched mid last year. Collectives such as the Fellowship now have access to their own treasury to pay for projects and expenses, and over the course of this year we saw one of the things I was personally most excited about finally coming to fruition: multi-asset treasuries, in essence the ability to hold foreign reserves as part of the network’s treasury (or indeed any sub-treasury). The Polkadot Network now holds not only DOT but significant amounts of USDC and USDT (not to mention DED😂), routinely uses Hydration to trade currencies autonomously, and typically settles treasury spending proposals in stable-coins. We are now ready to have a fully autonomous Polkadot sovereign wealth fund. To boot, the Fellowship now maintains an autonomous monthly payroll with salaries paid out in USDT.
We have seen the first follow-up "fellowship", aimed at the Polkadot ambassadors, coming to life. With its own manifesto, collaboratively edited and openly discussed, it could perhaps be a template for future such governance bodies. And already new bodies are being mooted: a user-interface fellowship and a secretaries fellowship. Combine these with their own treasuries, budgets and payrolls, and we can truly begin to see the seeds of a sophisticated, transparent, autonomous civil service.
Anything’s possible
Like a phantasm in an ethereal plane, it can sometimes be rather inconvenient to be an autonomous phenomenon existing in the decentralised world: interacting with the regular, meat-and-bones, centralised world is fraught with difficulty. Mundane things like paying some rent or taking out a service subscription become difficult unless you cut corners and have your founding company do it for you. Because of this, 2024 saw the creation of a new kind of entity: the PCF.
Incorporated in the Cayman Islands, with its activities directed purely by DOT holders via OpenGov, the PCF is a centralised-world adapter for OpenGov to execute tasks such as signing commercial contracts, making fiat payments, enforcing intellectual property, and contracting third-party service providers like consultants. Unlike, say, the Web3 Foundation, the PCF has no assets, shareholders, members, trustees or beneficiaries whose interests or opinions might diverge from those of the token holders.
The PCF thus need not be a one-of-a-kind entity. Parity’s efforts in researching and designing the PCF’s structure form a template for other actually decentralised projects and will be made available to others in the ecosystem should they also have need for an adapter into the centralised world.
Knitting like an Electric Nan
While certain other networks attempt to introduce ever more stratospheric node requirements and complex optimisations in order to nudge up their (somewhat silly) TPS benchmarks, bringing their network decentralisation and resilience further into question, Polkadot (well, specifically, Kusama) demonstrated the true performance of a network by scaling out. By distributing workload across a network of consumer-level node hardware, we are limited neither by the computation which can be done on a single machine, nor by how fast each of the validators can share the results of all of the computation.
Organised by Amforc, paid for by the treasury and named The Spammening, this demonstration happened live on Kusama in December. Putting on quite the show (kudos Jay!), it showcased the extreme levels of performance our technology can provide in the wild on the Kusama network. In a first for a live, value-bearing, actually decentralised network we achieved a sustained rate in excess of 100,000 transactions per second (actually 143,343), using only around a quarter of Kusama’s cores, its computation resources. Optimisations remain, especially in networking, and as they arrive we’ll be revisiting (and lifting!) this figure, but our point is clear: even if you "only" want performance, Polkadot is your best bet. (You get top-of-the-class resilience, multi-chain connectivity and the modular blockchain flexibility as a sweetener!)
Plumbing’s just Lego, innit?
Over the last twelve months we saw Polkadot’s cross-chain interoperability increase manifold. Chains hosted by Polkadot can now interact trustlessly and arbitrarily with Kusama and other Substrate solo chains (via the Substrate Bridge), Ethereum (via the Snowbridge), each other and several other industry networks (via the Hyperbridge of Seun Lanlege, II Dan). By allowing XCM to be sent between these networks, Polkadot is finally realising the ambition set out in its vision paper. Unlike other bridges in which the assets transmitted over them are in the hands of a few (often unaccountable) interests, or those which support only specific hard-coded and tightly-scoped functionality, our bridges are all trustless and programmable.
In practice this means that the connectivity enjoyed by blockchains hosted on Polkadot with Kusama, Ethereum and others inherits the rich programmability present in XCM. Token transfers are merely the tip of the iceberg: exchanges can be executed (without the need for funding temporary accounts), data published, properties queried, fees paid, messages forwarded and smart-contracts called.
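To give a flavour of that programmability, here is a self-contained sketch of the shape of such a cross-chain program. It uses a simplified local enum rather than the real XCM crate types; the instruction names mirror real XCM instructions but the fields are pared down and purely illustrative.

```rust
// A simplified, self-contained sketch of a cross-chain XCM-style program.
// This is NOT the real XCM API: the enum below mirrors the names of real
// XCM instructions but strips their fields down for illustration only.
#[derive(Debug)]
enum Instruction {
    WithdrawAsset { amount: u128 },           // take funds on the destination chain
    BuyExecution { max_fee: u128 },           // pay for execution there
    ExchangeAsset { give: u128, want: u128 }, // swap on a DEX chain, no temporary account needed
    DepositAsset { beneficiary: [u8; 32] },   // leave the proceeds with the beneficiary
}

fn main() {
    // Beyond a plain transfer: withdraw, pay fees, exchange, then deposit,
    // all expressed as one message executed on the remote chain.
    let program = vec![
        Instruction::WithdrawAsset { amount: 10_000_000_000 },
        Instruction::BuyExecution { max_fee: 50_000_000 },
        Instruction::ExchangeAsset { give: 9_000_000_000, want: 25_000_000 },
        Instruction::DepositAsset { beneficiary: [0u8; 32] },
    ];
    for (i, instruction) in program.iter().enumerate() {
        println!("{i}: {instruction:?}");
    }
}
```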
XCM itself has gone through a major revision this year and is guided less by feature delivery and more by addressing the acute pain-points of the Polkadot user-base. Message relaying via the system-chains and fee-transparency have been prioritised to ensure a far more coherent and robust cross-chain interaction experience. We are already seeing the fruits of this in action as our wallet ecosystem provides impressively slick experiences increasingly allowing the user the luxury of forgetting that they are transacting across different chains.
Announced only just in time to make it into this write-up, Harbour Capital have collaborated with Polkadot’s Velocity Labs to release their Magic Ramp service on Polkadot. This provides a top-of-the-line experience for anyone wanting to move funds between the banking system and Polkadot, allowing direct trading between USDC and Euros in a bank account. At present it is active only in Europe; expect a wider rollout in 2025.
This combination of trustlessness, programmability, performance and newfound connectivity is powerful; with increasingly effective user-interfaces and service-chains like Polkadot Hub, Hydration and Polkadex putting Polkadot’s immense power to use, we have a unique opportunity to capitalise on being a cross-chain meeting point during 2025.
That app is really moreish
Indeed, 2024 has seen the overall user experience within Polkadot improve quite dramatically. Moving funds and NFTs between chains, managing multisigs and proxies, and engaging with governance via delegation are all becoming trivial. Increasingly we see wallets breaking away from exposing the multi-chain nature of the Polkadot ecosystem to their users and instead coalescing token balances and NFTs across the ecosystem to deliver an experience much more fitting of a coherent ecosystem which just so happens to be distributed across different chains. The Subwallet, Talisman and Nova teams have been continuously improving, with the last even generalising over exchange systems, routing funds over to the right chain to get the best deal for the user, a great marriage of high-quality UI, XCM and parachain logic.
One initiative which was another recipient of the Decentralised Futures programme, and which has been teased for several weeks now within the community, is the Polkadot App. As mentioned by Björn at Decoded, the app is designed very much to ease on-boarding into Polkadot and avoids all complicating factors, bringing a streamlined user experience laser-focused on delivering the features relevant to the mass market and disregarding all else.
The app functions as a simple and easy non-custodial wallet with support for DOT, KSM and USDT/C. It saves the user from mnemonic-hell by using native iOS/Android functionality to protect and back up the secret keys. It integrates a username registrar ensuring that everyone can get an easy-to-remember name to which funds may be sent, as well as support for one-tap staking via Polkadot’s staking pools. Transfer fees are insignificant and can be paid in stable-coins which, combined with the on/off-ramps (like the Magic Ramp), makes the app a highly agile, decentralised payment system.
Most impressively, it bundles Polkadot Pay as such a ramp, allowing wallet funds to be used to pay for goods and services at a million shops and stores in the United States. As a bonus, not only do you avoid getting shafted with credit-card charges as with other crypto-payment schemes, you actually get rewards whenever you spend!
Easy like a Sonntag Morgen
Over 2024 we have seen the transition away from Polkadot’s legacy technologies accelerate. Usage of Polkadot’s excellent light-client Smoldot is becoming more common and the PolkadotAPI and Substrate Connect projects are gaining traction within the ecosystem. As such, it makes increasing sense for new developments like Kheopswap to use this new and properly resilient stack over the older and less light-client-friendly Polkadot.js. Kudos to those behind these projects for their success as independents within our ecosystem.
For Parity’s part, two important initiatives have been underway in 2024 to help bring Polkadot SDK blockchain deployment closer in ease to that of smart-contracts. The Omninode initiative, released as part of the Polkadot SDK’s December update, introduces a single node binary capable of syncing and collating pretty much any network built on Polkadot SDK. This means one fewer thing to be concerned about for development, deployment and maintenance for all teams with a chain hosted on Polkadot and opens the door to chain-hosting-as-a-service such as Zeeve’s Perfuse system, potentially making the deployment and upkeep of a chain comparably simple to that of a smart-contract!
While the Omninode helps minimise effort in dealing with node-level programming, maintaining the on-chain business logic (the so-called runtime) can still be a daunting task. While Polkadot SDK’s incredible pace of development lets it lead the way for a modular blockchain SDK (so much so that we see other prominent industry teams building with it, like Polygon’s Avail and Cardano’s Midnight), a price has been paid in the volatility of the APIs used. This has caused substantial work for downstream builders and a modicum of stability has long been desired. As of 2024, the Polkadot Stabilisation Initiative has successfully dealt with this by instituting a strict schedule for code releases of the runtime elements of the Polkadot SDK. Major changes (such as the new Transaction Extension API allowing ZK-proofs to be used in place of transaction signatures) may only be made during a major version release, which itself is limited to a quarterly cycle. That means teams building a cutting-edge Polkadot SDK chain get three months of development peace and need concern themselves with code upgrades only once per season. And for teams willing to forgo the latest feature development, we now provide long-term support releases which remain secure and compatible for nine months.
Polkadot 2: Just typed up a little contract
In the last year a major evolution of the original Polkadot platform has taken place. It has largely been delivered, too. While parachains, Polkadot’s secure, fast rollups, were a key element of the original Polkadot product proposition (and, by virtue of slot auctions, of the utility of the DOT token), the real engineering value was applicable well beyond this product. What we had spent three years engineering was the basis of a high-performance "world computer" and a first step toward our wider goal, explicitly stated as far back as the 2016 Polkadot white paper, of delivering a platform suitable for bringing Web2 applications into the Web3 world of resilience, trustlessness and, by extension, truthfulness. In order to distinguish it from the Polkadot product described in the original vision paper, we as a community decided to give a name to this evolved vision and thus Polkadot 2 was born.
And this new understanding was born not of a single person, nor even a company, but rather of many voices within our community, helping showcase Polkadot’s immense resilience and decentralisation. Polkadot 2 is a combination of deep upgrades to the technology core, pivotal new features, and a new conceptual framework with which we are better able to understand Polkadot’s value. Under it, Polkadot’s success is broken down into two main goals: on the one hand improving its community, and on the other maximising the utility of Polkadot’s raw product, coretime. Each of the upgrades, features and conceptual shifts is in pursuit of these goals.
Be in two bands at once
The first of the “big three” upgrades rolled out in 2024 is Asynchronous Backing, an optimisation across the Polkadot stack which brings pipelining into the process of securing a new parachain block. This allows two distinct functions of the Polkadot security system, namely correctness-guaranteeing and data-availability, to happen simultaneously and thus shortens the standard block-period from 12 seconds to six seconds. Along with this change, we became able to increase the time used in the correctness-guaranteeing stage by around 400%, giving an overall throughput increase of around 10x for any blockchain hosted by Polkadot 2 compared to Polkadot 1.
This dramatically improves the transactive capacity of blockchains hosted by Polkadot, effectively driving down the cost per transaction within the ecosystem and making Polkadot a more attractive target on which to build and deploy.
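As a sanity check on those figures, the approximate arithmetic (a rough model under the numbers quoted above, not a benchmark) goes like this:

```rust
// Rough arithmetic behind the ~10x figure quoted above; an approximation
// under the stated figures, not a benchmark.
fn main() {
    let frequency_gain = 12.0_f64 / 6.0; // 12-second blocks become 6-second blocks
    let per_block_gain = 1.0 + 4.0;      // "+400%" execution time per block, i.e. ~5x
    let overall = frequency_gain * per_block_gain;
    println!("approximate throughput multiplier: {overall}x"); // ~10x
}
```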
The secret ingredient is time
The second of the upgrades, known as Agile Coretime, dispensed with Polkadot 1’s deposit, lease and crowd-loan system. In its place we now have a simple monthly auction for each of its 45 cores, allowing anyone to purchase, permissionlessly from Polkadot, one month on a Polkadot core at a time. This coretime can then be chopped up, interleaved and transferred from owner to owner, even allowing many chains to share it for their hosting trustlessly. Specialised exchanges and integrations for trading and procuring coretime, like RegionX, Lastic and Perfuse, already exist on Polkadot.
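To illustrate the chopping-up, here is a deliberately simplified, hypothetical model of a coretime region; the real logic lives in the Polkadot SDK’s broker pallet, and the names, fields and numbers below are illustrative only.

```rust
// A deliberately simplified, hypothetical model of a coretime "region",
// showing how a month on a core can be split and resold. Names, fields
// and numbers are illustrative; the real logic lives in the broker pallet.
#[derive(Clone, Debug)]
struct Region {
    core: u16,     // which core the region refers to
    begin: u32,    // first timeslice (inclusive)
    end: u32,      // last timeslice (exclusive)
    owner: String, // whoever may assign or resell it
}

impl Region {
    // Split one region into two back-to-back regions at `pivot`.
    fn partition(self, pivot: u32) -> Option<(Region, Region)> {
        if pivot <= self.begin || pivot >= self.end {
            return None;
        }
        let first = Region { end: pivot, ..self.clone() };
        let second = Region { begin: pivot, ..self };
        Some((first, second))
    }
}

fn main() {
    let month = Region { core: 7, begin: 0, end: 5_040, owner: "alice".into() };
    if let Some((first_half, second_half)) = month.partition(2_520) {
        println!("{first_half:?}");
        println!("{second_half:?}");
        // Either half could now be transferred to another owner or further
        // subdivided, which is the basis of secondary coretime markets.
    }
}
```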
The change to Agile Coretime actually alters how a DOT token can be used; gone are crowdloans and deposit-auctions, and in their place is a highly responsive purchasing market with the end goal of giving maximum flexibility to consumers while keeping costs predictable. The gravity of this alteration should act as a stark demonstration that Polkadot, despite being highly decentralised, has both the will and the ability to adapt its economics to a changing environment.
Men with Ven
With Polkadot’s ecosystem chains now free from the "one parachain, one slot, one core" system of old, this year has opened up far more interesting uses of Polkadot’s coretime. Short-lived chains are possible. Chains which progress on demand are possible. Chains which share their coretime with others are possible. But perhaps the most interesting avenue is multi-core scaling, which will be pioneered by the new Polkadot Hub: high-frequency, high-performance chains which utilise multiple cores in order to multiply up their volume and slash their latency.
By utilising three cores, the Polkadot Hub will go three times the speed of the Polkadot Relay-chain, confirming a full block every two seconds instead of the usual six. This triples the data and computation bandwidth which would normally be available to it, and provides an unprecedented upgrade path for teams who care about future-proofing their solution. Already even higher frequencies are being experimented with, as Basti (Fellow, VI Dan) tested the first 500ms Substrate chain, a full 12 times more blocks per second than current chains.
As if this weren’t enough, 2024 saw the development of elastic scaling (to be deployed in 2025), which takes this paradigm one step further: it lets chains run cheaply at low speed in times of lesser usage and scale up dynamically in periods where usage peaks, procuring coretime (and confirming blocks more frequently) as needed.
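The idea can be sketched in a few lines; the heuristic below is purely hypothetical and not the actual scheduling or procurement logic, but it captures the demand-driven model.

```rust
// A hypothetical sketch of the idea behind elastic scaling: procure only as
// many cores as current demand warrants, within a budgeted maximum. This is
// not the actual scheduling or procurement logic, just an illustration.
fn cores_needed(pending_weight: u64, per_core_capacity: u64, max_cores: u32) -> u32 {
    let needed = pending_weight.div_ceil(per_core_capacity.max(1));
    (needed as u32).clamp(1, max_cores)
}

fn main() {
    let per_core = 1_000; // arbitrary units of block-building capacity per core
    for backlog in [200u64, 950, 2_400, 7_800] {
        println!(
            "backlog {backlog:>5} -> procure {} core(s)",
            cores_needed(backlog, per_core, 8)
        );
    }
}
```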
No logo on the foam
The technical alterations of Polkadot 2 clearly help Polkadot maximise its utility. But perhaps as important as this technical work is the conceptual work which allows both us, and anyone coming to Polkadot, to better understand what it is that Polkadot does.
The new conceptual framework proposed and debated this year is a brainchild of Shawn Tabrizi (Fellow, IV Dan). Polkadot has traditionally eschewed the label of cryptocurrency or even blockchain, preferring to present itself through a multi-chain ecosystem narrative. Under the original, parachain-centric, product proposal this made a lot of sense. However, defining Polkadot purely in such terms has a blinkering effect, drawing attention away from what lies both above and below the parachain level. It clouds strategy, making certain value-creating paths difficult to see or explain.
The Hub/Cloud duality is an alternative narrative to that of the "parachain community" and it helps draw both intellectual and market attention to Polkadot’s true, underlying value propositions. The metaphor can be used as a conceptual framework to better understand how all of our work, features, ideas and developments, past, present and future, can fit into Polkadot, and ultimately make Polkadot more relevant to the market and the world.
The metaphor partitions Polkadot into two symbiotic product segments. One is centred around the unstoppable compute resources generated by the Polkadot validators and its protocol; this is called the Polkadot Cloud and powers the original Polkadot Parachain product. The other, the Polkadot Hub, is characterised by the services on the various system chains (which sit on top of the original product). Though symbiotic, both products are independently useful and on both this utility is paid for with DOT.
Concretely, the Polkadot Cloud is presently coretime and everything which it can be used for. Hosting Polkadot SDK chains (i.e. parachains) presents the first compelling use-case for coretime, though Parity has plans to launch others during 2025. As the Polkadot Bulletin chain and Small-Statements Hub come online, the Cloud’s offerings will expand into data distribution. The JAM protocol also fits very nicely into this narrative since its coretime is categorically more useful than that of the Relay-chain, immediately bolstering our Cloud offering with all of the Services sitting on the JAM Supercomputer.
On the other hand, Polkadot Hub is in effect the meeting point for community and direct interaction. It provides a highly coherent, generally synchronous place for people, teams and their logic to collaborate and compose. Centred in Hub is the Plaza initiative (proposed earlier this year by Fellow Rob Habermeier, VI Dan), an advanced, Polkadot-native, permissionless, cheap and fast smart-contract system. All of the various functionality across the official Polkadot chains will be made conveniently available in the Hub, including the staking system, governance & collectives, identity & personhood, all tokens & NFTs as well as all bridges. Much of it will be synchronously composable with the smart-contracts.
Polkadot Hub thus sees a unification and optimisation of experience across Polkadot, eliminating the fragmentation caused by exposing the different system chains to our user base, long a point of friction on Polkadot. The Hub is powered by three Cloud cores, giving a two-second block time and thus three times as much grunt as a single-core chain, or about thirty times as much as last year’s parachains. To lift performance even more, the smart-contracts on Hub are PVM-based and powered by Fellow Jan Bujak’s blistering-fast PolkaVM, which achieves up to around 45% of native execution speed, orders of magnitude faster than EVM interpreters.
While Hub smart-contracts are expressed in PVM code (a derivative of the venerable RISC-V ISA), they need not be written in any particular language. Indeed, Hub is 100% compatible with the regular Ethereum developer tooling, including the Solidity language, deployment systems and Metamask. Beyond that, Hub contracts can also be written in Rust, ink!, C, C++ or, well, pretty much any language with a compiler. With PolkaVM, Hub’s bridges and the Cloud’s security, you can get ludicrous Ethereum performance increases from Polkadot without leaving the Ethereum world.
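For a flavour of the Rust route, here is a minimal ink! contract of the canonical "flipper" variety; it is illustrative only, and whether a given ink! toolchain targets PolkaVM or Wasm depends on the version you build with.

```rust
// A minimal ink! smart contract (the canonical "flipper"), shown purely to
// illustrate the Rust route onto Hub; whether it compiles to PolkaVM or Wasm
// depends on the ink!/cargo-contract toolchain in use.
#![cfg_attr(not(feature = "std"), no_std, no_main)]

#[ink::contract]
mod flipper {
    #[ink(storage)]
    pub struct Flipper {
        value: bool,
    }

    impl Flipper {
        /// Create the contract with an initial value.
        #[ink(constructor)]
        pub fn new(init: bool) -> Self {
            Self { value: init }
        }

        /// Toggle the stored boolean.
        #[ink(message)]
        pub fn flip(&mut self) {
            self.value = !self.value;
        }

        /// Read the stored boolean.
        #[ink(message)]
        pub fn get(&self) -> bool {
            self.value
        }
    }
}
```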
Hub is intended to rival even the fastest pseudo-decentralised, synchronous chains in the industry, and thus performance is critical: networking stack improvements have already been made this year, ready for deployment in the coming year. Thrum’s NOMT, a high-performance Merkle trie system introduced this year by Fellows Sergei (V Dan) and Rob, provides an important path of further optimisation to be taken by Hub, allowing performance increases of an order of magnitude or more. And experiments have been underway this year for faster block-times, further upping the multiplier.
An initial testing version of the Hub chain launched on the Westend test-net this month, complete with Ethereum-compatible smart-contract functionality. Expect rollouts in 2025.
It’s the Heart of Darkness
With much of our social interaction patterns now firmly rooted in the digital realm and often textual in nature, Generative Artificial Intelligence presents a truly unique danger to the fabric of the free world. Web2-era proof-of-personhood schemes such as captchas, email/SMS verification and even government IDs are increasingly falling prey to GenAI and malevolent powers. In one recent election alone, at least 66,000 individual accounts were found to be controlled by a single interest, together with 10M followers. Despite the old adage "truth is not determined by majority vote", modern (social) media provide little means of discerning truth beyond popularity, and an apparent groundswell of support for some fiction can make it appear credible to enough people to manifestly distort the democratic path.
In the civilisation we believe we have, AI resistance will increasingly be needed to avoid yet another wave of anti-democratisation and power-consolidation by a few entities. Helpful idiots, greedy corporates and ego-maniacs are forming a concerning union. We cannot look anywhere other than to our assembled selves for an antidote.
In order to give people a voice honest and true, Web3 technology is utterly indispensable. However, we need more than blockchain, even one as resilient and performant as Polkadot: we need to algorithmitise the concept of personhood. And furthermore we must do it without fundamentally compromising Web3 values, not least individual privacy, any further.
Thus begins the vision for our Proof-of-(Polite-)Personhood, announced earlier this year. The compute-heavy logic is hosted on Polkadot but takes advantage of Polkadot’s extreme trust-free connectivity provided by Snowbridge and Hyperbridge to make the service accessible across Web3 and even into Web2.
To be rolled out in (at least) three stages, known as DIM1, DIM2 and DIM3 (DIM means Digital Individuality Mechanism), Polkadot’s Po(P)P is a foundational and uncompromising Web3 individuality system. It aims to avoid any kind of centralisation or privilege, uses the latest ZK technology to protect your privacy and is open and transparent for anyone to audit. Expect to see it launch over 2025.
Polkadot 3: Can you stop Jamming?
While our new Hub and Cloud paradigm underpins Polkadot 2, it also provides the conceptual framework to understand the next chapter in Polkadot’s journey, a transition from the Relay-chain to the JAM protocol. JAM forms the basis of the next generation of Polkadot Cloud services.
2024 saw the initial publication of the JAM protocol, as the Gray Paper, exactly a decade from the publication of Ethereum’s Yellow Paper. Like the protocol of Ethereum’s Yellow Paper, JAM yields a permissionless "world computer", but unlike Ethereum’s, it is not only resilient but highly performant. Projected to have hundreds of computation cores and the best part of a gigabyte per second of data I/O, it aims to be the first real Web3 supercomputer.
The Gray Paper has now gone through numerous revisions and is reaching stability, now at 0.5.3, with the 0.5 series to be finished soon into the new year and a target date for 1.0 in the first half of next year. Unlike the Polkadot Relay chain, which has only one primary (and two partially functional secondary) implementations, 35 teams are each building implementations of JAM using 15 programming languages. Their code is presently unpublished, but expect it to become open next year. They are spurred by the JAM Prize fund, the biggest programming prize in history, with a value well into the tens of millions of USD.
Like Polkadot at present, JAM is not merely decentralised, it is distributed. We scale out, with performance coming as an emergent effect of having many nodes. Unlike engineering for the direct effects of centralised and highly synchronous systems, engineering for emergent effects is categorically more difficult. As such systems grow, emergent effects become deeper and more sophisticated; unexpected emergent behaviour (like performance degradation) becomes very difficult to theorise about, diagnose and fix. Optimisation becomes hard.
So this year we embarked on an industry first. We are building a small supercomputer with sixteen thousand AMD Threadripper cores, totalling 16 GB of L2 cache, 32 TB of RAM and 20 PB of storage. With this hardware we are able to deploy a full-scale JAM network and, along with debugging and analysis software, poke and prod our protocol and implementations of it to get optimal emergent behaviour. This is all part of the effort to ensure our theoretical predictions actually match the practical realities of the globally scaled network.
This project, the Testing, Optimisation, Analysis and Scale-Trial Experimentation Rig (or JAM TOASTER for short 🥲), is currently half-way through the first of a two-phase build-out, with phase one expected to be complete early in the new year and phase two scheduled for the second half of 2025. As a little bonus, it will live in the basement of the Polkadot Palace and use its Jacuzzi as the heat sink. Mmmmm… toasty.
So now we’re working
What about building on JAM? Our first target demonstration, DOOM-on-JAM, is, as you might expect, to execute the regular DOOM game on JAM, under consensus, outputting the video-frames as images into our giant data lake. To do this, we need a service which is able to host a boundless virtual machine, i.e. one which can execute any code you’d care to compile, without annoying limitations like block gas limits or a requirement to use special primitives, languages or programming techniques. This is a first in blockchain and makes it actually Turing-complete. This JAM service, codenamed Alpha, is already under heavy development.
Another very important project to land in 2025 is essentially the transitioning of logic which currently sits embedded in the Polkadot Relay-chain’s runtime into an individually upgradable JAM service. This paves the way for the Polkadot JAM chain to replace the current Polkadot Relay chain as the hosting apparatus for Polkadot’s blockchain ecosystem. In time the Polkadot Blockchain Hosting Service (codename Beta) will be extended to include new JAM-specific functionality for the blockchains, not least Accords (allowing two sovereign blockchains to interact trustlessly) and dynamic metering (to avoid benchmarks in most circumstances).
In order to demonstrate JAM’s unique transactive capacity, not to mention get a solid use-case to help optimise the JAM protocol, we will be creating a simple payments system on JAM. Codenamed Mu, it will prototype a low-latency, high-throughput, multi-currency payments system with the target of breaking a sustained rate of one million transactions per second across JAM’s cores.
Finally, project Lambda combines elements of Alpha and Mu together into a highly scalable actor-based system with dynamic state partitioning to highlight and prove JAM’s highly-scalable mostly-coherent computation model. 2025 will see Parity and others embark on the road of designing and delivering these services.
A powerful sense of dread
In the recent news cycle and weighing on the collective consciousness are the improvements being made in quantum computation. Google was the latest company to announce an important milestone with their project Willow and its error-correction capabilities. While not anywhere close to being actually useful for anything, it does represent the removal of one important roadblock on the path to building a useful quantum computing device.
And one of the first “useful” things (in a staunchly code-is-law sense, anyway) that such a quantum computing device might be able to do is to recover the secret key from a public key. (And not necessarily for a key whose moral owner is the controller of the device.) This obviously breaks much of the implied service of Bitcoin, not to mention Ethereum, Polkadot and just about every crypto out there.
Thankfully, JAM being transactionless and relying primarily on PVM-based authorisers and game-theoretic machinery makes it something of a doddle to transition to being a fully quantum-resistant protocol. Expect 2025 to include a report from our researchers on the precise changes needed to move to a purely quantum-resistant protocol.
That’s total fucking marmalade
We can think of JAM as the “crypto-economic silicon” which takes the raw materials of economic value and real-world machines and transforms them into a single supercomputer which exists not under someone’s control in the real-world but under no-one’s control in the Internet world.
As long as there’s one JAM network secured by the DOT token, then there’s only one such supercomputer and we can just call it The Polkadot Supercomputer. This is the basis of Polkadot 3 and if you understand anything about JAM it’s probably as much of the vision as you know.
The expected performance of The Polkadot Supercomputer (i.e. the service delivered by a single instance of the JAM protocol) is ludicrous by today’s industry standards and impressive by tomorrow’s. But as with all decentralised computer systems thus far, it still falls short when compared against the volume of valuable transactions done across institutions and Web2 services the world over. If Web3 is to be a viable direction for The World, far more grunt is needed, and this additional grunt must not entirely nobble the prospect of state cohesion, nor must it come at the cost of resilience or generality. (Obviously from this thesis it should become clear that purely synchronous, fully-coherent system designs like Solana are a Web3 dead-end as they can only scale up, not out. Their logical end is a single centralised supercomputer.)
In short, we need a viable long-term (5-year) plan to allow Polkadot services to be used not only on a single JAM-powered Polkadot Supercomputer, but many. Very many.
To paraphrase Delenn (of Babylon 5) talking of the starship White Star, the Polkadot Supercomputer was never intended to be one of a kind. It will only be the first. The Polkadot Cloud is intended to have many such JAM Supercomputers, forming a JAM Grid. While the Grid is all secured under the same DOT-staked security umbrella, each JAM hosts its own services. Just as a JAM forges a single supercomputer under no-one’s control, the JAM Grid is the crypto-economic glue which transforms similar resources into a tightly knit cluster of supercomputers… also under no-one’s control, in the Internet world. And it is this which, in time, will constitute much of the Polkadot Cloud.
JAM Services are being designed with a split between the asynchronous, mostly-stateless Refinement part and the highly synchronous, fully-coherent Accumulate part. This design ensures services are expressed in such a way that they can be scaled out across the computation cores of a JAM supercomputer. It so happens that this design also plays well with the prospect of scaling out not just over cores within the same JAM supercomputer, but also beyond to use cores in other JAM supercomputers on the Grid.
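To make the split concrete, here is a purely hypothetical Rust sketch of what a service shaped this way might look like; the names, types and signatures are illustrative and are not the service interface defined in the Gray Paper.

```rust
// A purely hypothetical sketch of the refine/accumulate split described
// above. Names, types and signatures are illustrative only; the actual
// service interface is defined in the Gray Paper, not here.
trait JamService {
    /// Runs in-core: asynchronous, mostly stateless and massively parallel.
    /// It distils a potentially large work payload into a small result.
    fn refine(&self, payload: &[u8]) -> Vec<u8>;

    /// Runs on-chain: synchronous and fully coherent. It folds the small
    /// refined results into the service's persistent state.
    fn accumulate(&mut self, refined: &[Vec<u8>]);
}

/// A toy service which counts the bytes seen across all work items.
struct ByteCounter {
    total: u64,
}

impl JamService for ByteCounter {
    fn refine(&self, payload: &[u8]) -> Vec<u8> {
        // The heavy lifting would happen here; we merely summarise the payload.
        (payload.len() as u64).to_le_bytes().to_vec()
    }

    fn accumulate(&mut self, refined: &[Vec<u8>]) {
        for result in refined {
            let mut buf = [0u8; 8];
            buf.copy_from_slice(&result[..8]);
            self.total += u64::from_le_bytes(buf);
        }
    }
}

fn main() {
    let mut service = ByteCounter { total: 0 };
    // Refinement could run on many cores in parallel; here we simply map.
    let refined: Vec<Vec<u8>> = [b"hello".as_slice(), b"jam grid".as_slice()]
        .iter()
        .map(|payload| service.refine(payload))
        .collect();
    service.accumulate(&refined);
    println!("total bytes accumulated: {}", service.total); // 13
}
```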
A few open questions remain: just how transparent will it be to scale a service beyond a single JAM? How important is a JAM’s position in the lattice? Are the data-availability guarantees the same throughout? These will be pondered over the coming year as 1.0 of JAM takes shape. And how much power could the Polkadot Cloud have with a JAM Grid? A rough ballpark could be a size-10 lattice of JAMs, giving an aggregate DA of over an Exabyte, data bandwidth of over 600 GB/s and compute power of maybe 1 quadrillion EVM-equivalent gas/second.
In terms of raw compute, that would be enough to give every individual on the planet several times as much block-space as the whole of the current Ethereum L1 chain; enough to have a good stab at processing not merely everyone’s signed transactions but also everyone’s bots, actors and devices, a necessary requirement if we are to live in a digital world not under the control of delegate interests.
Alright. Laters. I’m sprinting to Londis
And so begins Polkadot’s 2025. As with all such write-ups, much of note could not be included; for this, I give my abject apologies to all those projects and people in our midst whose tireless efforts help us deliver a true Web3 future.
There is much on Polkadot’s horizon. The App, Hub, Citizenship, DAO expansion, JAM and deep, trustless and effortless Ethereum integration. So much to be excited about and so much to be building with. So, take a hard-earned rest, go watch something nostalgic (Die Hard and Home Alone would be my personal picks), consider the above and come back in the new year to buidl buidl buidl and never look back.
Happy holidays 🎄