The Future of Tokenization: Beyond the Hype and Into Real-World Application
We’re Only at the Beginning of a Massive Market Transformation
The conversation around tokenization – the process of representing traditional assets on blockchain platforms like Ethereum – is just getting started, and the potential market is staggering. According to Min Lin, managing director of global expansion at Ondo, the U.S. Treasuries market alone is valued at an eye-watering $29 trillion. Add global equities and that figure balloons to approximately $127 trillion, with the United States accounting for $69 trillion of the total. These aren't abstract numbers; they represent real assets that could be transformed through blockchain technology. Lin shared these figures at CoinDesk's Consensus Hong Kong conference, noting that despite the enormous opportunity, the industry is still in its infancy when it comes to actually tokenizing traditional financial assets. The technology exists, the infrastructure is being built, and traditional finance is showing genuine interest in tokenized real-world assets (RWAs). Yet the gap between theoretical potential and practical implementation remains wide, and bridging it requires more than technological capability: it demands careful strategy, regulatory compliance, and genuine utility.
The Critical Challenge: Matching Hype with Real Utility
While the numbers are dizzying and interest from traditional finance is undeniable, Graham Ferguson, head of ecosystem at Securitize, emphasized that the industry must take care to match the hype with real-world utility. Ferguson candidly admitted that the industry hasn't historically done a good job of ascribing genuine utility to tokenized assets. The challenge isn't creating tokens; technology has made that relatively straightforward. The real challenge lies in distribution and in building use cases that offer tangible benefits over existing systems. With countless assets that could be tokenized and a wide range of approaches available, Ferguson stressed, the industry must figure out how to unite the hype with practical application and bring these elements together in a meaningful way. This honest assessment reflects a growing maturity in the blockchain space, where participants increasingly recognize that innovation for its own sake isn't enough. Tokenization needs to demonstrate clear advantages over traditional systems, whether in settlement speed, cost reduction, accessibility, or capabilities that weren't previously possible. Without that demonstrated utility, tokenization risks becoming just another buzzword that fails to deliver on its promise.
Regulatory Clarity: Walking the Tightrope
Ferguson emphasized the importance of not “jumping the gun on the regulatory side of things,” recognizing that regulatory compliance isn't an obstacle to be circumvented but a necessary foundation for sustainable growth. There are encouraging signs that U.S. regulators, particularly the Securities and Exchange Commission (SEC), are beginning to understand that tokenization can form the fundamental plumbing of future markets rather than creating “isolated compliance islands.” Ferguson pointed out that his company has been highlighting the benefits of the improved settlement processes tokenization enables, including programmatic compliance built directly into the token standard itself and the transferability of assets among individuals who have completed know-your-customer (KYC) verification. Securitize's approach has always been to work in lockstep with regulators, operating as a regulated transfer agent and broker-dealer in both the United States and the European Union, doing things “by the book,” as Ferguson put it. While this regulatory-first approach may seem conservative compared to some crypto projects, it's proving essential for bringing institutional capital and traditional finance participants into the tokenization space, as they require the legal certainty that comes from proper compliance frameworks.
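The idea of compliance embedded in the token standard itself can be illustrated with a minimal sketch. The class and method names below are hypothetical, not Securitize's actual implementation; the point is simply that a transfer can only succeed when both counterparties have cleared KYC.

```python
class ComplianceToken:
    """Toy security token: transfers succeed only between KYC-verified
    addresses. A hypothetical illustration of compliance rules living in
    the token standard itself, not any vendor's real contract."""

    def __init__(self):
        self.kyc_verified = set()   # addresses that have passed KYC checks
        self.balances = {}          # address -> token balance

    def verify(self, address: str) -> None:
        """Mark an address as KYC-verified (done off-chain by a transfer agent)."""
        self.kyc_verified.add(address)

    def mint(self, to: str, amount: int) -> None:
        if to not in self.kyc_verified:
            raise PermissionError(f"{to} has not completed KYC")
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # The compliance check is part of the token's transfer logic:
        # both parties must be KYC-verified before value can move.
        if sender not in self.kyc_verified or recipient not in self.kyc_verified:
            raise PermissionError("both parties must be KYC-verified")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
```

Because the check runs on every transfer, an unverified address can never receive the asset, which is what lets such tokens move peer-to-peer without leaving the compliance perimeter.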
Two Different Approaches: Wrapping vs. Native Issuance
The tokenization industry has essentially developed two primary approaches, exemplified by the different strategies of Ondo and Securitize. Ondo's focus is primarily on efficiency through what's called a “wrapper model” – quickly and easily wrapping existing assets in a token representation. This approach allows for rapid deployment and scalability; Ondo was able to tokenize BitGo stock within just 15 minutes of the firm starting to trade on public markets. Lin explained that the wrapper model has enabled Ondo to scale quickly, with more than 200 tokenized stocks and ETFs today and plans to scale to thousands. Lin drew a parallel to stablecoins, which are essentially wrapped U.S. dollars, suggesting that Ondo has adopted a similar, proven model. Securitize's approach, by contrast, involves issuing securities natively on the blockchain and working through the jurisdictional compliance complexities that entails. This more comprehensive approach comes with its own challenges, particularly when working with decentralized finance (DeFi) protocols, because of the need to track who the beneficial owner of an asset is at every point in time. Ferguson acknowledged that the crypto and DeFi ecosystem is used to massive pools of assets, so Securitize is focused on finding ways to work with these protocols while implementing the tracking mechanisms required to trade and transfer securities, even if “it's not necessarily the most DeFi comfortable approach.”
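The wrapper model Lin describes can be sketched in a few lines (the names here are illustrative, not Ondo's actual system): tokens are minted one-for-one against shares held in custody and burned on redemption, much as a stablecoin is minted against dollars held in reserve.

```python
class WrappedAsset:
    """Toy wrapper token: minted 1:1 against shares held by a custodian.
    A hypothetical sketch of the wrapper model, not any issuer's real system."""

    def __init__(self, symbol: str):
        self.symbol = symbol
        self.custodied_shares = 0   # underlying shares held off-chain
        self.token_supply = 0       # wrapped tokens in circulation

    def deposit(self, shares: int) -> int:
        """Custodian receives shares; an equal number of tokens is minted."""
        self.custodied_shares += shares
        self.token_supply += shares
        return shares

    def redeem(self, tokens: int) -> int:
        """Tokens are burned; an equal number of shares is released."""
        if tokens > self.token_supply:
            raise ValueError("cannot redeem more than outstanding supply")
        self.token_supply -= tokens
        self.custodied_shares -= tokens
        return tokens

    def fully_backed(self) -> bool:
        # The invariant that makes a wrapper credible: every token
        # outstanding corresponds to a share held in custody.
        return self.custodied_shares == self.token_supply
```

Because the token itself carries no issuer-specific logic, the same mechanics apply to any underlying asset, which is why the model scales to hundreds of stocks and ETFs so quickly.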
Permissioned vs. Permissionless: Finding the Right Balance
Lin outlined how Ondo's tokenization products fall into either a permissioned or a permissionless camp, depending on the asset type and regulatory requirements. For example, OUSG, the Ondo Short-Term US Treasuries Fund, is available to a global audience but is permissioned, meaning users can only transfer the asset to whitelisted addresses. This ensures compliance with securities regulations while still providing broader access than traditional finance typically allows. Ondo Global Markets, by contrast, tokenizes publicly traded U.S. stocks and ETFs using a permissionless approach following a given compliance period, though this product is only available to investors outside the United States due to regulatory considerations. The permissionless approach allows free peer-to-peer transfer within DeFi ecosystems, enabling users to lend through DeFi protocols and post tokenized assets as margin collateral. This represents a significant innovation: Ondo recently announced Ondo Perps, whereby tokenized equities can be used as margin collateral directly, rather than requiring users to convert to stablecoins for collateral on exchanges or decentralized exchanges (DEXs). This development demonstrates how tokenization can create financial possibilities that weren't available in traditional finance, where the separation between asset classes created friction and limited composability.
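The split Lin describes could be modeled as a per-asset transfer policy. The names and parameters below are hypothetical, not Ondo's actual contracts: a permissioned asset checks a whitelist on every transfer, while a permissionless one restricts transfers only until its compliance period has elapsed.

```python
from dataclasses import dataclass, field

@dataclass
class TransferPolicy:
    """Toy per-asset transfer policy. Permissioned assets check a whitelist
    on every transfer; permissionless assets trade freely once a compliance
    period (days since issuance) has elapsed. Hypothetical sketch only."""
    permissioned: bool
    compliance_period_days: int = 0
    whitelist: set = field(default_factory=set)

    def can_transfer(self, recipient: str, days_since_issuance: int) -> bool:
        if self.permissioned:
            # e.g. a Treasuries product: only whitelisted addresses may receive it
            return recipient in self.whitelist
        # e.g. a tokenized stock: free peer-to-peer transfer after the period
        return days_since_issuance >= self.compliance_period_days
```

Keeping the rule at the asset level is what lets one issuer run both models side by side: the permissioned fund and the freely transferable stock token share infrastructure while enforcing different transfer constraints.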
The Road Ahead: Scaling Responsibly
The tokenization industry stands at a fascinating crossroads, with enormous potential on one side and significant practical challenges on the other. The market opportunity is undeniable – trillions of dollars in assets that could benefit from the improved settlement, programmability, and accessibility that blockchain technology enables. However, realizing this potential requires more than just wrapping assets in tokens or creating new blockchain-based securities. It demands a careful balance between innovation and regulation, between moving fast and building sustainably, between creating new possibilities and maintaining the investor protections that make financial markets function. Both the wrapper approach exemplified by Ondo and the native issuance approach exemplified by Securitize have their place in this emerging ecosystem, serving different needs and use cases. The wrapper model offers speed and scalability, allowing rapid tokenization of thousands of assets and quick integration with DeFi protocols. The native issuance approach offers deeper integration with regulatory frameworks and more comprehensive compliance tracking, which may be essential for certain institutional applications and regulated markets. As the industry matures, we’re likely to see both approaches coexist and evolve, perhaps even converge in some ways, as best practices emerge and regulatory frameworks become clearer. The key will be maintaining focus on genuine utility – solving real problems, reducing real costs, and creating real value for users – rather than getting caught up in hype cycles that promise transformation but deliver little substance. With regulatory clarity gradually improving and industry participants showing increasing sophistication about both the opportunities and challenges, tokenization may finally be ready to move from promise to practice.