Introduction
If you type ‘tokenization’ into LinkedIn’s search engine,Footnote 1 you immediately come face to face with an array of promises, stories, and imaginaries about the revolutionary potential of digital-ledger technologies, and especially payment token systems, to replace or augment existing financial systems, processes, and infrastructures. Some people lay out the benefits of tokenization to financial markets, whether that is reducing fraud, speeding up settlement, improving compliance, or supporting scalability; others simply outline the tokenization process, such as how financial securities can be tokenized; and yet others delve into the complexities of and between token types and their governance. Generally, though, there are two major commonalities across most of these LinkedIn posts: first, tokenization is presented as an inevitability that financial actors – whether central bankers, financiers, or asset-holders – will have to contend with in the coming years; and second, tokenization is presented as promising to democratize finance in unpredictable, although assumed to be beneficial, ways, such as by opening up access to capital and investment opportunities to non-financial actors (see also Schär, 2021; Freni, Ferro, and Moncada, 2022; Schwarcz and Bourret, 2023).
Tokenization itself is a term used to denote ‘the process of generating [and recording] a digital representation of traditional assets on a programmable platform’ (BIS, 2023a: 1; see also BIS, 2024). Tokenization entails the use of distributed ledger (DL) technologies (e.g., blockchain platforms) to create ‘tokens’ that are constituted as entries in a distributed database (i.e., ledger) and which contain both information (e.g., on transactions) and executable programs for the issuance, recording, and transference of tokens on the platform. Consequently, tokens have to be understood as both information and application. Tokenization represents a second-generation DL technology system; specifically, a system in which commands are programmable on a blockchain conditional upon information both on-chain (i.e., an effect of information on the ledger) and off-chain (i.e., an effect of information outside the ledger) (Caliskan, 2023). This sets tokenization apart from earlier blockchain applications, especially cryptocurrencies, which can be defined as digital ‘native’ tokens; that is, cryptocurrencies (e.g., Bitcoin) are a fungible but digital-only medium of exchange or payment (SNA, 2025) with distinct materialities (Caliskan, 2020).
Tokenization is a complex process, which we explain in subsequent sections, and one that can generate significant confusion because of its diverse deployment and misunderstandings about the relationship between the technological and financial sides of the process. Advocates of tokenization, in particular, exhibit a certain technological enthusiasm, placing significant stress on the possibilities that DL technologies are expected to realize for releasing financial resources and opening up access to those resources beyond an expert financial elite.Footnote 2 For example, Larry Fink, CEO of BlackRock, states:
We believe the next step going forward will be the tokenization of financial assets, and that means every stock, every bond will have its own basically CUSIP [Committee on Uniform Security Identification Procedures]; it’ll be on one general ledger. Every investor, you and I, will have our own number, our own identification. We could rid ourselves of all issues around listing activities (…) But the most important thing, we could customize strategies through tokenization that fit every individual. (Bloomberg, 2024)
Tokenization is presented as a techno-financial process that will democratize finance, because the underlying DL technologies are framed and understood as inherently distributed, decentralized, and consensus-based; consequently, advocates think that DL technologies will undermine prevailing centralized clearing and settlement systems (BCG, Aptos Ascend, and Invesco, 2024). Furthermore, tokenization is presented as a way to improve the liquidity of financial and non-financial assets, especially and increasingly real-world assets (RWAs) that can be represented via tokenized securities (The Block, 2024).
The growing popularity of tokenization as a techno-financial process reflects a broader political-economic trend characterized by the massive expansion of private capital markets (compared to public capital markets). A 2022 article in The Economist illustrates this trend, highlighting that the ‘net asset value’ of private capital grew five times faster than public market capitalization between 2000 and 2020 (The Economist, 2022). According to EY, in 2023 private market assets totaled US$24.4 trillion globally.Footnote 3 A major driver of this expansion has been the zero interest rate policies pursued by a range of countries following the 2008 global financial crisis (Leonard, 2022). Investors have sought out alternative assets and asset classes to invest in as the returns from safe investments (e.g., government bonds) have declined. The resulting explosion in private finance is highly heterogeneous, encompassing venture capital, private equity, private debt, real estate, and infrastructure assets; it is characterized by liquidity limits, the dominance of sophisticated investors, and lighter financial regulation (Aramonte and Avalos, 2021). This heterogeneity leaves considerable room for financial experimentation, including tokenization and the construction of diverse tokenized assets.
Our question for this article emerges from these issues: what are the implications of tokenization and the promise of techno-financial democratization of and for financial markets?
To address this question, we draw on the concept of assetization developed at the intersection of science and technology studies and political economy (Birch, 2017; Birch and Muniesa, 2020). Assetization is a useful analytical tool to unpack the diversity in tokenization that is sometimes obscured in both vernacular and academic discussions of DL technologies and their deployment in finance. Tokenization is used, for example, to refer to the construction of different digital assets, which can be simplistically split into payment tokens, utility tokens, and asset tokens – we could also add hybrid tokens to this list. Each type of token represents a different kind of digital asset, entailing different practical and analytical concerns. We specifically concentrate on ‘asset tokens’ in this article in order to focus our attention and analysis on those tokens that most closely resemble financial securities. We aim to unpack the construction, governance, and implications of asset tokenization within and for the financial system, illustrating, in particular, how DL technologies cannot and do not deterministically transcend financial laws and regulations.
Rather than technologically creating a new democratized financial system – in contrast to Bitcoin, where peers exercise the exclusive right to transfer data as both representation and value through a digital ledger technology (DLT) based accounting system without the supervision of an intermediary (Caliskan, 2020) – tokenization increasingly entails the integration of DL technologies within existing financial systems, regulations, and infrastructures, generating contradictions in the process. What this might mean for the proposed benefits and promises of tokenization is an open question. To unpack this tokenization process more concretely, we draw on a number of in-depth qualitative interviews with financial professionals involved in the creation and trading of alternative assets in Europe. We use this empirical material as part of a mainly conceptual discussion about tokenization, its evolution, and the contradictions underlying the premise that it will lead to techno-financial democratization as more people gain access to both capital markets (for financing) and financial markets (for investment).
Techno-financial assetization
There has been considerable interest in the ascendance of the asset form across the social sciences, entailing a range of different analytical and empirical starting points. The starting point for this literature is the view that our societies and economies are increasingly defined by the asset form. This focus on the asset and the transformation of things into assets (i.e., assetization) can be seen in research on the centrality of housing to welfare systems or social reproduction (Doling and Ronald, 2010; Birch, 2015; Adkins, Cooper, and Konings, 2020); in the rise of intangible assets, such as knowledge, as the foundation of innovation and competitiveness (Rikap, 2021; Schwartz, 2022); and in the transformation of digital data into a new asset class (Beauvisage and Mellet, 2020; Birch, Cochrane, and Ward, 2021). These are just a few of the topics addressed in this literature. There is now a burgeoning body of work examining this ‘asset shift’ and ‘asset condition’ in contemporary capitalism, concerned as much with the analytical implications of assets in our societies (e.g., Muniesa et al., 2017; Birch and Ward, 2024; Chiapello, 2024; Tellmann, Braun, and Brandl, 2024) as with the normative, governance, or other practical implications of assets and assetization for how we collectively organize ourselves and our collective resources (e.g., Birch, 2024; Doganova, 2024; White, 2024).
With regards to ‘finance’ as the unit of analysis, there has been a growing debate about how to integrate the emerging critical analytical takes on the asset form within ongoing debates about financialization (e.g., Langley, 2021; Golka, van der Zwan, and van der Heide, 2024). It is possible to trace the intellectual lineage of these current debates back to earlier literature, especially research on specifically financial assets and the rise and role of institutional investors. Writing in the 2000s, for example, Davis (2008) discussed the concentration of financial assets in the hands of a small number of institutional investors and the systemic instabilities this causes, while Wray (2010) drew upon Minsky’s ideas to outline the responsibility of ‘money manager capitalism’ for the 2008 Global Financial Crisis. Today, much of this debate centers on the rising importance of financial assets in ‘asset-manager capitalism’, research pioneered by Braun (2016a, 2016b, 2022).
Asset-manager capitalism tends to focus analytical attention on public markets through its concern with institutional investors. This is for good reason. Braun (2016a) argues that there was more interest in private capital markets in the early 2000s in critical studies of finance, which focused more on hedge funds, private equity, and venture capital. Then as now, and despite the considerable rise in private assets under management, these private capital markets represent a small proportion of total financial assets – today around 10–11 percent globally (The Economist, 2022). Like Davis (2008) before him, Braun (2016a) emphasizes the concentration of public capital markets in the hands of a few institutional investors, especially mutual funds like BlackRock. The growing interest in asset management, then, reflects a recentering of interest on the largest financial players (i.e., institutional investors) and their role in configuring wider capitalist dynamics, especially in public capital markets. This can be seen in Braun’s (2016b) discussion of the rise of passive investors, exemplified by the emergence of exchange traded funds (ETFs) in the early 1990s and their subsequent influence on financial markets. For Braun (2022: 631), financialization increasingly entails a shift in financial markets away from ‘financing’ (e.g., capital investment in businesses to expand production) towards ‘wealth preservation’ (e.g., capital investment in financial assets) underpinned by the structural power of a small number of asset management firms (e.g., BlackRock, Vanguard, or State Street).
Financial assets, on which asset-manager capitalism is based, are made – and made in particular ways. Almost anything can be configured as a (financial) asset with the right social, technical, and legal arrangement; it is this configuration that is conceptualized as a process of assetization (Birch, 2017; Birch and Muniesa, 2020; see also Pistor, 2019). With regards to finance, assetization provides a useful analytical lens to unpack the construction of financial assets and especially the range of new asset classes configured via tokenization. According to the International Financial Reporting Standards, financial assets include: cash, equity, contractual rights to receive cash or exchange an asset or liability, and contracts that may be derivative or non-derivative and settled in an entity’s own equity (Muc, 2025). Our focus in this article is specifically on securities as financial assets; these can be defined as a right or claim (to future returns) that is embedded in an instrument that can then be owned and traded (Layr, 2021). While some countries have sought to create clear definitions of what constitutes a security, others, such as the USA, are more open. For example, US securities law relies on the so-called ‘Howey Test’, established by the US Supreme Court’s decision in the well-known case SEC v. W.J. Howey Co. (1946). This test identifies whether an instrument is an investment contract and therefore falls within the scope of the securities regime. Something is a security if it is: an investment of money in a common enterprise from which investors expect to receive a return (e.g., dividends or capital gains) on the basis of the managerial effort of others (Garrido, 2023: 43).
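To make the four prongs of this test concrete, the following is a minimal, schematic sketch in Python. It is an illustration of the decision rule only – not legal advice, not a tool used in our sources – and all names and structures are our own assumptions.

```python
# Schematic sketch (illustrative only): the four prongs of the Howey Test
# expressed as a simple predicate. Field names are our own assumptions.
from dataclasses import dataclass

@dataclass
class Instrument:
    investment_of_money: bool    # money (or value) is invested
    common_enterprise: bool      # pooled with other investors' stakes
    expectation_of_profit: bool  # investors expect dividends/capital gains
    efforts_of_others: bool      # returns depend on others' managerial effort

def is_investment_contract(i: Instrument) -> bool:
    """All four prongs must hold for an instrument to count as an
    investment contract under SEC v. W.J. Howey Co. (1946)."""
    return (i.investment_of_money and i.common_enterprise
            and i.expectation_of_profit and i.efforts_of_others)

# Example: a token presale whose buyers expect profit from the issuer's work
token_presale = Instrument(True, True, True, True)
assert is_investment_contract(token_presale)  # treated as a securities offering
```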
From the perspective of assetization, financial securities are analytically interesting because they are quite clearly socio-legal constructs (e.g., incorporating a claim) that also have a certain physical form (e.g., a certificate document). Originally, securities certificates could be bearer instruments, belonging to whoever held them, or registered instruments, belonging to whoever was registered as the owner or rights-holder in the document or a central registry. Securities are also interesting because they enable the construction of a range of potential financial assets and asset relationships; they can be, in this sense, quite heterogeneous. Here, Pistor (2019) makes a useful analytical intervention in discussing the heterogeneity of assets. She highlights four legal dimensions to assets – or ‘capital’ in her terms – covering priority, durability, universality, and convertibility. Priority represents the ranking of legal claims to an asset, such as property rights; durability refers to the extension of legal claims over time, including their manifestation as things (e.g., firm, instrument); universality refers to the extension of legal claims across parties, such that arrangements are honored and enforced; and convertibility refers to the conversion of an asset into fiat money (Pistor, 2019: 13–15). Durability fits into wider debates about the materiality and infrastructural dimensions of financial assets, and especially the reshaping of the governance of traditional securitization and securities clearance originally designed to enable the convertibility of physical property into capital, and back again, and therefore integral to capital mobility (Maurer, 1999). However, the blockchain infrastructure that enabled Bitcoin and similar native currencies endowed them with a historically distinct and categorically different materiality from both paper money and its digital representations (Caliskan, 2020). Such cryptocurrencies are defined by the unspent transaction output (UTXO) model, in which ‘unlocking conditions are directly stored on the specific asset’; this contrasts with later account-based models constituted by cryptographically registered accounts connected via smart contracts, similar to conventional financial markets (European Commission, 2024: 9).
In light of these discussions of the characteristics of assets, it is important to understand the social, technical, and legal arrangements that constitute an asset by examining the specifics of the assetization process; that is, what makes something into an asset. To examine this assetization process, it is useful to think about the techno-economic architectures that define financial (and other) markets. According to Callon (2021: 35), markets can be understood as an arrangement of sites, relations and encounters, mechanisms of competition, and the institutions that stabilize markets. Building on these ideas, Birch, Komljenovic, and Sellar (2025) argue that it is possible to analyze the ‘architectures of assetization’ representing the configuration and reconfiguration of different market arrangements such that they constitute a specific asset (and asset market). For example, Mützel (2021) outlines the changing imaginaries underpinning payments and payment architectures over time that lead to the monetization of data as economic assets (see also Mützel and Unternährer, 2024); Sippel (2023) shows how agricultural land markets in Australia are constituted and reconstituted by a set of imaginaries, data, and digital technologies; White (2024) argues that rental housing is increasingly disassembled and redefined as a ratio of beds to other living space (e.g., kitchen); and Rella (2023) demonstrates how changes in the infrastructure and hardware architectures underpinning video game markets dynamically influence multiple industries and markets, including crypto assets and artificial intelligence. These are just four examples of how assets are constructed and reconstructed in new ways through specific and evolving ‘market’ architectures. As such, the notion of architectures of assetization is analytically helpful for capturing the dynamic nature of assets and assetization, which can be missing in earlier conceptual framings (Callon, Caliskan, and MacKenzie, 2025).
When it comes to financial assets, these architectures of assetization not only reflect the transformation of things into assets; they also constitute a new form of governance via assetization (Birch, 2024). Constructing a financial asset requires a particular architecture to configure financial claims as divorced from persons and incorporated in instruments, such as certificates. This architecture of financial assetization is dynamic and constantly changing as market actors experiment with new ways to shape the world to suit their goals; technological innovation is deeply implicated in such experimentation, being a major driver for changes to social structures, relations, and governance mechanisms. For example, Engels, Wentland, and Pfotenhauer (2019) argue that ‘experiments’ – especially living or real-world labs – represent a new form of governance mechanism in which experiments come to reshape the world as a way to adapt it to the emerging technology. The same implications for governance are present with tokenization; for example, tokens are deployed in experiments and regulatory sandboxes (e.g., Helvetia, Jura, Dunbar, and other projects – see BIS, 2024) as a way to reshape financial markets in order to adapt the financial system to DL technologies. As Proskurovska (2023) shows, however, the results of this ‘co-designing activity’ have yet to meet the intended decentralization purpose: even when blockchain innovations are state-led, incumbent financial institutions still shape the specific framing and architecture of financial markets through cycles of (re)intermediation. Campbell-Verduyn (2023) observes similar dynamics in blockchain climate finance experiments, showing how purported improvements can often, unexpectedly, reflect and extend existing centralized market-based climate governance arrangements, even in projects that claim to avoid centralization. The important analytical point we want to consider is how tokenization’s deployment shapes financial markets and financial governance. Does it lead to techno-financial democratization, as asserted by advocates, or does it lead to other, perhaps more problematic, outcomes?
Unpacking tokenization in financial markets
Within the context of financial markets, tokens are now understood and treated as digital representations of traditional assets that are generated and recorded on a programmable DLT platform (e.g., blockchain) (BIS, 2023a; BIS, 2024). As digital representations, tokens are entries in DLT databases (i.e., the ledger) that can be transacted on programmable DLT platforms through embedded and executable code – often associated with ‘smart contracts’ – which deals with the issuance, recording, and transfer of the token within the platform (and across platforms where the digital infrastructure exists). As a process, tokenization entails: (1) the generation and recording of information as an entry (or ‘token’) in a ledger/database; and (2) the deployment of executable code on a programmable platform that updates the ledger/database according to ‘a pre-defined logic’ (BIS, 2024: 7) or set of protocols (FSB, 2024). As an asset, a token’s features are defined both by information (e.g., entry in a database) and application (e.g., programmed logic). Tokens become assets via their origins in and interaction with DLT platforms; generally, a DLT platform is defined by a ‘core’ layer constituting the database/ledger ‘containing information about tokenized asset and its ownership’ and a ‘service’ layer ‘embedding the platform’s rules and governance’ (BIS, 2023b: 1).
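As a rough illustration of this two-layer model, consider the following minimal Python sketch: a ‘core’ layer holding ledger entries and a ‘service’ layer of rules that governs how entries are updated. The class, method, and rule names are our own illustrative assumptions, not an actual DLT implementation.

```python
# A minimal sketch of the two-layer platform model described above:
# a 'core' layer holding ledger entries and a 'service' layer whose
# pre-defined logic is the only way entries get updated. Illustrative only.

class TokenPlatform:
    def __init__(self):
        self.ledger = {}   # core layer: token_id -> {"owner": ..., "info": ...}
        self.rules = []    # service layer: callables that veto invalid transfers

    def issue(self, token_id, owner, info):
        """Generate and record a token as an entry in the ledger."""
        self.ledger[token_id] = {"owner": owner, "info": info}

    def transfer(self, token_id, new_owner):
        entry = self.ledger[token_id]
        # executable logic: every rule must approve before the ledger updates
        if all(rule(entry, new_owner) for rule in self.rules):
            entry["owner"] = new_owner
            return True
        return False

platform = TokenPlatform()
platform.rules.append(lambda entry, new_owner: new_owner is not None)
platform.issue("BOND-001", owner="alice", info={"coupon": 0.03})
platform.transfer("BOND-001", "bob")  # succeeds: the service-layer rule approves
```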
Tokenization is not homogeneous; there are a number of different token types, reflecting different technological and political-economic characteristics as well as governance configurations (see Freni, Ferro, and Moncada, 2020; Heines et al., 2021; Freni et al., 2022). Part of the difficulty with defining and understanding tokenization is the everyday conflation of one token type (e.g., cryptocurrency) with another token type (e.g., crypto asset). However, most financial institutions and regulators trying to develop techno-financial architectures to deal with tokenization differentiate between the following token types: (1) payment tokens, (2) utility tokens, (3) asset tokens, and (4) hybrid tokens (e.g., Savelyev, 2018; Garrido, 2023; Favre and Elsener, 2024; SNA, 2025). A key set of issues across these token types has been the lack of legal recognition regarding tokens being treatable as legal tender or even objects (Layr, 2021); smart contracts as legal contracts (Garrido, 2023); and the tax implications of tokenization (Fintech Executive #1, 2024). Jurisdictions have had to supplement their financial laws and regulations in order to address these ambiguities; for example, the DLT Act in Switzerland, the TVTG Act in Liechtenstein, and MiCAR in the EU. These regulations tend to categorize tokens based on their function and purpose. For example, the Swiss Financial Market Supervisory Authority (FINMA) identifies three main types of tokens (FINMA, 2018). Payment tokens are intended primarily as a means of payment and do not grant access to specific applications or services. Utility tokens provide digital access to a blockchain-based application, product, or service. Asset tokens, on the other hand, represent ownership rights or claims to RWAs – such as shares, debt, or profit participation – and are comparable to traditional financial instruments (e.g., securities).
Payment tokens are defined as ‘native’ tokens in that they are digital records that do not have a relationship with a right or asset outside the specific digital ledger on which they are generated; they are digitally native in both their characteristics and functions. They exist on a single platform, which can be permissionless and accessible to all. Payment tokens are most often associated with cryptocurrencies (e.g., Bitcoin), which can be used as money where their value is really only an effect of people’s collective belief in that value (Zook and Blankenship, 2018; Garrido, 2023). Other tokens use smart contracts to connect rights to things or functions; in the case of utility tokens, this means the right of access to or use of a (digital or non-digital) good or service provided by an issuer. Unlike payment and utility tokens, asset tokens reflect the use of DLT to represent an asset or right (beyond access); these assets and rights can be digitally native (e.g., on-chain claims to digital things) or non-native (e.g., off-chain debt or equity). For our purposes in this article, we are most interested in asset tokens that represent non-native assets or rights to RWAs, since these token types are most relevant for changes in financial markets.
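These distinctions can be summarized schematically by asking what, if anything, a token references beyond its own ledger. The Python sketch below is a purely illustrative decision rule based on the FINMA-style categories above; the enum values and function are our own assumptions.

```python
# A compact, illustrative sketch of the token taxonomy above, keyed by what
# (if anything) a token references outside its own ledger.
from enum import Enum

class TokenType(Enum):
    PAYMENT = "native digital record; no claim outside its own ledger"
    UTILITY = "right of access to a good or service provided by an issuer"
    ASSET = "claim on a right or real-world asset (e.g., equity, debt)"
    HYBRID = "combines features of the other types"

def classify(references_offchain_asset: bool, grants_access: bool) -> TokenType:
    """Rough decision rule: what does the token point at?"""
    if references_offchain_asset:
        return TokenType.ASSET      # most closely resembles a security
    if grants_access:
        return TokenType.UTILITY
    return TokenType.PAYMENT        # e.g., Bitcoin: digitally native

assert classify(references_offchain_asset=True, grants_access=False) is TokenType.ASSET
```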
Increasingly, financial regulators apply traditional models to these asset tokens, treating and trying to regulate them as securities. Thus, the EU’s Markets in Crypto-Assets Regulation (MiCAR), adopted in 2023, classifies crypto assets into three main categories: (1) e-money tokens, which reference only one official currency; (2) asset-referenced tokens, whose value is backed by a basket of assets; and (3) other crypto assets, including utility tokens, that do not fall into the previous two categories. Notably, MiCAR excludes security tokens and non-fungible tokens (NFTs) from its scope. In contrast, tokens that qualify as transferable securities or other MiFID II (Markets in Financial Instruments Directive) financial instruments are subject to the full suite of EU securities laws, including the Prospectus Regulation, licensing requirements, and post-trade obligations under the Central Securities Depositories Regulation (CSDR).
A similar approach has been taken by the Bank for International Settlements (BIS), as reflected in its 2023 and 2024 publications. As cited above, the BIS defines tokenization in terms of the digital representation of real or financial assets, treating tokens not as inherently novel instruments – despite acknowledging that tokenization first emerged in crypto asset markets – but as claims on, or representations of, traditional assets, particularly in the context of money and regulated financial markets (BIS, 2024: 7). Arguably, this treatment of tokens has been largely inspired by the US Securities and Exchange Commission’s (SEC) decisions in response to a growth in initial coin offerings (ICOs). These ICOs – a fundraising method based on the sale of tokens marketed as investments in blockchain-based projects – initially operated outside of the regulatory perimeter (Hu, Parlour, and Rajan, 2019), but were subsequently classified as securities offerings, subjecting them to federal securities laws.
As noted above, a security is a right that inheres in an instrument (e.g., certificate), rather than a person, and that right can be traded and transferred between people as that instrument (Layr, 2021). Asset tokens, then, can be considered securities where they fit this definition, representing a claim against a third party (e.g., issuer) or claim against an underlying asset (e.g., dividend or interest payment) that can be transferred digitally via a DLT platform. Being subject to securities law, in particular, means that asset tokens cannot be considered as distributed or decentralized financial instruments since the issuers have to comply with a range of financial requirements, which we outline below when considering the construction of an asset token. For example, the issuance of asset tokens requires abiding by know-your-customer (KYC) and anti-money laundering (AML) regulations, which militates against anonymity. Moreover, in order to be transferable, asset tokens are configured by socio-technical requirements to connect the token with a RWA (e.g., custodian arrangements) and to enable the transfer of the token across DLT platforms (e.g., necessitating platform ramps and bridges). Again, these militate against the anonymity and frictionless promise of DLT presented by advocates.
When it comes to asset tokens, it is worthwhile outlining the tokenization process to highlight some of its complexities; here we draw on our interview materials (Fintech Executive #1, 2024; Fintech Executive #2, 2024; Digital Markets Infrastructure Executive #1, 2025; National Bankers Association Representative #1, 2025). The starting point is issuance, which entails a decision about whether to issue a token for a small ‘retail’ market (i.e., for defined individuals) or for the general ‘wholesale’ market (i.e., for financial institutions). The latter requires adherence to a range of financial regulations, such as KYC and AML. Issuers work with investment banks to structure a financial product that underlies the token/security, developing a prospectus or term sheet to define the terms and conditions, what the asset is, and how the asset will behave (e.g., ‘does it pay interest’). To constitute something as an asset entails an array of financial market infrastructures, including: identification standards such as ISIN/CUSIP (International Securities Identification Number/Committee on Uniform Security Identification Procedures), assigned to new securities and necessary for them to be traded on secondary markets, as well as the CASP (Crypto Asset Service Provider) license for the issuer; registries and registrars to record the asset (e.g., DLT); custodians to immobilize the asset so that it cannot be traded once it is tokenized; central securities depositories to enable the easy transfer of an asset; verification and identification of third parties to meet KYC, AML, and similar requirements; asset servicing entities to ensure the disbursement of asset returns (e.g., interest, dividends); oracles to collect and provide information about off-chain events and entities; and financial and token standards such as ERC to specify the token/security’s characteristics (e.g., jurisdiction of origin). On top of all that are the financial market infrastructures needed for settlement (i.e., payment for the financial instrument), which is where tokenization is supposed to come into its own in terms of benefits for financial market actors.
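To pull these steps together, the following stylized Python sketch condenses the issuance workflow our interviewees describe – structuring, identifier assignment, custody, and recording of verified investors. Every class, name, value, and the simplified KYC check is a hypothetical stand-in; real issuance involves many more parties and checks.

```python
# A stylized, illustrative sketch of asset-token issuance: structure the
# product, obtain an identifier, immobilize the underlying asset, then record
# verified investors on the ledger. All names and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AssetToken:
    isin: str        # identifier required for secondary-market trading
    terms: dict      # from the prospectus/term sheet (e.g., pays interest?)
    custodian: str   # party immobilizing the underlying real-world asset
    holders: dict = field(default_factory=dict)

def kyc_verified(name: str) -> bool:
    return True  # placeholder: a real check queries an identity/AML provider

def issue_asset_token(terms: dict, investors: dict) -> AssetToken:
    # 1. Structure the product and define its behavior (term sheet)
    # 2. Obtain an identifier from a numbering agency (hypothetical value)
    isin = "XS0000000000"
    # 3. Immobilize the underlying asset with a custodian (hypothetical name)
    token = AssetToken(isin=isin, terms=terms, custodian="CustodyBank AG")
    # 4. Record only KYC/AML-verified investors on the ledger
    token.holders = {name: stake for name, stake in investors.items()
                     if kyc_verified(name)}
    return token

bond = issue_asset_token({"pays_interest": True, "coupon": 0.05},
                         {"fund_a": 0.6, "fund_b": 0.4})
```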
Technologically, tokens and tokenization enable a range of new possibilities in finance, especially through ‘programmability’ and ‘composability’, the latter of which entails the bundling of transactions and transaction terms. Programmability enables the automation of transactions in response to certain conditions. ‘Internal’ programmability entails programmed transactions between users of the same platform, while ‘external’ programmability enables the automation of transactions between platforms – usually using application programming interfaces (APIs) or other mechanisms (BIS, 2024). Conditions can also be either ‘on-chain’ – using smart contracts within or across DLT platforms – or ‘off-chain’ – reflecting real-world events; in the latter case, there need to be architectures in place to ensure that the relevant conditions relate to the relevant transactions (Uzsoki, 2019). Composability entails the capacity to ‘combine features and functionalities in new ways’ (FSB, 2024: 7), which enables the bundling of complex arrangements and functions in a token. There is still a need, however, for third parties when it comes to certain functions, especially token custody and redemption; for example, tokens of RWAs might require the physical custody of said RWA (e.g., artwork) so that it cannot be used in other, real-world or digital, transactions (e.g., resold). Similarly, DLT platforms need to ensure internal consistency by ensuring that a token cannot be used in multiple instances in the same way; for example, as collateral in more than one instance (IMF, 2025).
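A minimal sketch of programmability with an off-chain condition follows: an oracle attests to a real-world event, and a programmed transfer executes only when both the on-chain and off-chain conditions hold. The oracle, names, and values are illustrative assumptions.

```python
# Illustrative sketch of 'programmability': a payment executes automatically
# only if an on-chain condition (sufficient balance) and an off-chain
# condition (oracle-attested delivery) both hold. All names are hypothetical.

def oracle_shipment_delivered(order_id: str) -> bool:
    """Stand-in for an oracle attesting to an off-chain, real-world event."""
    return order_id == "ORDER-42"  # placeholder attestation

def conditional_payment(ledger: dict, payer: str, payee: str,
                        amount: int, order_id: str) -> bool:
    # on-chain check: sufficient balance
    if ledger.get(payer, 0) < amount:
        return False
    # off-chain check: the oracle-reported condition
    if not oracle_shipment_delivered(order_id):
        return False
    ledger[payer] -= amount
    ledger[payee] = ledger.get(payee, 0) + amount
    return True

ledger = {"buyer": 100}
conditional_payment(ledger, "buyer", "seller", 80, "ORDER-42")  # executes
```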
Regarding financial governance, tokenization can entail the use of ‘permissioned’ and ‘permissionless’ DLT platforms. Permissioned platforms are usually approved platforms whose members are defined and verified, especially through formal KYC financial standards – for onboarding new buyers and sellers – as well as AML financial standards – for transactions. Simplistically, permissionless platforms represent the conventional view of blockchain as a distributed, decentralized, and autonomous governance mechanism, in which anyone can participate – usually anonymously (FSB, 2024). This is mostly associated with cryptocurrencies like Bitcoin and other forms of native token. Despite contentions otherwise, especially by DLT advocates, tokenization in financial markets still requires the involvement of third parties, including when it comes to token issuance, transactions, and settlement – over and above the need to conform to existing financial system rules around KYC and AML. Non-native tokens, for example, still require: custody parties to immobilize assets (Digital Markets Infrastructure Executive #1, 2025); oracles to collect and store the data used in smart contracts on DLT platforms; and ramps to enable transactions across DLT platforms (BIS, 2023a; FSB, 2024). According to the IMF (2025), then, it is better to frame tokenization as reducing intermediary costs rather than eradicating them entirely; this is especially true when considering the financial governance of tokens and the financial systems and structures required to successfully deploy tokenization within finance.
Tokenization is usually presented as creating a range of benefits for financial market operators, of which we consider a few here: namely, atomic settlement, reduced counterparty risk, market efficiency, investment opportunities, and transparency. Atomic settlement, or atomicity, refers to the simultaneous settlement of a transaction (FSB, 2024; IMF, 2025). Non-digital securities transactions often require intermediaries to finalize the transaction; these intermediaries include central securities depositories and centralized clearinghouses. Delivery versus payment (DvP) settlement processes require the deposit of the security with the intermediary (e.g., a clearing and depository service); the security is released once the payment obligation is made to the intermediary (IMF, 2025). Atomic settlement removes the need for such intermediaries, as settlement can be instantaneous and simultaneous between the transacting parties (FSB, 2024). Atomic settlement promises to reduce counterparty risk; that is, the risk that one side of the transaction reneges on the deal. Market efficiency is premised on tokenization speeding up transactions and settlement through simultaneous and instantaneous transfer; that is, the transfer of an asset, its recording on the ledger, and the atomic settlement of the transaction (FSB, 2024). This promises to reduce delays in the system, which result from the need to synchronize across multiple parties and architectures (e.g., ledgers, registrars), and ultimately to reduce the need for and number of intermediaries involved in securities transactions, of which there are currently more than ten (BIS, 2023a; IMF, 2025). Lastly, transparency relates to the recording of information on a ledger, which is not only expected to create an accurate (because immutable) and reliable record but also to integrate recording and transactional validation in a single process (BIS, 2023a; FSB, 2024).
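The logic of atomic DvP can be sketched in a few lines: the security leg and the cash leg settle together or not at all, removing the settlement window in which one counterparty could renege. The following Python sketch is an illustration under simplified assumptions; all names and structures are our own.

```python
# Illustrative sketch of atomic delivery-versus-payment (DvP): both legs of
# the trade update in one indivisible step, or neither does.

def atomic_dvp(securities: dict, cash: dict,
               seller: str, buyer: str, token_id: str, price: int) -> bool:
    # check both legs before touching either ledger entry
    if securities.get(token_id) != seller or cash.get(buyer, 0) < price:
        return False  # neither leg executes: no counterparty exposure
    # both legs settle simultaneously
    securities[token_id] = buyer
    cash[buyer] -= price
    cash[seller] = cash.get(seller, 0) + price
    return True

securities = {"BOND-001": "alice"}
cash = {"bob": 1_000}
assert atomic_dvp(securities, cash, "alice", "bob", "BOND-001", 900)
```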
Problematizing tokenization in financial markets
Despite this complexity surrounding tokenization, consultants like McKinsey continue to emphasize the beneficial features of tokenized assets as ‘24/7 availability, instant global collateral mobility, equitable access, composability through a common technology stack, and managed transparency’ (Banerjee et al., 2024). However, the benefits of tokenization are often discussed in terms of future potential, which remains subject to revision. For instance, in 2022, Boston Consulting Group forecasted that by 2030 the tokenization of global illiquid assets could represent a $16 trillion business opportunity (Kumar et al., 2022). Just three years later, McKinsey offered a more modest estimate, suggesting that tokenized market capitalization across asset classes could reach about $2 trillion by 2030, excluding cryptocurrencies and stablecoins. These ambiguities are hardly surprising given the many shifts in the scope of tokenization as well as the contradictions embedded in its governance architectures. In fact, even today it is not always clear what kinds of processes or assets people refer to when they talk about ‘tokens’.
One reason for this is that tokenization is dynamic and has evolved over time alongside decentralized finance (DeFi), a financial movement that positioned itself as an alternative to traditional financial systems and their centralized intermediaries (e.g., clearing banks or regulators). DeFi can be defined as ‘an alternative financial infrastructure that is open, permissionless, and interoperable, built on public blockchains such as Ethereum’ (Bok, 2024: 9). Here, the Ethereum community is singled out for a reason: its founders were among the first to leverage blockchain technology to create ‘additional assets’ beyond a native cryptocurrency. However, over time these ‘new’ crypto assets started to resemble native currencies only in name. Caliskan (2020) provides a detailed explanation of how decentralized cryptocurrencies were eventually complemented by a different type of crypto asset form that was largely ‘unburdened’ by the distinctive materiality of its predecessors. Owing to space limitations, we focus on key elements and events that contributed to the ambiguities around tokens.
Unlike Bitcoin, which was designed as a decentralized digital currency with limited scripting and used a recordkeeping format known as UTXO, the Ethereum community introduced a general-purpose, Turing-complete platform that enabled the creation of smart contracts (Aumayr et al., 2021). As mentioned above, smart contracts are executable code, often stored on the Ethereum blockchain, that can define their own internal logic for tracking user balances. Ultimately, they can function as sub-ledgers, allowing any tokens on Ethereum to be reused by other applications, from wallets to decentralized exchanges (Cuffe, 2018). The ERC-20 standard,Footnote 4 adopted shortly after Ethereum’s launch, provides a consistent set of rules for creating and managing such tokens. When an ERC-20 token is issued, those acquiring it can be sure that all tokens governed by the same smart contract share identical properties and value. In this way, an ERC-20 token functions like a cryptocurrency within its own type, meaning that ‘1 token is and will always be equal to any other token of the same kind’ (Ethereum Foundation, 2025). Thus, unlike systems that focused solely on native cryptocurrencies, Ethereum enabled the creation and management of custom assets, which then started to be called tokens (Lee, Malone, and Wong, 2020).
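The core of the standard can be caricatured in a few lines: a single smart contract keeps a sub-ledger of balances, and any unit is interchangeable with any other unit of the same token. The Python sketch below illustrates that logic only; it is not Solidity, and the class and names are our own stand-ins for the actual ERC-20 interface.

```python
# Illustrative Python stand-in for the ERC-20 idea: one contract, one
# sub-ledger of balances, fully fungible units. Not the actual standard.

class ERC20Like:
    def __init__(self, name: str, total_supply: int, issuer: str):
        self.name = name
        self.balances = {issuer: total_supply}  # contract-internal sub-ledger

    def balance_of(self, account: str) -> int:
        return self.balances.get(account, 0)

    def transfer(self, sender: str, recipient: str, amount: int) -> bool:
        if self.balance_of(sender) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[recipient] = self.balance_of(recipient) + amount
        return True  # 1 token remains equal to any other token of the same kind

demo_token = ERC20Like("DemoToken", 1_000_000, issuer="deployer")
demo_token.transfer("deployer", "alice", 500)
```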
The transfer of such custom assets is facilitated by cryptocurrency wallets, which hold the private keys that allow users to authorize transactions on the blockchain ledger. This authorization process is similar to placing a signature on a check: it enables the transaction but, in itself, does not constitute a digital object or a form of account balance. What takes place is essentially a quasi-verification of identity established on the basis of control of a private key. The birth of this new type of crypto asset introduced the ‘tokens vs. accounts’ dichotomy, marking a fundamental and unprecedented material shift in blockchain architecture (Lee et al., 2020; European Commission, 2024). Lee et al. (2020) explain that in the Bitcoin network, ownership of native coins is determined by possession, using the UTXO model. In this UTXO system, control over a fixed supply of digital ‘money’ is tied to private keys that authorize spending specific ‘unspent’ outputs. By contrast, Ethereum and later blockchain ledgers use a system that is conceptually similar to user accounts in traditional finance, in which ownership is established through verified identities; this is why transactions require authorization from the account holder, whose Ether balanceFootnote 5 is publicly associated with their address (Garratt et al., 2020). This is the central logic of account-based payment systems such as traditional bank deposits.
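The two recordkeeping models can be contrasted schematically as follows. Both sketches are toy illustrations under strong simplifying assumptions (no signatures, scripts, or consensus); the names and the stand-in ownership checks are ours.

```python
# Toy contrast between the two models discussed above. Illustrative only.

# UTXO model (Bitcoin-style): ownership = possession of keys that can unlock
# specific unspent outputs; spending consumes old outputs and creates new ones.
utxos = {("tx1", 0): {"amount": 5, "locked_to": "alice_pubkey"}}

def spend_utxo(outpoint, key, new_owner):
    utxo = utxos.pop(outpoint)               # the old output is consumed
    assert utxo["locked_to"] == key          # stand-in for a signature check
    utxos[("tx2", 0)] = {"amount": utxo["amount"], "locked_to": new_owner}

# Account model (Ethereum-style): ownership = a balance attached to an
# identified address, debited and credited like a bank account.
accounts = {"0xAlice": 5, "0xBob": 0}

def transfer(sender, recipient, amount, authorized: bool):
    assert authorized and accounts[sender] >= amount  # holder authorization
    accounts[sender] -= amount
    accounts[recipient] += amount

spend_utxo(("tx1", 0), "alice_pubkey", "bob_pubkey")
transfer("0xAlice", "0xBob", 5, authorized=True)
```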
The introduction of this hybrid approach, which leveraged the strengths of both models, sparked a boom in the issuance of standardized tokens in the mid- to late-2010s. Such tokens could represent a wide variety of rights, utilities, or assets, a scope limited only by the issuer’s design (Hu et al., 2019). Over time, these developments fueled the rapid growth of cryptocurrency exchanges: such platforms offered additional liquidity and anytime-exit opportunities, expanding the pool of investors and driving up exchange values, with crypto market capitalization reaching $3 trillion by 2021 (Bloomberg, 2021). Eventually, this boom came to an abrupt halt following a series of high-profile events and scandals (Campbell-Verduyn and Hütten, 2019), including major protocol failures that exposed flaws in DeFi, but also the intervention of centralized regulators in the crypto markets (Makridis et al., 2023). The interest and intervention of regulators draws a line separating tokenization as we know it today from the earlier, decentralized and experimental DeFi, marking one limit to blockchain’s democratizing promise, which was premised on being outside the trusted but centralized architecture of traditional finance.
Arguably, these dynamics in the emergence of tokenization started with the rise to prominence of a novel fundraising mechanism known as the ICO. ICOs offered the possibility of investing in custom assets that represented the right to purchase a product or service, once it was on the market, in exchange for some form of currency, often Ether or Bitcoin or even a fiat currency (Oren, 2018; Momtaz, 2019). The ICO frenzy began around 2015, as projects like Ethereum successfully launched their fundraising campaigns, marking a shift away from the earlier model of value distribution (i.e., Bitcoin mining). To reiterate, despite the reference to ‘coin’ in the term, most ICOs did not involve the sale of standalone cryptocurrencies functioning on their own blockchains, but rather the presale of custom assets, or tokens. Since most tokens issued during ICOs did not confer ownership rights but instead provided access to a project’s future products or services, many investors were motivated by the belief that ICOs offered an opportunity to profit by selling tokens at a higher price – either shortly after the ICO through listings on crypto exchanges, which provided unprecedented liquidity, or over the longer term, as token values rose in line with the project’s adoption and success (Momtaz, 2019).
One of the earliest and most infamous ICOs was launched to raise capital for an experimental decentralized autonomous organization (DAO). ‘The DAO’, as it was called, aimed to address principal-agent problems, perceived as inherent in traditional corporate governance, by using an interconnected system of smart contracts to automate decision-making and funding processes (Cuffe, 2018). Participation required a native coin – Ether, in this case – since engaging in transactions on Ethereum requires Ether. In exchange for the Ether it received, The DAO’s code created DAO tokens that were assigned to the account of the person who sent the Ether (Jentzsch, 2016). This operation granted the holder voting and ownership rights, which could be freely transferred on the Ethereum blockchain once ‘the creation phase has ended’ (Jentzsch, 2016). In other words, The DAO’s code could receive, hold, and transmit Ether based on proposals approved by token holders, who held their DAO tokens in wallets that enabled self-custody through private keys and used them to vote on funding decisions – all of which were transparently recorded on the Ethereum blockchain. According to an SEC report, in only one month The DAO offered and sold approximately 1.15 billion DAO tokens in exchange for approximately 12 million Ether (‘ETH’), which represented a value of approximately $150 million (SEC, 2017).
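Schematically, The DAO’s funding mechanic can be sketched as follows. The exchange rate, names, and voting logic below are illustrative stand-ins of our own; the actual contract system was far richer.

```python
# A toy sketch of The DAO's mechanic: send Ether during the creation phase,
# receive DAO tokens, then use token-weighted votes on funding proposals.
# All rates and names are hypothetical.

class ToyDAO:
    TOKENS_PER_ETH = 100  # hypothetical creation-phase exchange rate

    def __init__(self):
        self.treasury_eth = 0.0
        self.token_balances = {}
        self.votes = {}  # proposal -> tokens voted in favor

    def contribute(self, sender: str, eth: float):
        """Creation phase: Ether in, DAO tokens assigned to the sender."""
        self.treasury_eth += eth
        self.token_balances[sender] = (self.token_balances.get(sender, 0)
                                       + eth * self.TOKENS_PER_ETH)

    def vote(self, holder: str, proposal: str):
        """Token-weighted voting on funding decisions."""
        weight = self.token_balances.get(holder, 0)
        self.votes[proposal] = self.votes.get(proposal, 0) + weight

dao = ToyDAO()
dao.contribute("0xInvestor", eth=10)
dao.vote("0xInvestor", "fund-project-x")
```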
After these DAO tokens were sold, a hacker exploited a flaw in The DAO’s code to drain Ether from the organization – an event that had far-reaching consequences for the crypto ecosystem (Stabile, Prior, and Hinkes, 2020). After investigating the case, the SEC determined that DAO tokens were ‘investment contracts’, classifying them as securities. Concluding that the sophisticated automation of decision-making and the virtual nature of the organization did not place its conduct beyond the reach of US securities laws, the SEC introduced securities regulators to the crypto world (SEC, 2017). Since all securities offered or sold in the USA must be registered with the SEC – or qualify for an exemption – and any entity acting as an exchange must register or operate under an exemption, this and subsequent regulatory actions not only curtailed the explosive growth of ICOs but also challenged the widely promoted narrative among blockchain enthusiasts that seamless 24/7 trading and enhanced liquidity are inherent to all tokenized assets (Stabile et al., 2020). Ultimately, these developments made it clear that such benefits are neither universal nor guaranteed, fueling efforts to align DeFi with traditional financial systems. However, these latter efforts did little more than conceal underlying contradictions, often suffocating ventures that did not or could not comply – all in the name of consumer protection (Guseva, 2022).
Contradictions in techno-financial democratization
Much of the discussion by traditional financial actors (e.g., regulators) about the limits of DeFi’s overall economic benefits centers on the interrelated contradictions in the governance mechanisms of tokenization, the lack of KYC/AML compliance, and the complexities of account management. These necessarily impact the promise of democratization touted by DeFi and tokenization proponents. A Staff Analytical Note by the Bank of Canada outlines how these issues relate to one another:
the specialized knowledge required to manage private keys and interact with the blockchain makes it difficult for retail users to participate in the DeFi system directly. As a result, centralized, non-transparent and unregulated intermediaries called centralized finance (CeFi) have emerged. (Chiu and Yu, 2023)
Furthermore, while DAO organizational structures,Footnote 6 in part at least, have mitigated principal-agent problems – perceived as common in traditional corporations – they cannot avoid another fundamental trade-off with which traditional firms must contend: conflicts of interest between large token holders, or ‘whales’, and small token holders, given that making critical decisions or developing protocols still relies on a few quasi-centralized bodies (Han, Lee, and Li, 2025). Indeed, in its early stages, tokenization was undeniably closely linked to an ecosystem largely composed of DAO founders and developers, enabling peer-to-peer transactions governed by decentralized protocols (Alawadi et al., 2024). While these systems did not entirely eliminate the need for centralized parties like custodians and underwriters to maintain relations to the assets, they managed at least to engage a wider range of market participants, intermediaries, and end users ‘in a more decentralized manner than traditional finance’ (Schwiderowski et al., 2023). Still, as one interviewee put it, the failure of DAOs to ‘create the whole ecosystem of links between the investor and the asset’ is proof that the replacement of traditional custodians enabling ‘trust towards the whole ecosystem’ is unrealistic (National Banking Association Advisor #1, 2025). According to this interviewee, this failure hardly means that ‘the whole idea, the will, and the wish to have peer-to-peer transactions has faded away – not at all’. The ongoing domestication of DeFi by traditional financial institutions may thus entail a series of inherent contradictions that cast doubt on the realization of its much-touted promise of democratization.
Indeed, the mechanisms underpinning the circulation of tokens represent an intriguing opportunity for fixing the settlement of traditional securities, which typically involves two major stages: the exchange of securities and the exchange of payments. As an interviewee points out, the payments stage is a major ‘pain’ point, because in order to settle payments at least several days have to pass between ‘the actual booking of the transaction and its actual settlement’ (National Banking Association Advisor #1, 2025). This interviewee explains that the European Central Bank (ECB), along with several other European central banks, developed blockchain-based platforms to speed up settlement; the ECB sought to simulate and trial a ‘wholesale’ central bank digital currency (CBDC) – a token representing the euro to be used by banks and other financial institutions – before opening those kinds of trials to all licensed financial institutions, which at least hold authorization to conduct operations such as the settlement of securities.
Clearly, this sort of trial excludes unlicensed private financial institutions since, as of 30 December 2024, the custody and administration of crypto assets qualify as a regulated service under MiCAR, requiring a CASP license (Swertvaeger and Robert, 2024). Moreover, the use of CBDCs in the settlement of securities also effectively contradicts the core principle of decentralization that underlies native tokens on public blockchains. Although CBDCs may resemble native tokens in form, they are meant to be fully controlled by the ECB, effectively reappropriating Ethereum’s accounts-versus-tokens architecture in a reversed, centralized fashion. In light of the ongoing competition between national currencies and private cryptocurrencies or stablecoins, such as Tether (USDT) and USD Coin (USDC) (Beja and Correia Barradas, 2024), this exercise in control by the ECB represents not only a move to domesticate blockchain architecture but also a strategy for reasserting monetary regulatory authority in the crypto world via the reintroduction of identity-linked, KYC/AML-cleared, account-based ownership structures in a system originally designed for pseudonymous, peer-to-peer value transfer. Consumer protection seems to be the primary consideration here, but CBDCs also entail the possibility of using DLT systems and architectures as a means to eliminate the anonymity traditionally afforded by cash, since every CBDC token carries a traceable transaction history. This traceability could bring atomic settlement to a new level of efficiency and allow for real-time DvP in securities markets – albeit under centralized control. The EU mandates free Digital Identity Wallets for all citizens, and public and private services must accept the EU Digital Identity Wallet for authentication from 2026 (European Commission, 2025a).
The question remains how these initiatives will impact the technological solutions already on the market, developed by less conventional actors operating outside centrally regulated exchanges. For instance, several players have built compliant applications that support various phases of the securitization process, including the acquisition and custody of securities. Many of these solutions use Ethereum-based smart contracts and standards to reduce compliance costs by streamlining investor whitelisting, enabling issuers and service providers to efficiently verify investors and potentially lower barriers to broader market participation. The ERC-3643 standard is one example, described as ‘a token protocol for conventional assets via permissioned tokens’, including RWAs (Technology Working Group, 2024: 27).
By embedding compliance rules directly into the token, the standard ensures that only eligible participants can buy and transfer such tokens and, correspondingly, the RWAs they represent, such as tokenized securities. The protocol comprises a suite of smart contracts, which is ‘compatible with any other tokens… it’s basically an extension of ERC-20, and thanks to this, it’s compatible with all the DeFi protocols on the wallets, et cetera, except that the tokens are permissioned’ (Fintech Executive #3, 2025). Issuers can use it to deploy and configure all the necessary smart contracts enabling verification of identity (e.g., KYC/AML) by integrating with a compliant blockchain-based identity management system. Here, governance architectures are integrated into the entire lifecycle of security tokens, including issuance – which can be automated into a single transaction, saving time and reducing gas feesFootnote 7 – transfer between eligible investors, enforcement of compliance rules, and even token recovery in cases of lost access.
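The permissioned-token idea can be sketched minimally: transfers look like ordinary fungible-token transfers but are gated by an identity registry, so only verified investors can hold the token. The Python sketch below illustrates this logic only; the registry and recovery functions are our own simplified assumptions, not the actual ERC-3643 contracts.

```python
# Illustrative sketch of a permissioned token in the spirit of ERC-3643:
# ERC-20-style transfers gated by an on-chain identity registry, plus
# issuer-triggered recovery. Not the actual protocol.

class PermissionedToken:
    def __init__(self, total_supply: int, issuer: str, identity_registry: set):
        self.balances = {issuer: total_supply}
        self.identity_registry = identity_registry  # KYC/AML-cleared addresses

    def transfer(self, sender: str, recipient: str, amount: int) -> bool:
        # compliance rule embedded in the token itself: both parties eligible
        if (sender not in self.identity_registry
                or recipient not in self.identity_registry):
            return False
        if self.balances.get(sender, 0) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
        return True

    def recover(self, lost: str, replacement: str):
        """Issuer-triggered recovery when an investor loses key access."""
        self.identity_registry.add(replacement)
        self.balances[replacement] = self.balances.pop(lost, 0)

security = PermissionedToken(1_000, "issuer",
                             identity_registry={"issuer", "verified_fund"})
security.transfer("issuer", "verified_fund", 100)   # succeeds: whitelisted
security.transfer("issuer", "anonymous_wallet", 1)  # fails: not eligible
```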
For ERC-3643 to become a recognized market standard, its backers had to open-source their smart contracts, submit them to the Ethereum community, and follow the process known as the Ethereum Improvement Proposal (EIP) (Becze and Jameson, 2015). As explained by Fintech Executive #3, the process follows multiple stages that lead to acceptance by ‘people from the Ethereum Foundation’, a nonprofit supporting the Ethereum ecosystem, without any involvement of traditional regulators: ‘at some point, we realized that, wait a minute, there’s only eight accounts who can validate. So, it didn’t sound extremely decentralized’. The ERC-3643 standard has also entailed the establishment of a nonprofit association ‘to guarantee the future development of this protocol’. Today, the association is composed of ‘a bit more than 100 partners, large law firms, financial institutions, web3 players, and several tokenization platforms’ whose collective work is essential to facilitate the acceptance of native tokenized securities by legacy infrastructure such as T2S,Footnote 8 central securities depositories (CSDs),Footnote 9 and custodians yet to come (Tokeny, The LHoFT, and Luxembourg Blockchain Lab, 2023). However, infrastructure components such as CSDs remain ‘basically the place where financial instruments, and we call them securities are born, so to say, and deposited, yes, basically held in custody and deposited’ (Digital Market Infrastructure #1, 2025). This interviewee goes on to say, ‘on the back of that, you create bits and bytes, tokens, basically. So that can be then, that gets assigned a so-called ISIN, so an identification number’. All of this enables bankers and the financial sector to treat these tokens as any other financial asset.
Consequently, and with regulations such as MiCAR coming into force, the power of these token governance models, as well as their potential to democratize retail investment, appears increasingly uncertain: ‘if you want… retail client, you want to have the retail client to participate in decisions, you want to work with community and you want to give them access to secondary market, then you have tokenized project’ (Alternative Asset Executive #1, 2024). Many jurisdictions still restrict the custody and administration of tokenized securities to licensed financial institutions. This includes the right to maintain official securities registers, manage ownership records, and hold assets on behalf of investors – functions that are often off-limits to technology providers or non-bank entities. MiCAR has not changed these rules; if anything, it has made compliance more challenging by placing tokenized securities on an equal footing with traditional securities: ‘it’s still the bank that can make the register. The problem is that we see that the banks don’t accept the money from our token holders (…) the traditional sector is one of the blocking points for the tokenized project to get through’ (Alternative Asset Executive #1, 2024).
More importantly, the combined regulatory demands of MiFID II and MiCAR have made it increasingly difficult to extend tokenization to retail markets within the EU. While issuance is feasible, enabling secondary market trading – crucial to the promise of democratized finance – triggers MiCAR obligations related to crypto custody, trade execution, and platform operation. Indeed, since settlement often involves crypto assets like ETH or stablecoins, and participants trading ERC-based tokens must pay gas fees to interact with the blockchain, these platforms fall under crypto-asset service provider (CASP) licensing requirements. The resulting compliance and infrastructure costs may be manageable for institutional actors but are largely prohibitive for retail-focused platforms promising investment opportunities starting as low as $100. As Fintech Executive #3 put it:
the idea is a bit about democratization of access, sometimes, but it’s not because you tokenize that you can suddenly reach anyone you want. There are securities laws. In my opinion, they are stupid. If you are not rich enough, you cannot invest (…). So sometimes they call it accredited investors or qualified investors or eligible or whatever. But usually, you have a minimum and it’s not because you tokenize that you can bypass the securities regulations. It would still be complex to target retail investors. It works only with large funds. It’s more costly, basically, to set up.
In practice, this pushes tokenized securities into the domain of private markets, sidelining the original DeFi vision of open, low-barrier access to financial products: ‘you have retail, it’s very complex anyway, not really worth it, but everything in between, family offices, high-net-worth individuals, investment funds, they don’t know [how] to invest and they don’t want to be blocked for 10 years in a PE [Private equity] fund’ (Fintech Executive #3, 2025). Another interviewee echoes this perspective: while they acknowledge that tokenization is ‘a bit about democratization of access, sometimes’, they suggest it is also, in fact, ‘about redefining the borders of what is capital markets, what circulates in the financial system, and what makes an asset bankable’ (Digital Market Infrastructure #1, 2025).
Despite the promise of financial democratization, the power balance within the techno-financial system has not substantially shifted. Tokenization has increasingly outgrown its DeFi origins and entered the mainstream financial system. This means that it is no longer governed primarily by a marginal crypto-libertarian community, as was the case in the early days of cryptocurrencies (Caliskan, Reference Caliskan2023). Today, the governance architectures of tokenization are increasingly shaped by traditional institutions and powerful regulators, including the USA’s SEC, the EU, the ISIN system, and the American Bankers Association (Stabile et al., Reference Stabile, Prior and Hinkes2020). Moreover, these more conventional market architectures are now increasingly taken into account by decentralized blockchain communities, including token issuers (see Tian et al., Reference Tian, Lu, Adriaens, Minchin, Caithness and Woo2020), as they develop and adapt technical standards – such as ERC-1400 and ERC-3643 – designed to ‘bridge’ both domains (Fracassi, Khoja, and Schär, Reference Fracassi, Khoja and Schär2024). This tilt in the balance of power is further underscored by the growing involvement of global bodies in tokenization governance, such as the G20, which recently tasked the BIS and the Committee on Payments and Market Infrastructures (CPMI) with investigating the implications of tokenization for money and markets (see BIS, 2024). Citing a surge in regulated tokenization projects, this investigation concluded that tokenization is becoming increasingly relevant for central banks. The notion of tokenized assets as an ‘alternative’, ‘decentralized’, and ‘democratic’ option appears increasingly ambiguous, especially as tokenization is aligned with the mechanisms and logic of traditional finance, thereby opening the door to a new community of actors.
Conclusion
Blockchain technology and its two best-known creations – cryptocurrencies and tokens – were designed to address perceived weaknesses in traditional finance, particularly fears about the dominance of centralized intermediaries such as clearing banks, custodians, and governments. Today, blockchain has become a cornerstone of a broader financial movement known as DeFi. The term DeFi is often used interchangeably with various Ethereum blockchain-based applications, but broadly, we have understood it to mean a decentralized ecosystem in which financial services are provided by multiple market participants rather than a single authority (Schwiderowski et al., Reference Schwiderowski, Pedersen, Jensen and Beck2023). Many have framed DeFi as a means to democratize finance and promote financial inclusion on a global scale (Vasishta et al., Reference Vasishta, Dhiman, Smith and Singla2025).
DeFi is underpinned by tokenization as the technological means to democratize finance. As a process, tokenization arose in the context of rising private capital markets, which grew significantly following the 2008 Global Financial Crisis and the policy response of quantitative easing. Tokenization offers the possibility of transforming a range of illiquid assets – and other things – into customizable asset forms (Birch and Muniesa, Reference Birch and Muniesa2020), promising to increase liquidity and the ease of transaction and to bypass supposedly costly intermediaries in the financial system. For example, traditional finance requires a range of intermediaries such as depository registries, centralized clearinghouses, custodians, and so on; initially, tokenization promised to bypass this financial architecture via the installation of a simplified techno-financial architecture based on DLT. Tokenization, though, has faced challenges to its legitimacy as a governance architecture, especially from central regulatory agencies and governments, which has triggered yet another round of reintermediation.
In seeking to answer our research question in this article, we have been concerned with understanding the implications of tokenization and its promise of techno-financial democratization for financial markets. Due to the heterogeneous qualities and functions of tokens, which can create a wide range of custom assets, we focused our attention on the implications of asset tokens for financial markets. Asset tokens represent non-native rights or assets that can cross DLT platforms and digitally represent real-world things. As such, they are increasingly being framed and treated like financial securities; consequently, their unique characteristics are being reduced to the idea that they are ‘digital representations of traditional assets’ that are simply generated and recorded on a technological platform (e.g., BIS, 2023a; BIS, 2024; FSB, 2024). In our view, this illustrates a convergence between tokenization and securitization over time (Bianchi, Reference Bianchi2024), a process that has, on the one hand, aligned the regulatory requirements of securities (e.g., KYC, AML, etc.) with tokenization and, on the other hand, further tamed tokenization’s potential to draw on the distinct governance mechanisms of value creation that underpin native cryptocurrencies. The effect of this is to undermine the foundation of techno-financial democratization on which DLT-based tokenization was initially premised, while enabling tokenization to grow as a process for the techno-financial governance of an increasing range of previously illiquid resources (as financial assets).
The convergence between tokenization and securitization entailed a complex and delicate effort from a broad ecosystem of new and established financial actors. One of their central challenges was to reconcile fundamentally distinct – and often contradictory – governance mechanisms: blockchain-based decentralization on the one hand and centralized regulatory frameworks and market logics on the other. This reconciliation is proceeding through the active engineering of the architecture (i.e., the configuration) of financial market infrastructures, technical standards, and regulation (Birch et al., Reference Birch, Komljenovic and Sellar2025). Consequently, much of the issuance of tokenized securities today emphasizes their compliant design, meaning that these tokens are structured from the outset to adhere to the adjusted but still largely traditional legal and regulatory financial market frameworks. This involves embedding legal requirements directly into the token’s characteristics, such as implementing restrictions on who can hold or transfer the token as a security. Adherence to securities laws necessitates measures like preparing a prospectus and utilizing specialized token standards – such as ERC-1400 or ERC-3643 – to embed features like identity verification, transfer controls, and role-based permissions directly into the token’s smart contract. Additionally, establishing compliant custody solutions is essential to safeguard these digital assets.
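As a rough illustration of what ‘role-based permissions’ and token recovery can mean in practice, the sketch below extends the earlier example. Again, the names (RoleBasedToken, addAgent, recover) are illustrative inventions under our own assumptions, not the actual ERC-1400 or ERC-3643 interfaces.

```typescript
// Sketch of role-based permissions in a compliant token (illustrative only;
// invented names, not the real ERC-1400/ERC-3643 interfaces). The issuer
// appoints agents, and only agents may perform privileged actions such as
// recovering tokens from a wallet whose keys have been lost.

type Address = string;

class RoleBasedToken {
  private balances = new Map<Address, number>();
  private agents = new Set<Address>();

  constructor(private readonly issuer: Address) {}

  // Only the issuer may grant the agent role.
  addAgent(caller: Address, agent: Address): void {
    if (caller !== this.issuer) throw new Error('Only the issuer may appoint agents');
    this.agents.add(agent);
  }

  // Helper used here only to seed balances for demonstration.
  credit(to: Address, amount: number): void {
    this.balances.set(to, (this.balances.get(to) ?? 0) + amount);
  }

  // Privileged recovery: an agent moves the full balance from a lost wallet
  // to a replacement wallet without the lost wallet's signature.
  recover(caller: Address, lostWallet: Address, newWallet: Address): void {
    if (!this.agents.has(caller)) throw new Error('Only agents may recover tokens');
    const balance = this.balances.get(lostWallet) ?? 0;
    this.balances.delete(lostWallet);
    this.balances.set(newWallet, (this.balances.get(newWallet) ?? 0) + balance);
  }

  balanceOf(holder: Address): number {
    return this.balances.get(holder) ?? 0;
  }
}

// Usage: the issuer appoints an agent, who can then execute a recovery.
const token = new RoleBasedToken('0xIssuer');
token.addAgent('0xIssuer', '0xAgent');
token.credit('0xLostWallet', 250);
token.recover('0xAgent', '0xLostWallet', '0xNewWallet');
console.log(token.balanceOf('0xNewWallet')); // 250
```

Such privileged functions, which would be anathema in a permissionless DeFi protocol, are precisely what regulated custody requires: centralized control is re-engineered into the token itself.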
Yet this convergence also represents a significant departure from the ICO era, during which many crypto assets were configured to avoid applicable legal frameworks, often operating in regulatory gray zones. Indeed, as tokenization is gradually embedded in compliance standards and technologically agnostic laws, contemporary tokenized assets have little in common with native cryptocurrencies and their underlying architectures, even if the latter might still serve as foundational layers. This reconfiguration of assetization architectures not only pushes issuers seeking to comply towards private capital markets – a move that limits capital mobility, transparency, and potential revenues – but also leaves earlier crypto asset investors in limbo or pushes them further into zones outside regulatory oversight. This undermines the protective layer of securitization (Guseva, Reference Guseva2022), representing a new dynamic that may well contribute to the downward revision of projected growth in tokenized market capitalization.
These growing contradictions, however, are camouflaged by incumbents’ mounting enthusiasm for the democratizing potential of tokenized financial instruments, an enthusiasm that is instrumental in sidelining the widening gap between the materiality of earlier crypto assets and that of their younger offspring. It remains to be seen to what extent the rapid adoption of tokenization by incumbents, and the growing dominance of new and still emerging techno-financial architectures, will leave room for individuals to manage, produce, and transfer value from their RWAs without intermediaries and centralized supervision.
Acknowledgments
We wish to thank the reviewers and editors for their suggestions. Funding for the research on which this article is based comes from the Social Sciences and Humanities Research Council of Canada (Ref. 435-2023-0704) and the Government of Ontario through the Ontario Research Chair in Science Policy.