Privately managed assets have become the default choice for tokenisation enthusiasts. The reasoning behind their choice is hard to fault. The advertised benefits of tokenisation apply a fortiori to the asset class. Privately managed assets – especially hedge funds and real estate, but also equity issuers that might previously have used the private placement or crowd-funding markets to raise capital – account for a disproportionate share of token issues so far. But are the enthusiasts and the pioneering issuers missing the point of tokenisation?
The latest McKinsey research study puts the value of privately managed assets at US$9.8 trillion. Though a trivial sum by comparison with the public bond (US$127 trillion), equity (US$124 trillion) and fund (US$64 trillion) markets, it is large enough to have attracted primary and secondary token market aspirants in all parts of the world and a string of supportive technology vendors.
An informal review by Future of Finance of token market data sources and of eight active token platforms identified fewer than 100 token issues, almost all of which can be classified as tokenisations of privately managed equity, debt, funds, real estate or alternative assets. Which suggests that tokenisation is establishing itself as an alternative to private placements and crowd-fundings.
If so, it could grow fast, for the McKinsey study under-estimates the size of the opportunity. It excludes hedge funds, which are currently managing at least another US$4 trillion, and picks up less than half of one per cent of a global real estate market worth 30 times the total assets under management it does capture. The commodity, fine wine and fine art markets do not feature either.
Intriguingly, privately managed assets are now being pursued by established intermediaries as well as tokenising start-ups. A recent survey of 148 banks, stock exchanges and central securities depositories (CSDs), published by the International Securities Services Association (ISSA), put private debt and equity behind only bonds and cryptocurrencies in terms of live deployments.
The survey also found that, in terms of benefits captured, private equity and debt were ahead of bonds. Which helps explain why the Depository Trust & Clearing Corporation (DTCC) is launching a Digital Securities Management (DSM) platform to provide issuance, distribution, trading and settlement services for privately managed assets in tokenised as well as traditional form.
At least one token exchange is already working with the DTCC on the grounds that investors will find the involvement of the American CSD – which has long provided comparable services to public securities – reassuring. However, the presence of the DTCC suggests the tokenisation infrastructure that has evolved so far is not solid enough to attract institutional money.
To date, sales campaigns have focused on the benefits of tokenisation. The list has become wearily familiar: financialisation (value now for future flows); democratisation (access to institution-only asset classes); better price discovery and transparency (elimination of rent-seeking intermediaries); liquidity (for previously illiquid assets); automated compliance; and lower transaction and asset-servicing costs.
Markets and especially market data (about values) are currently too limited to assess whether these benefits are being realised. Which means issuers and investors lack the powerful return-on-investment arguments they need to persuade their superiors to take the risk of migrating away from the status quo and supporting their efforts with big budgets and talented personnel.
In the absence of issuers and investors, the infrastructure needed to support the tokenisation of privately managed assets is developing haphazardly. There are more issuance and trading platforms than business to share, offering different combinations of technology and primary and secondary market and post-trade services. Token platforms that specialise in an asset class are proliferating.
Token “standards” are proliferating too, giving new life to the old joke that the only problem with standards is that there are too many to choose from. This hampers interoperability between assets issued on different blockchain protocols, exacerbating the complexities created by the variation in the regulatory treatment of tokens and tokenised assets in different jurisdictions.
The same technical obstacles are raised with deadening regularity. Blockchain technologies are slow and unscalable and cannot interact with legacy systems and financial market infrastructures. Smart contracts are vulnerable to hackers. Laws need to be changed. Regulation needs not only to catch up but to avoid unforced errors, like insisting intermediaries put crypto-assets on the balance sheet.
All this noise is obscuring the prize, which is not an upgrade or an enhancement of the existing system or even a migration away from it. Tokenisation is an entirely new way of raising and investing capital, in which the decisive innovation will be not the tokenisation of privately managed assets already in existence, but the development of an entirely “native” alternative.
The question is who can make this happen. Legislators and regulators can help by re-writing laws and regulations, and central banks by putting fiat currency on-chain. But the institutions which can do the most are the investment banks (to structure issues on behalf of issuers) and the custodian banks (to reassure investors their assets are safe). Where are they and what are they doing?
McKinsey & Company, Private Markets Rally to New Highs, McKinsey Global Private Markets Review 2022, page 8.
2022 SIFMA Capital Markets Fact Book, page 10.
Investment Company Institute, Worldwide Regulated Open-End Fund Assets and Flows, Second Quarter 2022.
ISSA with the Value Exchange, DLT in the Real World, 2022.
If tokenisation of privately managed assets takes off, issuers – including asset managers – will realise value now for flows in the distant future. They will also cut the cost of raising and servicing capital. Retail investors will gain access to investments previously reserved for institutional investors and cede less value to intermediaries. Service providers that adapt and innovate (and especially those that can plug the data gap) will prosper. Of course, none of this is certain to happen. But every individual working in private equity or debt, real estate, hedge funds, infrastructure, auction houses or commodity markets needs to understand what will happen to them if it does.
Registration for the event is not open yet.
What topics will be discussed?
- Have privately managed assets emerged as a tokenisation opportunity for real reasons or faute de mieux?
- How big is the tokenisation opportunity in privately managed assets?
- Which privately managed asset classes are likely to tokenise first?
- Who are the issuers and investors – and where are they?
- Are the promised benefits of tokenisation being realised by the pioneers?
- Does the RoI data exist to support a business case for tokenisation?
- Is the shortage of accurate, timely and independent information about asset values being solved?
- Is the token infrastructure too fragmented (and conflicted) to encourage issuance and investment?
- Does the market need specialist or generalist token issuance platforms?
- Is public-versus-private still an issue in privately managed asset tokenisation?
- How many functions – primary market, secondary market, custody – can a token platform safely perform?
- To what extent are differences in regulations at the national level hampering the growth of tokenisation?
- Are market participants wasting energy by tokenising existing assets rather than issuing “native” tokens?
- Why is the DTCC building a market infrastructure for privately managed assets and what services will it supply?
- Are regulators behaving counter-productively?
- Why are the investment and custodian banks approaching the opportunity so cautiously?
- Are tokenisers inside conventional financial institutions getting the senior management support, budgets and personnel they need to make tokenisation happen?