Pete Harris, Principal, Lighthouse Partners, Inc.

Convergence in Action in 2025 – TradFi and DeFi, Tokenization Everywhere, and Data-First Architectures for Trustworthy and Decentralized AI

Updated: Jan 3


Convergence in Action in 2025

Welcome to 2025, which promises to build on the past 12 months and further accelerate the development and deployment of several transformational technologies that Lighthouse is actively engaged with and guiding customers on.

 

With that in mind, I’ve noted below a few important trends for the coming year that can be characterized as convergence in action. They relate to how the TradFi space is embracing DeFi, how tokenization is providing new transaction models for markets, and the imperative to address data governance and provenance so that emerging artificial intelligence (AI) Agents are trusted and operate responsibly.


TradFi Meets DeFi

 

With the growing adoption of both asset tokenization and stablecoins by big (regulated) financial markets players, and a more amenable regulatory framework expected, an accelerated convergence of TradFi and DeFi is a given. TradFi (the Traditional Finance markets, running on slow-ish and costly legacy infrastructure) will continue its redux using DeFi (Decentralized Finance, which taps into blockchain and crypto technologies to support tokenized assets).

 

This does not mean that Wall Street firms are going to embrace the ideals of crypto maximalists, such as leveraging decentralized exchanges and dropping KYC and AML compliance checks. But just as the financial markets establishment has adopted public internet and cloud services for many applications, so the flexibility and cost savings of public blockchains will be a draw to the TradFi world as it looks to tokenize portfolios and funds to improve transaction efficiency.

 

Further adoption of emerging blockchain-based approaches for identity and compliance, such as the popular ERC-3643 standard, is likely. Championed by Luxembourg-based Tokeny, ERC-3643 has spawned a formal association of more than 100 participants, which has run a very effective advocacy, technology architecture, and education program to date.
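To make the idea concrete, here is a minimal Python sketch of the permissioned-token pattern that ERC-3643 standardizes: transfers only clear when both counterparties appear in an identity registry that reflects off-chain KYC/AML checks. (The real standard is a suite of Solidity smart contracts; the class and method names below are simplified stand-ins for illustration, not the actual interface.)

```python
class IdentityRegistry:
    """Maps wallet addresses to verified on-chain identities."""
    def __init__(self):
        self._verified = set()

    def register(self, address: str) -> None:
        # In ERC-3643, registration follows off-chain KYC/AML checks.
        self._verified.add(address)

    def is_verified(self, address: str) -> bool:
        return address in self._verified


class PermissionedToken:
    """Token whose transfers succeed only between verified identities."""
    def __init__(self, registry: IdentityRegistry):
        self.registry = registry
        self.balances: dict = {}

    def mint(self, to: str, amount: int) -> None:
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> bool:
        # Compliance gate: both parties must pass identity checks
        # before any balance moves.
        if not (self.registry.is_verified(sender)
                and self.registry.is_verified(receiver)):
            return False
        if self.balances.get(sender, 0) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True
```

The key design point is that compliance is enforced at the token level itself, rather than bolted on at the exchange or custodian layer.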

 

Tokenizing Everything, Everywhere


Given the buzz around tokenization, it might seem that the impetus is to tokenize anything and everything. That view is supported by comments from Larry Fink, CEO of BlackRock, the world’s largest asset manager, who believes that tokenization will underpin “the next generation for markets.” Already, tokenization is being adopted by markets for securities, funds, loans, payment collateral, real estate, art, and more. So, what’s next?

 

In 2025, expect increasing interest in tokenizing IT infrastructure – compute, storage, networking – to support the needs of Web3 applications and artificial intelligence (AI) as it taps into all available resources to power model training and inference.

 

This tokenized DePIN (Decentralized Physical Infrastructure Network) approach will enable fractional ownership and usage of infrastructure, with an audit trail for management and monetization that will greatly improve the economics of deploying new AI reasoning models and AI Agents at scale.

 

A Tokenized Data-First Architecture for AI


For AI, tokenization will offer more than cost-effective infrastructure. Because it allows ownership and usage of fractional units of an asset to be managed using blockchain technology, a time-ordered audit trail can be constructed for assets, including for data objects that are processed by data-first architectures.


Data objects for AI include data inputs and outputs of models, and even the models themselves. By tokenizing data objects, a complete record of provenance can support trustworthy AI approaches where data outputs from models can be mapped directly to data inputs, and where updates to a model’s logic can be logged.
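As a rough illustration of that idea, the sketch below (hypothetical names, plain Python rather than an on-chain implementation) builds a hash-linked, time-ordered log in which each data object is fingerprinted and each model output records the fingerprints of the inputs it was derived from:

```python
import hashlib
import json


def fingerprint(obj) -> str:
    """Content hash of a data object (input, output, or model)."""
    return hashlib.sha256(
        json.dumps(obj, sort_keys=True).encode()
    ).hexdigest()


class ProvenanceLog:
    """Hash-linked, time-ordered record of the data objects an AI
    pipeline consumes and produces."""

    def __init__(self):
        self.entries = []

    def record(self, kind: str, obj, parents=()) -> str:
        """Log a data object; `parents` holds fingerprints of the
        objects it was derived from, so outputs map back to inputs."""
        fp = fingerprint(obj)
        prev = self.entries[-1]["entry_hash"] if self.entries else "genesis"
        entry = {"kind": kind, "fingerprint": fp,
                 "parents": list(parents), "prev": prev}
        # Chaining each entry's hash to the previous one makes
        # after-the-fact tampering detectable.
        entry["entry_hash"] = fingerprint(entry)
        self.entries.append(entry)
        return fp

    def lineage(self, fp: str):
        """Return the parent fingerprints recorded for a given object."""
        for e in self.entries:
            if e["fingerprint"] == fp:
                return e["parents"]
        return None
```

In a tokenized deployment, each entry would correspond to an on-chain token event rather than an in-memory list, but the provenance-mapping logic is the same.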

 

Moreover, tokenization coupled with emerging privacy enhancing technologies (PETs), such as zero knowledge proofs and homomorphic encryption, lends itself to allowing confidential and private data to be leveraged in transactions involving multiple participants.
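One of the simplest PETs to demonstrate is additive secret sharing, a building block of secure multi-party computation: each participant splits a private value into random shares so that only the aggregate total is ever reconstructed. The toy Python sketch below illustrates the principle only; production systems rely on hardened cryptographic libraries, and zero-knowledge proofs and homomorphic encryption involve considerably more machinery.

```python
import random

Q = 2**61 - 1  # prime modulus under which shares are computed


def share(value: int, n: int = 3):
    """Split a private value into n additive shares mod Q.
    Any subset of fewer than n shares reveals nothing about the value."""
    parts = [random.randrange(Q) for _ in range(n - 1)]
    parts.append((value - sum(parts)) % Q)
    return parts


def aggregate(all_shares):
    """Each of the n parties sums the one share it received from every
    participant; combining those partial sums reveals only the total,
    never any individual's private value."""
    partials = [sum(column) % Q for column in zip(*all_shares)]
    return sum(partials) % Q
```

For example, three institutions could learn their combined exposure to an asset without any of them disclosing its own position.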

 

Toward Decentralized AI

 

This traceability of inputs, outputs, and models, coupled with confidential computing, will underpin the emergence of AI Agents – autonomous applications that can perform complex tasks – which much of the AI-obsessed IT industry is proposing as the next big leap in practical AI utility.

 

The major IT cloud platforms – such as Microsoft, Google, Amazon, and Meta – are looking to AI Agents to generate significant and sustainable revenues from AI, which have thus far eluded them. Much will depend on whether they can deliver cost-effective, useful, and competent AI Agents, but also on whether users trust the services provided by Big Tech.

 

But AI Agents are also readily implemented as open-source and Decentralized AI applications that can be delivered at scale in ways that promote provable trust. Given the increasing acceptance of decentralization approaches, the time might be near for democracy in AI to prevail. I expect that to begin in 2025!
