Pete Harris, Principal, Lighthouse Partners, Inc.

How DecentraTech Can Reduce Costs for Artificial Intelligence – Part 1: Decentralized Compute

It’s no secret that the popularity of artificial intelligence (AI) has exploded for both business and consumer applications. Along with that trend, the public cloud platforms from Amazon Web Services, Microsoft, and Google have been upgraded with the latest GPU chips to power AI applications.

While not the only enabler of success, the winners in the world of AI will likely be the companies that can amass substantial compute power to underpin their services. But for now there are a couple of issues with that endeavor.


First, demand for the most powerful GPU chips is outstripping supply. Second, those chips are pricey. For example, the most powerful H100 chips from GPU leader Nvidia sell for around $40,000, while renting time on a GPU at a cloud provider can cost about $100 an hour. No wonder then that some AI startups are raising billions of dollars in funding simply to pay for the compute infrastructure that they will need to operate.


One example is Coreweave, a specialist AI cloud provider, which recently closed on $7.5 billion in debt financing to double its datacenter capacity. Meanwhile, leading crypto VC firm Andreessen Horowitz is building GPU server clusters for companies to use. It expects to host some 20,000 GPUs as part of this initiative, known as Oxygen, which it hopes to use as a competitive tool to lure startups to its portfolio. Other companies, including Elon Musk's xAI and Meta, are rolling out AI clusters with 100,000 GPUs.

 

Clearly, not many startups (or even enterprises for that matter) have the funding to be able to throw masses of GPUs at their AI endeavors. That is where DecentraTech approaches – specifically those leveraging Decentralized Physical Infrastructure Networks (or DePINs) – might present a path forward.

 

Unlike traditional, centralized datacenters, which are typically built by single companies, such as Microsoft (which by the way expects to spend $50 billion on new datacenters for AI), DePINs leverage blockchain technology to decentralize control, ownership and the cost of building and maintaining physical infrastructure.


In a DePIN model, this infrastructure is provided by large numbers of entities, including startups, small/medium enterprises, communities and individuals, who make it available to the DePIN operator (often part of the time, or as a background task when it is underutilized for its primary use).


For compute services, this generally requires the providers to install or run a software process on their workstation or server to register and manage their participation. In return, the providers are rewarded, generally in an operator-specific cryptocurrency.
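As an illustration, here is a minimal, hypothetical sketch of what such a provider-side process might do: register a machine's spare GPU capacity with the operator, poll for jobs, and report results in order to earn token rewards. The coordinator URL, endpoints, field names and reward flow are assumptions for illustration only, not any particular operator's software.

```python
# Hypothetical sketch of a DePIN provider agent. The coordinator URL, endpoints,
# field names and reward mechanics are illustrative assumptions, not a real API.
import time
import requests

COORDINATOR = "https://depin-operator.example/api"  # hypothetical operator endpoint
PROVIDER_KEY = "prov_abc123"                        # hypothetical provider credential
AUTH = {"Authorization": f"Bearer {PROVIDER_KEY}"}


def run_workload(job: dict) -> str:
    """Placeholder for the actual GPU work (e.g. running a customer's training
    or inference container in a sandbox); returns where the result was stored."""
    return f"s3://example-results/{job['id']}"


def register_node() -> str:
    """Advertise this machine's spare GPU capacity to the network operator."""
    spec = {
        "gpu_model": "RTX 4090",       # what the provider is offering
        "vram_gb": 24,
        "hours_available": 8,          # e.g. overnight, when the GPU is otherwise idle
        "price_per_hour_tokens": 2.5,  # asking price in the operator's own token
    }
    resp = requests.post(f"{COORDINATOR}/nodes", json=spec, headers=AUTH)
    resp.raise_for_status()
    return resp.json()["node_id"]


def work_loop(node_id: str) -> None:
    """Poll for jobs, run them, and report completion to earn token rewards."""
    while True:
        job = requests.get(f"{COORDINATOR}/nodes/{node_id}/next-job", headers=AUTH).json()
        if not job:           # nothing scheduled; back off and retry
            time.sleep(30)
            continue
        result_uri = run_workload(job)
        requests.post(f"{COORDINATOR}/jobs/{job['id']}/complete",
                      json={"result_uri": result_uri}, headers=AUTH)
        # The operator credits the reward (in its own cryptocurrency) once the
        # submitted result has been verified by the network.


if __name__ == "__main__":
    work_loop(register_node())
```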


Note: for more on DePINs in general, see this blog from Multicoin Capital.


Assuming enough providers can be appropriately incentivized to offer their infrastructure to the operator – and the DePIN design offers easy integration for providers and standards-based access for end-user applications – the DePIN approach can deliver massive quantities of compute power at a fraction of the cost of traditional cloud services.
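To illustrate what "standards-based access" could look like from the application side, here is a minimal, hypothetical sketch in which an AI workload is submitted as an ordinary container image plus GPU requirements and a price cap, and the network matches it to providers. The endpoint and field names are assumptions; in practice each operator defines its own deployment manifest and pricing model.

```python
# Hypothetical sketch of the application side of a DePIN compute marketplace.
# The marketplace URL and request fields are illustrative assumptions only.
import requests

MARKETPLACE = "https://depin-operator.example/api"  # hypothetical marketplace endpoint

deployment = {
    "image": "ghcr.io/example/llm-finetune:latest",       # a standard Docker image
    "command": ["python", "train.py", "--epochs", "3"],
    "resources": {"gpu_model": "any", "min_vram_gb": 24, "count": 4},
    "max_price_per_hour_tokens": 3.0,  # bid, typically below on-demand cloud GPU rates
}

resp = requests.post(f"{MARKETPLACE}/deployments", json=deployment)
resp.raise_for_status()
print("Matched providers:", resp.json().get("providers"))
```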


While DePIN compute offerings began with generic CPU power, the rise of AI applications has led a number of DePIN operators to offer GPUs as well, while others specialize in GPU compute specifically for AI. See below for some examples of AI-oriented DePIN operators.


Akash – established back in 2015 and with plenty of experience in decentralized compute, last year it began to roll out various flavors of GPU. Its AKT token is built on Cosmos, a blockchain ecosystem with a mission to establish the 'Internet of Blockchains' by enabling secure communication and interoperability amongst various blockchains.


Influx Technologies (Flux) – a decentralized infrastructure provider that began life in 2020. Its recent FluxEdge offering provides access to a range of GPUs and is targeted at AI applications.


GAIMIN – positioned as a gaming network that rewards its users for playing games in return for tapping into the GPUs in their PCs to power applications, including AI.


IO Research – a decentralized GPU network aligned to the Solana blockchain, originally focused on financial trading but since refocused on the AI space.

 

The Render Network – focused on rendering 3D graphics for the entertainment industry, Render allows participants in its GPU network to make unused compute available for AI applications.


Decentralized compute is not the only way that DePINs can reduce costs for AI. As well as compute power, AI models generally need to have access to large volumes of data for training. So DePIN for decentralized storage of that data is potentially of interest too. That technology will be covered in Part 2 of this blog.

 
