Demand for graphics processing units (GPUs) is outpacing supply, and the shortfall is hindering innovation in artificial intelligence (AI), according to Tory Green, COO of the decentralized GPU cloud provider io.net. Green attributes the mismatch to manufacturers' inability to ramp up production fast enough to meet growing demand: the compute required for machine learning training has grown tenfold every 18 months since 2010, while available computing power has merely doubled over each such period.

To close this gap, Green points to decentralized physical infrastructure networks (DePINs), in which computing resources are distributed across many locations and owned by many different entities. This architecture scales beyond the limits of any single traditional cloud provider. DePINs can aggregate GPU compute from data centers, crypto miners, and consumer households, giving startups building AI products access to affordable, flexible computing resources.

Compared with traditional cloud platforms such as AWS and Azure, DePINs promise several advantages: large aggregate computing power, cost efficiency, scalability, higher security and reliability, and broader accessibility. By expanding supply, they could disrupt the current cloud-market oligopoly, lowering prices for consumers while increasing returns for GPU suppliers.

DePINs remain in the early stages of adoption within web3 infrastructure, but they have a credible path to the mainstream if they can ease the global shortage of GPU resources while delivering a better customer experience at lower cost. Unlike GPUs themselves, DePINs are unlikely to face scarcity: their decentralized structure, diverse hardware contributions, resource optimization, inherent scalability, geographical distribution, and community-driven development all work against supply bottlenecks.

In short, DePINs offer a plausible answer to the GPU shortage and could help carry AI-adjacent technologies into mainstream adoption.