Elon Musk Proposes Massive Distributed AI Computing Network Using Idle Tesla Cars

Published on 30 Oct, 2025, 12:14 PM IST
Updated on 30 Oct, 2025, 12:17 PM IST
Acko Drive Team
3 min read



The computers built into Tesla cars could one day power much more than just autonomous driving.

Tesla CEO Elon Musk thinks the powerful AI-capable computers in today’s cars are spending too much time doing nothing. His latest pitch is to put that idle time to use, turning each car into a connected node in a distributed network across which massive datacentre-class workloads can be spread. AI inference tasks, such as pattern recognition and even generative AI requests, are well suited to being handled by such an architecture.

The proposal, which Musk mentioned on Tesla’s recent Q3 earnings call, has resurfaced online after Musk himself responded to a post on X (formerly Twitter) by Tesla fan Nic Cruz Patane. Since cars can take care of their own power, cooling and connectivity requirements, the argument goes, the idea might not be as far-fetched as it first seems.

The assumption is that there will eventually be tens or hundreds of millions of cars with powerful onboard computers in active use, and that those cars will spend significant amounts of time idle. Patane measures each car’s compute capability in terms of power consumption, approximating it at 1 kilowatt per car. Across 100 million such cars, that could mean a 100-gigawatt-scale datacentre’s worth of compute power is just waiting to be tapped.
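As a rough illustration of how that back-of-the-envelope estimate works, here is a minimal sketch in Python; the fleet size, per-car power figure and idle fraction are illustrative assumptions rather than figures from Tesla or Patane, beyond the roughly 1 kilowatt per car cited above.

# Back-of-the-envelope estimate of fleet-wide compute, measured (as Patane does)
# by power consumption rather than raw FLOPS.
cars_in_fleet = 100_000_000     # assumed fleet size ("tens or hundreds of millions")
power_per_car_kw = 1.0          # roughly 1 kilowatt per car, as cited above
idle_fraction = 0.9             # assumed share of time a car sits parked and idle

total_gw = cars_in_fleet * power_per_car_kw / 1_000_000   # kilowatts to gigawatts
usable_gw = total_gw * idle_fraction

print(f"Theoretical fleet compute: {total_gw:.0f} GW")
print(f"Available while idle: {usable_gw:.0f} GW")

With those assumed inputs, the sketch lands on the same order of magnitude as the 100-gigawatt figure circulating online, which is why the comparison to datacentre-scale capacity keeps coming up.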

Musk hasn’t yet weighed in on the specifics of implementing such a distributed computing project. It is possible that such a feature could be offered as an opt-in for Tesla buyers in the future, potentially allowing them to earn some money by letting their cars perform AI computation operations when not actively in use. It is not yet clear whether such a network would have any minimum requirements in terms of Tesla hardware, or how this could impact a car’s power requirements and battery health. Data privacy protections would also need to be put in place. 

While Tesla has abandoned its Dojo supercomputing effort, it is currently designing its own custom in-house chips, known as AI5 and AI6. According to Musk, they are optimised for low-latency, real-time decision-making in autonomous vehicles, and are expected to be used not only in next-gen Tesla vehicles but also in its Optimus robots. The company’s upcoming Cybercab and current robotaxi efforts also require considerable compute resources.

While AI5 has now slipped to a late 2026 launch target, Musk has projected up to 40X performance gains “by some metrics” compared to the HW4 platform currently in use across all Tesla models. The next-gen AI6 chip, which has already been announced, is not likely to be seen in production before 2029. Musk has said he is aiming for an “oversupply” of chips, which could then be diverted to other uses such as datacentres, though he has not announced any plans to sell such chips to third parties.

Tags
Tesla
Elon Musk
AI5
AI6
AI
