Amin Vahdat, Google’s chief technologist for AI infrastructure, described rapid progress in the company’s Gemini model family and outlined the infrastructure challenges and opportunities that come with scaling modern AI systems. Speaking in an onstage conversation, Vahdat emphasized the importance of Alphabet (NASDAQ: GOOG) operating “full stack,” from custom silicon through data centers and software, and said the pace of AI improvement is being matched by growing demand that quickly consumes efficiency gains.
Gemini momentum and an “early innings” view of the AI race
Vahdat said Gemini 3 has performed at “state-of-the-art across essentially all the benchmarks,” and framed Google’s AI progress as the product of a multi-year push. He noted that Gemini 1 was released “two-ish years ago” and said the company has been on a “three-plus-year journey.”
Despite the progress, Vahdat characterized the competitive landscape as still nascent, calling the broader AI race “inning one.” He also credited competition across the industry, saying that models such as Claude, ChatGPT, and Gemini are all improving quickly and that the rivalry is “making everyone better.”
Full-stack co-design: TPUs, models, and product needs
Asked about Google’s advantage as a full-stack company, Vahdat said the key differentiator is cross-functional collaboration rather than any single technology component. While he highlighted Google’s pride in its TPUs, distributed systems, data center architecture, and power delivery, he said the “secret weapon” is the company’s ability to work together across layers to solve end customer problems.
Vahdat said TPUs are “not designed in isolation,” describing a co-design approach that includes DeepMind and input from product use cases such as Search, Ads, YouTube, and Cloud. He added that he works closely with DeepMind CEO Demis Hassabis, with the two speaking regularly and their teams “engag[ing] deeply.”
He also stressed the importance—and difficulty—of long-range planning: both software and hardware choices can carry “two to three year lead times,” requiring teams to make bets about where workloads and models are likely to go.
TPUs on Google Cloud, and specialization versus lead times
On commercialization, Vahdat said TPUs are “a GCP product offering entirely.” At the same time, he emphasized that GPUs are also a significant Google Cloud product area and highlighted a “deep partnership with NVIDIA,” saying “a lot of our success at Google has been as a result of that partnership.” He positioned Google’s approach as workload-driven, aiming to choose the best fit—GPU, TPU, or other accelerators—based on the customer problem.
Vahdat argued that the industry is moving away from “general-purpose, one-size-fits-all architecture” toward specialized systems tuned for particular workloads. He said specialization can deliver major efficiency gains, repeatedly pointing to a “factor of 10” uplift across dimensions such as cost, scale, and power efficiency, while acknowledging the trade-off of reduced generality.
However, he said the pace of specialization is constrained by hardware development cycles. Today, he estimated that moving from a new hardware concept to scaled deployment in data centers takes roughly three years. He said more aggressive specialization would become possible if that cycle time were dramatically reduced; he floated a hypothetical “three months” as a world-changing target, while acknowledging he does not know how to achieve it. He suggested that two years might be achievable and noted that even incremental reductions in lead time could enable more frequent and targeted designs.
Data centers in space: promise and open challenges
Vahdat said Google is “looking into” the idea of computing infrastructure in space, describing it as an area where several companies are evaluating “first principles” benefits. He cited a sun-synchronous orbit as attractive due to potential “24/7 solar power,” with no cloud cover or sunsets and “no need for batteries.”
He also pointed to potential performance and efficiency benefits, stating that space-based deployments could be “30% more efficient” and that inter-satellite networking could offer a “50% reduction in latency” versus routing through terrestrial fiber, since light travels faster through vacuum than through glass and satellite links can follow more direct paths than ground routes.
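As a rough illustration of where a latency claim of that magnitude could come from, the sketch below compares pure propagation delay over fiber versus a vacuum path. All of the specific numbers (the 10,000 km separation, the 1.47 refractive index, the 1.3x fiber detour factor) are illustrative assumptions, not figures from the talk.

```python
# Illustrative propagation-delay comparison: vacuum inter-satellite
# links vs. terrestrial fiber. All parameters are assumptions for
# the sake of the sketch, not figures quoted by Vahdat.

C_VACUUM_KM_S = 299_792.458          # speed of light in vacuum
C_FIBER_KM_S = C_VACUUM_KM_S / 1.47  # typical fiber group index ~1.47

def one_way_delay_ms(path_km: float, speed_km_s: float) -> float:
    """Pure propagation delay in milliseconds, ignoring switching and queuing."""
    return path_km / speed_km_s * 1000

great_circle_km = 10_000  # hypothetical endpoint separation
fiber_detour = 1.3        # fiber routes rarely follow the great circle

fiber_ms = one_way_delay_ms(great_circle_km * fiber_detour, C_FIBER_KM_S)
vacuum_ms = one_way_delay_ms(great_circle_km, C_VACUUM_KM_S)
reduction = 1 - vacuum_ms / fiber_ms  # fraction of latency saved
```

With these assumed numbers the reduction works out to roughly 48%, in the neighborhood of the cited “50% reduction”; a real constellation would add up/down-link hops and routing overhead that this sketch ignores.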
At the same time, he emphasized that “many, many problems” remain, including cooling and maintenance. He suggested robotics could play a role in addressing maintenance challenges and said the industry’s current deployment and operations model is unlikely to scale to the levels of growth being discussed today. He contrasted early Google data centers—citing roughly 10 megawatts in 2002—with current industry talk of gigawatt- and even 10-gigawatt-scale projects, arguing that a thousand-fold increase in scale forces a rethink of how infrastructure is built and maintained.
When asked about timing for a gigawatt in space, Vahdat said it is “greater than five years away” at that scale and that it is “too early” to put a firm timeframe on it, while still calling it worth pursuing because of the learning and state-of-the-art advances it could drive.
Bottlenecks, memory, and whether efficiency “wins the day”
Vahdat said his primary worries “change by the week,” with “velocity” as a consistent theme related to iteration speed and delivery. He added that energy and supply chain issues are recurring concerns given the pace of growth.
He also noted that memory pricing and availability have been a worry in certain weeks, pointing to the split between DRAM and HBM. Asked about a comment that constraints could last until the end of 2028, Vahdat said the referenced executive would “know more,” adding that he hopes that view is wrong but that it may need to be “pencil[ed] in.”
More broadly, Vahdat challenged the assumption that efficiency improvements alone will alleviate AI’s scaling pressures. While he said Google is investing heavily in software, model, hardware, and power efficiency, he argued that as models become more capable, users do more with them and “instantly consume” the gains. He said efficiencies “will eventually” help, but suggested the relief could come later than some expect.
Looking ahead, Vahdat said he does not see a slowdown in model progress. He compared the current pace to Moore’s Law-era CPU improvements, saying that while it is hard to quantify, the field feels like it is getting “twice as good” every “three to six months,” potentially faster than historical CPU cycles. He added that evaluations are improving as they focus more on real-world, previously difficult cases where models struggled.
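To put that comparison in perspective, a back-of-the-envelope calculation shows how much faster a three-to-six-month doubling compounds than a Moore’s Law-style cadence. The 4.5-month figure below is simply the midpoint of the range Vahdat mentioned, and the 24-month figure is the classic shorthand for CPU-era doubling; both are assumptions for illustration.

```python
# Back-of-the-envelope: annualized improvement implied by a given
# doubling period. The 4.5-month input is the midpoint of the
# "three to six months" range; 24 months stands in for Moore's Law.

def annual_factor(doubling_period_months: float) -> float:
    """How many times better per year if capability doubles every N months."""
    return 2 ** (12 / doubling_period_months)

moore_like = annual_factor(24)  # ~1.41x per year
ai_pace = annual_factor(4.5)    # ~6.3x per year
```

Under these assumptions the implied annual improvement is roughly 6x versus under 1.5x for a two-year doubling, which is the gap Vahdat is gesturing at when he calls the current pace potentially faster than historical CPU cycles.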
Vahdat concluded that AI is not likely to be “winner-takes-all,” describing it as a transformation larger than the internet and an unusually important moment to contribute across the technology stack, particularly in infrastructure that supports new AI services and agents.
About Alphabet (NASDAQ: GOOG)
Alphabet Inc. (NASDAQ: GOOG) is a multinational technology holding company headquartered in Mountain View, California. Formed in 2015 through a corporate restructuring of Google, Alphabet serves as the parent of Google LLC and a portfolio of businesses collectively known as “Other Bets.” Google was founded in 1998 by Larry Page and Sergey Brin; Alphabet is led by CEO Sundar Pichai, who oversees Google and the broader company, while the founders remain prominent shareholders and influential figures in the company’s history.
Alphabet's core business centers on internet search and advertising, with Google Search and the company's ad platforms (including Google Ads and AdSense) generating the majority of revenue by connecting advertisers with consumers worldwide.
This instant news alert was generated by narrative science technology and financial data from MarketBeat in order to provide readers with the fastest reporting and unbiased coverage. Please send any questions or comments about this story to contact@marketbeat.com.