In the ongoing debate over the optimal location for AI processing, Microsoft has traditionally leaned towards the cloud. Conversely, major players in the hardware arena—AMD, Intel, and Qualcomm—are championing the idea of bringing AI capabilities directly to personal computers, harnessing the potential of their own processors. The question arises: could this dichotomy lead to a clash of interests?
Surprisingly, Microsoft’s top Windows executive, speaking at AMD’s “Advancing AI” presentation, hinted at a harmonious coexistence of cloud and local AI. The occasion was AMD’s launch of the Ryzen 8040 family of AI-enhanced mobile processors. Why the reconciliation matters is clear from Microsoft’s reach: the company licenses Windows for millions of machines and sells Microsoft 365 subscriptions to an even larger base, including 76 million consumer subscribers as of the third quarter of 2023, with commercial subscriptions growing a further 14 percent.
Microsoft Azure, the company’s revenue juggernaut, sits at the heart of its cloud services, and a new addition to its lineup, Microsoft 365 Copilot, is slated to use AI to enhance productivity. At a subscription cost of $30 per month, Copilot will rely on Azure’s cloud infrastructure, further entrenching the cloud’s role in Microsoft’s ecosystem.
In contrast, AMD and its counterparts advocate for running AI applications directly on local PCs. These chipmakers point to applications like Adobe Photoshop, Lightroom, and Blackmagic’s DaVinci Resolve, which leverage on-chip AI, to argue for the efficiency and value of local processing. Microsoft’s own Windows Studio Effects also use local AI for tasks such as background blurring and audio filtering. The fear is that shifting AI functions to the cloud could diminish the role, and the value-add, of chip manufacturers.
Fortunately, Pavan Davuluri, corporate vice president of Microsoft’s Windows and Devices division, outlined a nuanced strategy. He described a “hybrid engine,” envisioning a collaborative approach in which cloud and local computing work seamlessly together. According to Davuluri, this approach aims to capture the benefits of local compute, such as enhanced privacy, responsiveness, and low latency, while tapping the cloud for workloads like large models, large datasets, and cross-platform inferencing.
In his own words, Davuluri explained, “It’s really about seamless computing across the cloud and client, bringing together the benefits of local compute…with the power of the cloud.” This hybrid strategy aims to create the best AI experiences on PCs, acknowledging the strengths of both paradigms.
During the discussion, AMD CEO Dr. Lisa Su shared a light moment with Davuluri about Microsoft’s constant demand for TOPS (trillions of operations per second). Davuluri’s response captured the collaborative spirit: “We will use every TOP you provide.” Closing the conversation, Su expressed excitement about the evolving landscape, envisioning a future in which Windows orchestrates multiple apps and services, functioning as an intelligent agent across devices and maintaining context throughout entire workflows. The partnership between AMD and Microsoft appears to be shaping a future where AI integrates seamlessly into the Windows ecosystem.