Architecture and Hardware

Anticipating the Year of the AI PC

Personal computer makers increasingly will build neural processing units into upcoming models, capitalizing on interest in artificial intelligence.

3D processors and circuit board, illustration

As artificial intelligence (AI) adoption continues at a frenzied pace in the enterprise, PC vendors and chipmakers are responding swiftly, embedding their hardware with AI chips and algorithms aimed at improving AI use cases.

According to a recent report from Forrester Research, 2024 is a year of transition to AI PCs, with early adopters experimenting with them "because there is still no killer app for the average information worker"; 2025, the report predicts, will be "the year of the AI PC."

Unlike standard PCs, AI PCs have dedicated AI chips and algorithms embedded across the central processing unit (CPU), graphics processing unit (GPU), and neural processing unit (NPU), according to Forrester.

The major difference is the third chip, the NPU; CPUs and GPUs are common chips that handle standard compute tasks, explained Forrester principal analyst Andrew Hewitt, one of the report's authors. "The NPU brings in two things: it allows any AI workload to be optimized across the three chipsets, and has intelligence built into it," he said.

The second differentiator is that AI PCs have certain capabilities that are enabled by the NPU. "The new NPU is a specialized accelerator that handles AI and machine learning tasks right on your PC, as opposed to sending data to be processed in the cloud," said Robert Hallock, vice president and general manager of AI and technical marketing at Intel. "The GPU and CPU can also process these workloads, but the NPU is especially good at low-power AI calculations."

Intel introduced an integrated NPU as part of its Intel Core Ultra mobile processors (codenamed Meteor Lake) that powered AI PCs launched in December 2023, according to Hallock.

Currently, there are “only a handful” of things that can be done on an AI PC. One example is Microsoft’s new AI special effects capability, Windows Studio Effects, which uses the NPU exclusively, Hewitt said.

As AI workloads become more prevalent, an AI PC will improve their performance, he said. Echoing Hallock, Hewitt added, "Most importantly, an AI PC will allow you to run generative AI and large language models directly on the device without having to go to the cloud." The NPU will improve performance locally, so work can be done offline; someone on a plane, for example, could still use Microsoft Copilot. This also protects a user's privacy, he said.
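As a small illustration of what routing a workload "locally" can look like, the sketch below uses the ONNX Runtime Python API to check which hardware execution providers a machine exposes, which is one way an application might decide whether to send an AI workload to an NPU, GPU, or CPU. This is a minimal sketch under stated assumptions, not anything vendors quoted here have described; provider availability depends entirely on the hardware and the particular onnxruntime build installed.

```python
# Minimal sketch: choose the best available ONNX Runtime execution
# provider. The preference order is an illustrative assumption;
# which providers actually appear depends on the hardware and the
# onnxruntime package variant installed.

PREFERRED = [
    "QNNExecutionProvider",   # NPU-backed provider on some platforms
    "DmlExecutionProvider",   # DirectML (GPU) on Windows
    "CPUExecutionProvider",   # always-available fallback
]

def pick_provider(available):
    """Return the first preferred provider present in `available`."""
    for name in PREFERRED:
        if name in available:
            return name
    return "CPUExecutionProvider"

try:
    import onnxruntime as ort
    providers = ort.get_available_providers()
except ImportError:
    # onnxruntime not installed; assume a CPU-only machine.
    providers = ["CPUExecutionProvider"]

print(pick_provider(providers))
```

On a machine with an NPU-aware onnxruntime build, the NPU provider would be selected first; everywhere else the code quietly falls back to the GPU or CPU.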

However, there are caveats for using AI PCs that IT organizations will need to consider. While the Forrester report noted that “the user experience improvements of the AI PC are important,” it points out that “what the industry is forgetting is that user experience improvements almost never change IT purchasing behavior. Cost, security, privacy, and the upcoming Windows 10 [end of life] will be primary drivers of AI PC adoption, with the added bonus of a much-improved user experience.” 

AMD, Dell, HP, Lenovo, Intel, and Nvidia all announced AI PCs in January at the Consumer Technology Association's CES 2024. In May, Apple unveiled the AI-powered iPad Pro, and Microsoft rolled out Copilot+ PCs.

The demand for AI PCs is being driven by several factors, including the increasing use of AI to streamline operations, enhance productivity, and drive innovation, said Alex Thatcher, senior director of AI experiences and cloud clients at HP.

Echoing Forrester’s Hewitt, Thatcher said, “Right now, a lot of these AI-based capabilities are enabled in the cloud, but increasingly, the desire for faster response times and the need to keep private data secure and out of cloud environments is creating demand for a new kind of PC that is purpose-built to run AI workloads locally by adding a neural processing unit to the system.”

AI-powered PCs have advanced encryption and authentication features, and enterprises are prioritizing systems that can safeguard sensitive information and mitigate cybersecurity risks effectively, he said.  

Further, the NPUs are capable of more than 40 trillion AI operations per second, allowing them to run language models and diffusion models for images, video, audio, and generative AI right on the device, Thatcher noted.  
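A rough back-of-the-envelope calculation suggests why tens of trillions of operations per second matter for on-device generative AI. Assuming, purely for illustration, a 7-billion-parameter language model that needs roughly two operations per parameter to generate each token, the theoretical compute ceiling at 40 trillion operations per second is in the thousands of tokens per second, well above interactive speeds (in practice, memory bandwidth rather than raw compute is usually the limiting factor).

```python
# Back-of-envelope: peak token rate for a local LLM on a 40-TOPS NPU.
# All numbers are illustrative assumptions, not measured figures.

npu_ops_per_sec = 40e12        # 40 trillion AI operations per second (peak)
params = 7e9                   # hypothetical 7B-parameter model
ops_per_token = 2 * params     # ~2 ops (multiply + add) per parameter per token

peak_tokens_per_sec = npu_ops_per_sec / ops_per_token
print(f"Theoretical peak: {peak_tokens_per_sec:,.0f} tokens/sec")
```

Real-world throughput will be far lower, but the arithmetic shows the compute budget is no longer the obstacle to running such models locally.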

The NPU chip “enables models like natural language processing, image recognition, and predictive analytics to run on local silicon that uses a fraction of the power so enterprises can tackle complex tasks with ease,” he said.

Most of the "game-changing" use cases are in the creative domain, said Hewitt, adding that he has used an AI PC for music production to cut out specific instruments from a mix or to incorporate an accompanying drumline. AI PCs also can be used for photo and video editing to isolate a particular image, for example, and in podcast production to clone someone else's voice, he said.

Hewitt is also interested in using AI PCs for speech training. “If I’m giving a speech, being able to practice it and have AI give me criticism as I’m speaking’’ is a great feature, he said. “Having that speech coach built into your device” is a use case that is “broadly applicable to the enterprise.”

He reiterated that the major problem with AI PCs today is that there is “a growing but fairly small ecosystem of applications,” and there will need to be broader use cases to drive enterprise adoption.

Thatcher agreed with Forrester's premise that 2024 will be the year of early AI PC adopters and many pilots, "and 2025 will quickly drive AI PCs to a much broader, mainstream offering." The benefits will come at a hefty cost: HP's Elite AI PC series currently starts at $1,606, with some models in the high-$3,000 range.

IT solutions integrator Insight Enterprises is advising clients to start investing in AI PCs "to some degree so that they can get familiar with this new class of devices," said Juan Orlandini, the company's CTO. Organizations should be "deliberate in their investigation," he added, noting that AI PCs should be tested in a controlled IT-managed environment or "ideally, by seeding these devices into their most power-hungry and/or demanding users." This will give the business a good sense of the value of these devices, Orlandini said.

During his keynote at Microsoft Build 2024, Microsoft CEO Satya Nadella discussed the new Copilot+ PCs, but "glossed over" the big picture, observed Constellation Research in a recent email news brief. While CIOs are going to start testing these Copilot+ PCs and pondering their refresh cycles, for AI PCs to scale, "you're going to have to have a lot of computing power at the edge that can be used for inferencing and potentially even training," the firm wrote.

However, with AI at the forefront of how work will be transformed, there is little doubt AI PCs will make a splash. "People are going to be amazed at the ability they will have to do things in just a few seconds that could have taken hours or days before—if they were even possible to do at all," Thatcher noted, adding that those "aha moments" will be the source of growing demand for AI capabilities and the PCs that enable them.

Esther Shein is a freelance technology and business writer based in the Boston area.
