The emerging landscape of AI-enabled computers will transform the way we work with our devices, whether desktops or laptops. This new technology has positive and negative aspects, and before embarking on an adoption program, it is important to know the facts.
What is an AI PC?
AI-enabled PCs feature a dedicated neural processing unit (NPU) within the system-on-chip (SoC) that handles AI workloads on the device itself, whether language models, task-specific applications, security, or privacy features. One of its greatest benefits is low latency, and it allows the greater customization that meets a growing need for autonomy.
The AI PC is a new variant of edge computing, in which computation is performed close to the data source or the end user, rather than relying solely on the cloud. This hybrid approach combines the strength of the cloud for intensive tasks with the speed and privacy advantages of local processing. AI PCs demonstrate this by using local hardware such as GPUs and NPUs for AI tasks, reducing latency, saving bandwidth, and improving data security by reducing the amount of sensitive data sent to the cloud. The overall effect is a better user experience, supporting a variety of real-time analytics and AI development.
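As a rough illustration of this hybrid pattern, an application might attempt on-device inference first and reserve the cloud for heavy, non-sensitive requests. The function names and routing rules below are illustrative assumptions, not a real API; they sketch the trade-off rather than implement any particular product:

```python
# Sketch of a hybrid local/cloud inference strategy (illustrative only).
# run_local and run_cloud stand in for real inference backends.

def run_local(prompt: str) -> str:
    """Pretend NPU/GPU-backed on-device model (assumed available here)."""
    return f"local answer to: {prompt}"

def run_cloud(prompt: str) -> str:
    """Pretend cloud-hosted model, used for heavier requests."""
    return f"cloud answer to: {prompt}"

def answer(prompt: str, sensitive: bool, heavy: bool) -> str:
    # Keep sensitive data on-device; send only heavy, non-sensitive
    # work to the cloud -- the balance described above.
    if sensitive or not heavy:
        return run_local(prompt)
    return run_cloud(prompt)
```

A real scheduler would also weigh connectivity, battery, and model availability, but the privacy-first routing decision is the core of the edge-computing argument.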
Regional Director – United Kingdom and Ireland, Kingston Technology EMEA.
Analyst evaluation
With industry analysts calling 2024 the year of AI PCs, it’s interesting to look at the landscape through the analysts’ lens. Gartner, for example, predicts that 54.4 million AI PCs will ship this year, while IDC says it will be 50 million, and Canalys uses a slightly different measure but believes that 1 in 5 shipments will be AI PCs. Looking ahead to 2025, Gartner estimates that 43% of all PC shipments will be AI PCs, and both IDC and Canalys predict that by 2027, that number will have risen to 60%. This represents a definitive market shift in the direction of AI PCs.
AI PC Chipset Progress
The evolution of AI PCs has relied on bringing the hardware and processor together to support PC-level AI applications, and an early example of this system-on-chip approach was the iPhone with the A11 Bionic processor. Now, with the introduction of chiplet designs like the Intel Core Ultra processor, we have seen a new CPU design tailored to a variety of purposes. Instead of the traditional monolithic CPU, we now have a tile-based design that pairs a compute tile (the CPU cores) with a graphics tile (the GPU) and an SoC tile (which includes the NPU) to support the AI engine. Chip manufacturers are now developing and releasing the solutions that make AI PCs a realistic prospect for users.
Importance of combining CPU, GPU and NPU
Modern computing tasks require many different computational capabilities that are best served by the combination of CPU, GPU, and NPU. The CPU is the central processing unit, a general-purpose processor designed for sequential processing, running the operating system and mainstream applications we all like to use on our laptops. The GPU is the graphics processing unit, originally created for graphics rendering. It is equally effective at parallel calculations, ideal for the kind of matrix and vector operations that are essential for AI and deep learning. The NPU is the neural processing unit, a specialized processor developed specifically for AI tasks. The NPU efficiently accelerates neural network calculations while maintaining low power consumption.
This triumvirate enables flexible computing where each type of processor can be used for specific tasks, resulting in significant improvements in performance and power efficiency. And they are not only designed for PCs and laptops. CPUs, GPUs, NPUs, and system-on-chip designs that include all three components enable an increasing number of devices, including smartphones and embedded systems in sectors such as manufacturing, to harness the potential of AI.
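The division of labor described above can be sketched as a simple dispatch table. The workload categories and fallback rule here are illustrative assumptions; real schedulers in operating systems and inference runtimes are far more sophisticated:

```python
# Illustrative mapping of workload types to the processor best suited
# to them, as described above (assumed categories, not a real API).

PREFERRED_DEVICE = {
    "os_and_apps": "CPU",       # sequential, general-purpose work
    "graphics": "GPU",          # parallel rendering
    "matrix_math": "GPU",       # vector/matrix ops for deep learning
    "neural_inference": "NPU",  # sustained AI inference at low power
}

def dispatch(task_type: str, available: set) -> str:
    """Pick the preferred device if present, else fall back to the CPU."""
    device = PREFERRED_DEVICE.get(task_type, "CPU")
    return device if device in available else "CPU"
```

For example, `dispatch("neural_inference", {"CPU", "GPU"})` falls back to the CPU on a machine without an NPU, which is exactly why older systems run AI workloads at higher power cost.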
Where do memory and storage fit in?
One of the biggest challenges to confidently adopting AI PCs is dealing with the lack of information, and the myths, about how much memory is needed to run AI-enabled laptops and PCs. Currently, there are no minimum specifications, and it is common for systems to ship with 8, 16, or 32 GB of memory. However, as applications become more developed and intelligent uses of AI-enabled PCs become more demanding, we anticipate a change in memory requirements.
The same applies to storage. Some systems have 256 GB of SSD storage, while others have 1 TB or 2 TB. It's important to think beyond your current needs, or even next year's, and anticipate what future applications might demand and what your storage and memory requirements will be.
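To see why memory requirements are likely to climb, consider a back-of-the-envelope estimate of what running a language model locally needs. The model size, precision, and 20% overhead factor below are illustrative assumptions, not a vendor specification:

```python
def model_memory_gb(params_billions: float, bytes_per_param: float,
                    overhead: float = 1.2) -> float:
    """Rough RAM footprint: parameters x precision, plus ~20% for
    activations and caches (an illustrative rule of thumb)."""
    return params_billions * bytes_per_param * overhead

# A 7-billion-parameter model at 16-bit precision (2 bytes/parameter)
# already approaches the whole of a 16 GB system's memory:
print(round(model_memory_gb(7, 2), 1))    # ~16.8 GB
# 4-bit quantization (0.5 bytes/parameter) fits comfortably in 8 GB:
print(round(model_memory_gb(7, 0.5), 1))  # ~4.2 GB
```

The same arithmetic applies to storage: several multi-gigabyte model files quickly eat into a 256 GB drive, which is why thinking beyond today's specification matters.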
Current use cases
Examples of AI PC use are multiplying by the day. In business productivity, Microsoft Copilot is breaking new ground, but equally popular are solutions like Zoom, Webex, and Slack for collaboration and project management. Jasper is a popular sales and marketing tool, while the Adobe suite is ideal for creative and media tasks, Audacity for audio, and GIMP for image editing.
These tools clearly focus on communications and creativity, and reflect the early stages of AI integration. These are high-demand applications and an obvious entry point for the benefits of AI, as it makes an immediate difference in collaboration and content creation. For many users, the initial approach involves using AI PCs, but not in isolation – cloud-based AI counterparts remain part of the mix. As the autonomy and security benefits of running AI applications locally become more important, this balance will shift.
As the landscape develops, technology will become more advanced and accessible, and applications will diversify greatly. We should view current areas of interest as a testing ground for AI capabilities in terms of user acceptance. There will be a learning curve as users embrace AI, but during this, the foundations of AI are being laid across multiple industries and use cases.
Why local is good
The biggest advantage of running AI models on AI PCs is that all processing is local, which increases security and privacy and allows users to avoid the risks of moving or storing sensitive data in the cloud (or sending it to public AI models). AI PCs have the potential to reduce the chances of data breaches or unauthorized access, and simply keeping data on-site makes it easier to comply with data protection regulations such as GDPR.
Additionally, locally operated models are more resilient to network-related issues, ensuring that essential AI functionality remains accessible even if cloud services fail due to connectivity problems or targeted cyberattacks on cloud infrastructure.
Of course, local AI devices will still need strong security measures to protect against local cyber threats such as malware or physical tampering. A comprehensive approach is needed, covering model protection, data encryption, proper access control, and continuous monitoring for potential threats.
Preparing for change
Before deciding to migrate to AI PCs, first consider what your organization needs now, what is available today to meet that need, what applications are needed to fit specific job functions, and where you are in the upgrade cycle. If, for example, you are prepared to be an early adopter in the full knowledge that AI-enabled PC applications are currently limited but tailored to your needs, you are well positioned to make the transition. However, if you're unlikely to upgrade for another three or four years, it might be worth waiting until the technology and applications have evolved further.
Staying on top of AI PC chipsets from key manufacturers like AMD and Intel, and understanding how memory and storage are evolving to keep pace (DDR4 versus DDR5, for example), will help you identify the right time to adopt AI PCs in terms of applications, performance, and cost.
Another important factor is internal preparation. Staff must be trained to get the most out of on-PC AI systems and to operate them in a cyber-secure environment. AI technology is changing rapidly and its adoption requires a comprehensive strategy. One of the biggest challenges right now is the lack of trained professionals who understand the implications of AI from all perspectives. Rather than rushing to manage AI compliance once AI PCs are adopted, the best approach is to be aware of the policies and practices that will be required beforehand and to understand the resources needed internally.
One last word
As is always the case with a new technology, there are subtle trade-offs involved in the opportunities and risks of its adoption. Early adopters who can benefit from the AI PC applications that are currently available could have first-mover advantages over their competition. Other organizations will want to ensure they have the right systems and policies in place to support AI PC adoption. It is also worth considering a more nuanced approach and “buying” time by upgrading key components (as this technology is evolving rapidly) rather than making a full commitment today and changing everything at once.
But if you buy an AI PC today, ensuring that its storage and memory can be upgraded later means your hardware will be better equipped to run future AI PC applications that work in tandem with existing AI applications in the cloud.
This article was produced as part of TechRadarPro's Expert Insights channel, where we showcase the best and brightest minds in the tech industry today. The views expressed here are those of the author, and not necessarily those of TechRadarPro or Future plc. If you're interested in contributing, find out more here: