
Microsoft wants to run Copilot locally on your PC starting early 2025
Microsoft will bring Phi Silica to the Windows Runtime this quarter as part of Copilot, according to the head of Microsoft’s Windows devices division who spoke Monday at CES 2025 in Las Vegas.
Microsoft unveiled Phi Silica at the Build conference in Seattle last May, demonstrating a small language model (SLM) that is designed to complement the large language model (LLM) running in the cloud. Phi Silica makes it possible to run a local version of Copilot on a Windows PC.
LLMs are generally faster and more accurate than SLMs, but they run in the cloud and may require an expensive subscription for full access. SLMs, by contrast, can power AI chatbots and other AI-driven applications directly on a local PC. They are less capable and depend on NPUs to deliver local AI performance, but keeping processing on-device helps ensure privacy and prevents information from leaking to the cloud.
Microsoft has said that Windows Recall and other artificial intelligence features will eventually rely on this kind of SLM. Phi Silica uses a 3.3-billion-parameter model that Microsoft has fine-tuned for both accuracy and speed despite its small size.
Pavan Davuluri, Microsoft’s corporate vice president of Windows and devices, appeared on stage at Intel’s CES 2025 presentation to make the announcement.
2025-01-06 17:28:41