News

Snapdragon chips will have on-device AI capabilities next year

Qualcomm announced that it plans to make Llama 2-based AI implementations available on Snapdragon-powered devices starting in 2024. This will allow developers and OEMs to offer better AI features using Meta's Llama 2 language model, which was launched in partnership with Microsoft.

Earlier, Meta announced in a press release that it was partnering with Microsoft to launch Llama 2, an open-source large language model intended to help developers build generative AI applications. Qualcomm then announced that it would work with Meta to run applications based on the Llama 2 model.

Qualcomm Technologies, Inc. and Meta are working to optimize the execution of Meta's Llama 2 large language models directly on-device – without relying solely on cloud services.

The ability to run generative AI models like Llama 2 on devices such as smartphones, PCs, VR/AR headsets, and vehicles allows developers to save on cloud costs and to provide users with private, more reliable, and personalized experiences.

The US chip designer aims to make on-device Llama 2-based AI implementations available to enable the creation of new and exciting AI applications. This will let customers, partners, and developers build intelligent virtual assistants, productivity applications, content creation tools, and more.
