Qualcomm knows that if it wants developers to build and optimize AI applications across its portfolio of silicon, the Snapdragon giant needs to make the experience simpler and, ideally, better than what its rivals have been cooking up in the software stack department.
That’s why on Wednesday the fabless chip designer introduced what it’s calling the Qualcomm AI Stack, which, among other things, aims to let developers take AI models built for one device type, say smartphones, and easily adapt them for another, such as PCs. The stack is only for devices powered by Qualcomm’s system-on-chips, be they in laptops, cellphones, car entertainment systems, or something else.
While Qualcomm is best known for its mobile Arm-based Snapdragon chips that power many Android phones, the chip house is hoping to grow into other markets, such as personal computers, the Internet of Things, and automotive. This expansion means Qualcomm is competing with the likes of Apple, Intel, Nvidia, and AMD on a much larger battlefield.
Given that all these players agree AI is a key future workload for everything from cars and IoT devices to PCs and smartphones, Qualcomm is hoping to stand out with a unified software stack, which brings together all the company’s existing tools, frameworks, runtimes, and other software for developing AI applications on its families of processors.
Qualcomm thinks all this will cover AI development for everything from phones and PCs to cars and datacenters.
In a briefing with journalists, Qualcomm executive Ziad Asghar said the consolidation of the company’s AI software efforts mirrors how it has taken its core chip technology and adapted it to different markets through customization.
“So an investment becomes a real positive advantage for us because now the investment that we make in one area is actually able to propagate very strongly into all the different areas,” said Asghar, who is vice president of product management at Qualcomm Technologies.
Opening up the toolkit
With Qualcomm AI Stack, the company is now hoping to bring this benefit of using one set of technologies for multiple market areas to developers and device makers on the software side.
By cutting down on the amount of work required to adapt AI applications from one type of Qualcomm chip to another, the biz is promising to save companies money on expensive engineering resources.
“We are giving the ability to our OEM base to be able to do that to great extent without actually having to spend a lot more in terms of [non-recurring engineering],” Asghar said.
What sets Qualcomm AI Stack apart from competing software stacks, according to Asghar, is the Qualcomm AI Model Efficiency Toolkit. One of its most significant features is the ability to compress a power-hungry AI model trained in the cloud from 32-bit floating point precision to the 8-bit integer format so that it can run well on low-power devices.
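To make the idea concrete, here is a minimal sketch of the general technique involved: affine post-training quantization, which maps 32-bit float weights onto an 8-bit integer grid via a scale and zero point. This is an illustrative example of the approach, not Qualcomm's toolkit or its API; the weight values are made up.

```python
# Illustrative sketch of affine FP32 -> INT8 quantization.
# NOT Qualcomm's API -- just the general technique such toolkits apply.

def quantize(weights, num_bits=8):
    """Map float weights onto the signed integer grid [-128, 127]."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    w_min, w_max = min(weights), max(weights)
    scale = (w_max - w_min) / (qmax - qmin) or 1.0  # avoid zero scale
    zero_point = round(qmin - w_min / scale)
    q = [max(qmin, min(qmax, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats; the difference is quantization noise."""
    return [(qi - zero_point) * scale for qi in q]

# Hypothetical weight values for demonstration.
weights = [-0.42, 0.0, 0.13, 0.98, -1.0]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
```

Each stored value shrinks from four bytes to one, and integer arithmetic is cheaper than floating point on mobile-class silicon, which is where the power savings Asghar describes come from.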
“As you do that, you are able to bring huge benefits in terms of power consumption, up to four times power consumption improvement in many cases in scenarios like this,” he said.
Qualcomm is also banking on its Neural Architecture Search tool, which lets developers optimize AI models using various constraints, such as greater accuracy, lower latency, or lower power. This will allow developers to, Asghar said, “get a lot more accuracy or a model that is able to give you lower latency or a model that’s able to do the same task at lower power.”
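The shape of that trade-off can be sketched as a toy constraint-driven search: pick the most accurate candidate architecture that fits a latency or power budget. The candidate models, their benchmark numbers, and the scoring rule below are all hypothetical, and this is far simpler than a real neural architecture search.

```python
# Toy sketch of constraint-driven architecture selection.
# Candidates and their metrics are made up, not Qualcomm's tool or data.

CANDIDATES = [
    # (name, accuracy, latency_ms, power_mw)
    ("small",  0.71,  4.0, 120),
    ("medium", 0.78,  9.0, 260),
    ("large",  0.84, 21.0, 610),
]

def search(max_latency_ms=None, max_power_mw=None):
    """Return the most accurate candidate satisfying the constraints."""
    feasible = [
        c for c in CANDIDATES
        if (max_latency_ms is None or c[2] <= max_latency_ms)
        and (max_power_mw is None or c[3] <= max_power_mw)
    ]
    return max(feasible, key=lambda c: c[1], default=None)

best = search(max_latency_ms=10)  # latency budget rules out "large"
```

A production NAS system explores a vast space of layer configurations rather than three fixed models, but the principle is the same: the constraint you set determines which architecture wins.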
While much of the underlying software in the Qualcomm AI Stack isn’t new, Asghar called the unification of the elements a “leap forward” for developers and device makers and said it will serve as the foundation for domain-specific software development kits when future markets take shape.
“What we envision is that now with this very solid footing, which is unified across all the different business lines, we can start to write more of these domain-specific SDKs as a new vertical emerges where we see an agglomeration of our customers,” he said. ®