Qualcomm's AI Day came with a slew of new launches and announcements, all focused on expanding the company's reach and AI strategy. Among other things, the company debuted its newest Snapdragon 665, 730, and 730G chips, gave partners such as Facebook and Microsoft a chance to present their AI strategies, and introduced its Cloud AI 100 family.
Deemed a soft launch rather than a full-on product announcement, the Cloud AI 100 debut signals Qualcomm's ambition to move deeper into the AI space. Qualcomm recognizes that, with the advancement of 5G and the proliferation of AI-capable devices, the growing data center inference market presents a big opportunity. While NVIDIA is the clear leader in the AI training market, the data center inference market is more open at the moment. Intel is overwhelmingly dominant in data center CPUs, has a myriad of new products coming, and is well-positioned because most data center inferencing is still done on existing CPUs today. Other competitors include Xilinx, Huawei (Ascend 910, available Q2 2019), AMD, Amazon (Inferentia), and Google's TPUs. Well-funded start-ups such as Graphcore, Wave Computing, Habana Labs, and others are also throwing their hats in the ring.
Qualcomm AI Day: Cloud AI 100
As a relative newcomer to this space, Qualcomm is betting on its strengths. It has a long history of delivering high-performance, low-power computing products and has been a leader in adopting advanced process nodes. Qualcomm is also targeting the inference market over the training market with its new Cloud AI 100 product line, as it sees that segment growing faster in the future. However, it will need to be careful not to repeat the mistakes it made with the Centriq data center processor two years ago. Then, too, it tried to target a specific niche to get a foot in the door, but with limited success; Qualcomm ultimately decided to move on and pursue higher return-on-investment (ROI) opportunities.
Qualcomm's announcement left out many details, including specs, pricing, and even any comparison charts for cost savings or AI capabilities. The only data points mentioned referenced the Snapdragon 855: the Cloud AI 100 is claimed to deliver more than 50x the AI capability of Qualcomm's newest mobile chip, and to exceed 100 TOPS. The Snapdragon 855 does 7 TOPS (spread across CPU, GPU, and DSP), which theoretically puts the Cloud AI 100 at up to 350 TOPS. At the moment, this seems dubious, not least because we do not even know what precision the figure refers to (INT8 or INT4). For comparison, NVIDIA launched its Tesla T4 chip last October, designed specifically for data center inference; the Tesla T4 is quoted at 130 TOPS for INT8 and double that for INT4. Overall, Qualcomm claims its product will be, on average, 10 times faster at inference tasks. As a dedicated inference ASIC, the Cloud AI 100 is being optimized solely for AI inferencing, which would allow data centers to save on costs compared with running inference on general-purpose CPU/GPU/DSP products.
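As a rough sanity check on these numbers, the back-of-envelope calculation below is a minimal sketch using only the figures quoted above (Qualcomm's 7 TOPS and >50x framing, and NVIDIA's published T4 peaks); the precision behind Qualcomm's claim is not disclosed, so the comparison is indicative only.

```python
# Back-of-envelope check of the implied Cloud AI 100 throughput,
# using only the figures quoted in the text (precision unknown).

snapdragon_855_tops = 7        # across CPU, GPU, and DSP
claimed_multiplier = 50        # Qualcomm's ">50x" claim

implied_cloud_ai_100_tops = snapdragon_855_tops * claimed_multiplier
print(f"Implied Cloud AI 100 peak: >{implied_cloud_ai_100_tops} TOPS")

# NVIDIA Tesla T4 published peaks for comparison
t4_int8_tops = 130
t4_int4_tops = t4_int8_tops * 2  # INT4 rate is double the INT8 rate
print(f"Tesla T4: {t4_int8_tops} TOPS (INT8), {t4_int4_tops} TOPS (INT4)")
```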
Qualcomm will begin sampling the Cloud AI 100 series in H2 2019, with final products launching in 2020. It is still too early to tell what kind of impact the Cloud AI 100 lineup will have in the market, given the lack of information on its benefits, and the inference market is still dominated by Intel's CPUs. It will therefore be very important for Qualcomm to launch with a prominent roster of customers in 2020 to show that it has the capabilities to succeed in this market. Given the limited number of data center operators, winning those customers will be a difficult battle. Most important will be the chip's specifications: higher TOPS is one thing, but TOPS/Watt matters just as much. Power efficiency is critical not just for small battery-powered devices but also for data centers, which consume massive amounts of power. In summary, Qualcomm has its work cut out for it, but also has the opportunity to wow the public if it can get this right.
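To illustrate why TOPS/Watt matters as much as raw TOPS, the sketch below compares two hypothetical accelerators; all throughput and power numbers are illustrative assumptions, not disclosed specifications for any product mentioned above.

```python
# Hypothetical comparison of two inference accelerators: raw TOPS vs TOPS/Watt.
# All numbers below are illustrative assumptions, not published specs.

accelerators = {
    "Chip A (higher peak TOPS)": {"tops": 300, "watts": 300},
    "Chip B (lower peak TOPS)":  {"tops": 200, "watts": 75},
}

for name, spec in accelerators.items():
    efficiency = spec["tops"] / spec["watts"]  # TOPS per watt
    print(f"{name}: {spec['tops']} TOPS at {spec['watts']} W "
          f"-> {efficiency:.1f} TOPS/W")

# At data center scale, the higher TOPS/W part does more inference work per
# unit of power and cooling, which is what drives operating cost.
```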