Shakeout Coming in AI Chip Start-Up Market in 2020

Seoul, Hong Kong, New Delhi, Beijing, London, Buenos Aires, San Diego

January 20th, 2020

The rapid adoption of AI has spurred the development of a new breed of AI chips with the performance needed to meet the high-compute, low-power requirements of machine learning and deep learning applications. There are now more than 80 start-up companies around the world developing AI chips for training and inference.

These start-ups are developing chips tailored to specific AI workloads and use cases, using new processor architectures that address the so-called “memory bottleneck” by striking a much better balance between memory and on-chip compute. As a result, they hope to deliver dramatic improvements in performance and efficiency compared to today’s CPUs and GPUs.
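To make the memory-bottleneck argument concrete, here is a minimal roofline-style sketch; all peak-compute and bandwidth figures are hypothetical round numbers chosen for illustration, not figures for any chip mentioned in this report.

```python
# Hypothetical roofline-style illustration of the "memory bottleneck":
# a layer is memory-bound whenever its arithmetic intensity (FLOPs per
# byte moved) is below the chip's compute-to-bandwidth ratio.

PEAK_TFLOPS = 100.0        # assumed peak compute, TFLOP/s
DRAM_BW_GBS = 900.0        # assumed off-chip DRAM bandwidth, GB/s
SRAM_BW_GBS = 20_000.0     # assumed on-chip SRAM bandwidth, GB/s

def arithmetic_intensity(m, n, k, bytes_per_elem=2):
    """FLOPs per byte for an (m x k) by (k x n) matrix multiply in FP16."""
    flops = 2 * m * n * k
    bytes_moved = bytes_per_elem * (m * k + k * n + m * n)
    return flops / bytes_moved

def attainable_tflops(intensity, bandwidth_gbs):
    """Roofline: attainable compute is capped by peak or by bandwidth."""
    return min(PEAK_TFLOPS, bandwidth_gbs * intensity / 1000.0)

# A batch-1 inference GEMM (1 x 1024 activations, 1024 x 1024 weights).
ai = arithmetic_intensity(1, 1024, 1024)
print(f"arithmetic intensity: {ai:.2f} FLOPs/byte")
print(f"fed from DRAM: {attainable_tflops(ai, DRAM_BW_GBS):.2f} TFLOP/s")
print(f"fed from SRAM: {attainable_tflops(ai, SRAM_BW_GBS):.2f} TFLOP/s")
```

With these assumed numbers, the same matrix multiply reaches well under 1 TFLOP/s when weights stream from DRAM but roughly 20 TFLOP/s when they sit in on-chip SRAM, which is the imbalance these new architectures aim to remove.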

AI-friendly memories being adopted by start-ups include near-chip memories (DRAM and NAND flash), on-chip memories (SRAM) and emerging non-volatile (NV) memories. “Some of these start-ups also use analog memory technologies, which promise to reduce power consumption from milliwatts to microwatts,” said Gareth Owen, Associate Director of Emerging Technologies at Counterpoint Research. “This is ideal for the always-on inference market, for example smart speakers, potentially increasing battery life from days to months. However, there are still many challenges, particularly associated with mass production.”
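As a rough, back-of-the-envelope illustration of the milliwatts-to-microwatts point, the sketch below estimates always-on runtime from a small battery at two assumed power draws; the battery capacity and both draw figures are hypothetical, not measurements from any of the start-ups discussed here.

```python
# Hypothetical battery-life arithmetic for always-on inference:
# the same cell lasts a few days at tens of milliwatts but close to a
# year at a few hundred microwatts. All figures are assumptions.

BATTERY_MWH = 4000.0   # roughly a 1000 mAh cell at ~4 V (assumed)

def runtime_days(avg_draw_mw):
    """Continuous runtime in days for a given average draw in milliwatts."""
    return BATTERY_MWH / avg_draw_mw / 24.0

for label, draw_mw in [("digital always-on inference, ~50 mW", 50.0),
                       ("analog in-memory inference, ~500 uW", 0.5)]:
    print(f"{label}: {runtime_days(draw_mw):.0f} days")
```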

The current AI landscape is still in flux. Although a host of AI start-ups are developing new hardware architectures and software stacks, client companies do not want to be locked into any specific AI acceleration architecture, particularly with consolidation likely in the next few years. They also want to ensure that solutions are scalable.

In the training market, competition will be particularly tough. Start-ups will have to compete against Nvidia as well as home-grown chips from potential customers such as Amazon, Alibaba, Baidu, Facebook, Google and Tencent. In addition, the cloud training market will be smaller than the inference market.

A small number of vendors, such as Graphcore and Habana Labs, have already launched commercial products, mostly PCIe plug-in accelerators. However, they are competing against Nvidia, which will be tough to beat for several reasons, particularly its strong software ecosystem. As a result, many start-ups are staying out of Nvidia’s way and focusing on the inference market.

In contrast, the edge-based inference market is still in its infancy, with many new players and new approaches. It represents a much bigger opportunity than the cloud data center market, with demand for AI chips in potentially billions of industrial and consumer devices, and hence a wide range of requirements. Like the server chip vendors, however, edge-based chip start-ups will need to focus on very specific markets, differentiated primarily by processing capability, power and cost requirements.

Key start-ups in the inference market include Gyrfalcon, Efinix and Syntiant, which have already launched commercial products, and Flex Logix, Mythic and Blaize, which are sampling and will offer commercial products later this year.

However, these start-ups face their own set of challenges. To be successful, they must offer chipsets that are highly scalable and flexible, strike the right balance between performance and power budget, and provide strong ecosystem support and a comprehensive software stack. Time to market will also be key.

They will also face tough competition from established players such as Intel (which now owns Altera), Nvidia and Xilinx, which will be under pressure to acquire emerging winners in order to maintain and increase overall market share. This has already started to happen with Intel’s recent acquisition of Habana Labs for US$2 billion.

At present, it is not clear who the winners and losers will be, but the industry will not accommodate many new chip suppliers. A few will be acquired, but most will disappear.

“We believe that there will be consolidation in the market during the next two years, due predominantly to a mismatch between actual performance from working silicon and predicted theoretical performance,” said Peter Richardson, Research Director at Counterpoint Research. “Many of these start-ups have not developed software in tandem with hardware and thus are not able to perform accurate performance modelling, resulting in shortfalls in the performance actually achieved,” he added.

Ultimately, the differentiator may not be the company with the best hardware solution but the one with the best software/hardware mix. Established players such as Intel and Nvidia will continue to dominate the market landscape due to the robust developer ecosystems they have created around their AI chipsets.

The full report is available for purchase on our research portal. Please reach out to us at press(at)counterpointresearch.com with any questions regarding our latest in-depth research, insights or press enquiries.

Background:

Counterpoint Technology Market Research is a global research firm specializing in technology products in the TMT industry. It serves major technology and financial firms with a mix of monthly reports, customized projects and detailed analysis of the mobile and technology markets. Its key analysts are industry experts with an average tenure of 13 years in the high-tech sector.

Analyst Contact:

Gareth Owen

Counterpoint Research
press(at)counterpointresearch.com

Author

Gareth Owen

Gareth has been a technology analyst for over 20 years and has compiled research reports and market share/forecast studies on a range of topics, including wireless technologies, AI & computing, automotive, smartphone hardware, sensors and semiconductors, digital broadcasting and satellite communications.
