
Outside the Box

Dec. 3, 2019, 10:29 a.m. EST

Nvidia will dominate this crucial part of the AI market for at least the next two years

Inferencing will be the biggest driver of growth in artificial intelligence


By Daniel Newman


Bloomberg News
Nvidia CEO Jensen Huang earlier this year.

The principal tasks of artificial intelligence (AI) are training and inferencing. Training is the data-intensive process of preparing an AI model for production; inferencing is running the trained model against new data to produce results. Training an AI model ensures that it can perform its designated inferencing task (such as recognizing faces or understanding human speech) accurately and in an automated fashion.
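The two phases can be sketched in a few lines of code. This is a deliberately tiny, illustrative example (a single-feature perceptron, not anything resembling a production AI workload): training iterates over labeled data to fit parameters, while inference is a single cheap pass with those frozen parameters.

```python
def train(samples, labels, epochs=20, lr=0.1):
    """Training: fit parameters (weight w, bias b) on labeled data."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w * x + b > 0 else 0
            err = y - pred          # -1, 0 or +1
            w += lr * err * x       # nudge parameters toward the label
            b += lr * err
    return w, b

def infer(model, x):
    """Inference: apply the trained, frozen parameters to a new input."""
    w, b = model
    return 1 if w * x + b > 0 else 0

# Training is the compute-intensive part; inference is a cheap forward pass.
model = train([1.0, 2.0, 8.0, 9.0], [0, 0, 1, 1])
print(infer(model, 1.5), infer(model, 8.5))  # prints: 0 1
```

Real AI models repeat the same pattern at vastly larger scale, which is why training and inferencing place such different demands on hardware.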

Inferencing is big business and is set to become the biggest driver of growth in AI. McKinsey has predicted that the opportunity for AI inferencing hardware in the data center will be twice that of AI training hardware by 2025 ($9 billion to $10 billion, vs. $4 billion to $5 billion today). In edge-device deployments, the market for inferencing will be three times as large as that for training by the same year.

For the overall AI market, the market for deep-learning chipsets will increase from $1.6 billion in 2017 to $66.3 billion by 2025, according to Tractica forecasts.

I believe Nvidia (NVDA) will realize better-than-expected growth due to its early lead in AI inferencing hardware accelerator chips. That lead should last for at least the next two years, given industry growth and the company’s current product mix and positioning.

Nvidia’s AI business at a glance

In most server- and cloud-based applications of machine learning, deep learning and natural language processing, the graphics processing unit, or GPU, is the predominant chip architecture used for both training and inferencing. A GPU is a programmable processor designed to quickly render high-resolution images and video, originally used for gaming.

Nvidia’s biggest strength — and arguably its largest competitive vulnerability — lies in its core chipset technology. Its GPUs have been optimized primarily for high-volume, high-speed training of AI models, though they also are used for inferencing in most server-based machine learning applications. Today, that GPU technology is a significant competitive differentiator in the AI inferencing market.

Liftr Cloud Insights has estimated that the top four clouds in May 2019 deployed Nvidia GPUs in 97.4% of their infrastructure-as-a-service compute instance types with dedicated accelerators.

GPU vs. CPU: Nvidia’s challenge and opportunity

While GPUs have a stronghold on training and much of server-based inferencing, CPUs rule in edge-based inferencing.

What’s the difference between GPUs and CPUs? In simple terms, a CPU is the general-purpose brains of the computer, able to handle many kinds of tasks, while a GPU is a specialized processor that performs a narrower set of simple operations massively in parallel and very quickly. CPUs currently dominate in adoption. In fact, McKinsey projects that CPUs will account for 50% of AI inferencing demand in 2025, with ASICs, which are custom chips designed for specific activities, at 40%, and GPUs and other architectures picking up the rest.
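The distinction can be sketched in code. This is an illustrative analogy only, not tied to any real GPU API: GPU-friendly work applies one simple operation uniformly across many data elements (trivially parallelizable), while CPU-style work mixes different, branching tasks.

```python
def gpu_style(data):
    """GPU-friendly pattern: the same cheap kernel over every element.
    Each element is independent, so all of them could run in parallel."""
    return [x * 2.0 for x in data]

def cpu_style(tasks):
    """CPU-friendly pattern: heterogeneous, branching tasks handled a
    few at a time by a general-purpose core."""
    results = []
    for kind, x in tasks:
        if kind == "scale":
            results.append(x * 2.0)
        elif kind == "offset":
            results.append(x + 1.0)
        else:
            results.append(x)
    return results

print(gpu_style([1.0, 2.0, 3.0]))                    # [2.0, 4.0, 6.0]
print(cpu_style([("scale", 1.0), ("offset", 2.0)]))  # [2.0, 3.0]
```

Deep-learning inference is dominated by the first pattern, which is why GPUs excel at it in the data center; edge devices, with tighter power and cost budgets, often favor CPUs or purpose-built ASICs instead.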

The challenge: While Nvidia’s GPUs are extremely capable for handling AI’s most resource-intensive inferencing tasks in the cloud and server platforms, GPUs are not as cost-effective for automating inferencing within mobile, IoT, and other “edge computing” uses.

Various non-GPU technologies—including CPUs, ASICs, FPGAs, and various neural network processing units—have performance, cost, and power-efficiency advantages over GPUs in many edge-based inferencing scenarios, such as autonomous vehicles and robotics.

The opportunity: The company no doubt recognizes the much larger opportunity resides in inferencing chips and other components optimized for deployment in edge devices. But it has its work cut out to enhance or augment its current offerings with lower-cost, specialty AI chips to address that important part of the market.

Nvidia continues to enhance its GPU technology to close the performance gap vis-à-vis other chip architectures. One notable milestone was the recent release of AI industry benchmarks showing Nvidia technology setting new records in both training and inferencing performance. The company’s forthcoming AI-optimized Jetson Xavier NX hardware module promises server-class performance in a small, low-cost, low-power package with flexible deployment for edge applications.

The competitive landscape

With an annual revenue run rate nearing $12 billion, Nvidia retains a formidable lead over other AI-accelerator chip manufacturers, especially AMD (AMD) and Intel (INTC).

Intel, however, has upped its game in AI inference with the release of multiple specialty AI chips and the recent announcement that Ponte Vecchio, the company’s first discrete GPU, should hit the market in 2021. A range of cloud, analytics and development-tool vendors has also flocked into the AI space over the past several years.

Nvidia’s early lead can be attributed to the company’s focus, as well as the deep software integration that enables developers to rapidly develop and scale models on its hardware. This is why many of the hyperscalers (Alphabet’s (GOOGL) Google Cloud, Microsoft’s (MSFT) Azure, Amazon’s (AMZN) AWS) also deliver AI inference capabilities on their infrastructure based on Nvidia technology.

In edge-based inferencing, where AI executes directly on mobile, embedded and other edge devices, no single hardware or software vendor is expected to dominate, and Nvidia stands a very good chance of pacing the field. However, competition is intensifying from many directions. In edge-based AI inferencing hardware alone, Nvidia faces competition from dozens of vendors that either now provide or are developing AI inferencing hardware accelerators. Nvidia’s direct rivals, which are backing diverse AI inferencing chipset technologies, include hyperscale cloud providers AWS, Microsoft, Google, Alibaba (BABA) and IBM (IBM); consumer cloud providers Apple (AAPL), Facebook (FB) and Baidu (BIDU); semiconductor manufacturers Intel, AMD, Arm, Samsung, Qualcomm (QCOM), Xilinx (XLNX) and LG; and a staggering number of China-based startups and technology companies such as Huawei.

Nvidia’s strong growth prospects

The significant opportunities tied to the growth of AI inferencing will drive innovation and competition to develop more powerful and affordable solutions to leverage AI. With the deep resources and capabilities of most of the aforementioned competitors, there is certainly a possibility of a breakthrough that could rapidly shift the power positions in AI inferencing. However, at the moment, Nvidia is the company to beat, and I believe this strong market position will continue for at least the next 24 months.

Beyond its increased focus on low-cost, edge-based inferencing accelerators and high-performance hardware for all AI workloads, Nvidia provides widely adopted algorithm libraries, APIs and ancillary software products designed for the full range of AI challenges. Any competitor would need to do all of this better than Nvidia. That would be a tall task, but certainly not insurmountable.

Daniel Newman is the principal analyst at Futurum Research. Follow him on Twitter @danielnewmanUV. Futurum Research, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the tech and digital industries. Neither he nor his firm holds any equity positions with any companies cited.

