Semiconductor world - CPU/GPU Wars

Note: This is not about Indian listed cos
Folks,
I see some interest in markets outside India in this forum. Harsh Beria saw me mention NVDA and AMD in a thread and asked if I could explain the industry landscape a bit. I decided to put it up as a post to benefit others. Only the industry landscape. No financials; I am not good at reading financials.

If you find this thread not suitable for a valuepickr post, @harsh.beria93, it is your fault :blush:. I don’t mind deleting this post.
Edit: I am deleting the doc and expanding the contents in this thread.

Disclosure: holding some shares of AMD for 4-5 years.


Introduction To Industry in Focus

Firstly, we are talking about the semiconductor industry landscape. Within that, this author’s knowledge is restricted to the brains of the machine (not diodes/LEDs/modems etc.)

  1. CPU (Central Processing Unit) – Desktops and laptops. Leaving mobile phones out of the picture; I will explain why later. This is the brain of a computer.
  2. GPU (Graphics Processing Unit) – We are focusing only on discrete GPUs, specifically those that go into datacenters and are used for compute, not the integrated GPUs that go into laptops. Discrete GPUs in datacenters are the real workhorse. A decade ago this chip only helped with graphics; now it is used for computation (math, in short). GPUs enabled the whole crypto-mining craze. https://www.researchgate.net/figure/Graph-detailing-incredible-performance-increase-over-the-last-several-years-of-the-GPU_fig2_26547680

Why should you care?

Secular growth possibility: It would not be a big deal during normal times. But if you think covid is disrupting the workplace like never before and that online work is the future, then you cannot miss the brains behind every single piece of cloud infrastructure – the processors. Games, virtual reality, cloud, computing. Research folks know this, especially if you had to book slots on supercomputers in the 90s and 2000s to do some big compute. If the growth is fast enough, no single company will be able to supply all datacenter processor needs. To top it off, Intel the giant is faltering in its ‘fab’.

Terminologies

Fabs
Processors are all made from sand. Yes, it takes just sand to make these things. Some high-end technology bridges the gap from sand to CPU/GPU. The place where this happens is called a fab - https://en.wikipedia.org/wiki/Semiconductor_fabrication_plant

OEM
Original Equipment Manufacturers. The guys that put together laptops/desktops: HP/Dell/Lenovo.

Process Node
https://www.howtogeek.com/394267/what-do-7nm-and-10nm-mean-and-why-do-they-matter/ - This is important – understand what a process node is before proceeding.

Do not miss reading about process nodes. In short, the smaller your process node, the more efficient your processor is for the same performance. As a corollary, with the same design you could stay in the same power envelope as your competitor and deliver more performance. The leading players in fabs are Intel, TSMC and Samsung (a recent entry).

Data Center
A processing farm. The infrastructure behind your cloud. https://www.youtube.com/watch?v=kfvbCggY_nI

Perf/watt
How much performance you get per watt.

Perf/$
People talk about perf/watt mainly because of the mobile space, where battery life is limited.
With datacenters, people are now talking about perf/$: how much it costs to get a certain level of performance.
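To make the two metrics concrete, here is a minimal sketch in C with purely made-up numbers (the chip names, scores, wattages and prices are hypothetical, not benchmark data):

```c
/* Minimal sketch of perf/watt and perf/$ with made-up numbers.
 * "perf" is an abstract benchmark score; all figures are hypothetical. */
#include <stdio.h>

struct chip {
    const char *name;
    double perf;   /* benchmark score (arbitrary units)   */
    double watts;  /* power drawn under that workload (W) */
    double price;  /* street price (USD)                  */
};

int main(void) {
    struct chip chips[] = {
        { "Chip A", 1000.0, 125.0, 500.0 },  /* hypothetical */
        { "Chip B",  900.0,  95.0, 350.0 },  /* hypothetical */
    };

    for (int i = 0; i < 2; i++) {
        printf("%s: perf/watt = %.2f, perf/$ = %.2f\n",
               chips[i].name,
               chips[i].perf / chips[i].watts,   /* what mobile buyers optimise for     */
               chips[i].perf / chips[i].price);  /* what datacenter buyers optimise for */
    }
    return 0;
}
```

Note how a chip can lose on raw perf yet still win on perf/$ – that is exactly the argument datacenter buyers run.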

ISA
Instruction Set Architecture – the language that a processor understands. Intel’s language is x86 and ARM’s is, well, ARM. The ISA impacts design, and the ISA itself is licensed. Apple has licensed the ARM ISA (lifelong or something like that); they design their own processors on top of the ARM ISA, and they design better-performing ARM processors than ARM itself! And design is what we are paying for. Some designs are power hungry and others use less power for ‘similar’ perf; the ISA plays a major role here. Lines are blurring now because ARM, which was known for low power consumption, also makes very complex CPUs now (the recent X-series announcement) that will consume more power. We could go deeper here, but we will stick to the macro view. I will avoid this part until we talk about datacenters and perf/$ numbers.
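To see what “a different language” means in practice, here is one tiny C function and, in the comments, the kind of machine code a compiler typically emits for each ISA. The assembly is illustrative (roughly gcc -O2 style); exact output varies with compiler, version and flags:

```c
/* The same C source, "spoken" in two different ISAs.
 * The assembly in the comments is illustrative; exact instructions
 * depend on the compiler and its flags. */
int add3(int a, int b, int c) {
    return a + b + c;
}

/* x86-64 (Intel/AMD) -- variable-length instructions, args in edi/esi/edx:
 *     add3:
 *         lea  eax, [rdi+rsi]
 *         add  eax, edx
 *         ret
 *
 * AArch64 (ARM) -- fixed 32-bit instructions, args in w0/w1/w2:
 *     add3:
 *         add  w0, w0, w1
 *         add  w0, w0, w2
 *         ret
 *
 * Same program, two "languages"; a binary built for one ISA
 * will not run on the other.
 */
```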

Core
Each core is one ‘brain’. One can put multiple brains in a die and it becomes a multi-core processor.

Thread
Multiple execution paths within a core that share some stages. Imagine jobs executing in a pipeline: say there are 10 stages and 2 execution paths (threads), with 3 of the stages and one level of memory shared between them.

Workload
What a processor runs. Different workloads sweat different processors in different ways. Some workloads use all cores, some workloads extract the max out of one core.
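To tie Core/Thread/Workload together, here is a rough toy sketch of my own (not from any real product): the same “sum an array” job written once as a single-threaded workload and once spread across cores with POSIX threads. Array size and thread count are arbitrary; compile with -pthread.

```c
/* Sketch: one workload run on a single execution path, then split
 * across several cores using POSIX threads. Numbers are arbitrary. */
#include <pthread.h>
#include <stdio.h>

#define N        (1 << 22)   /* 4M elements                 */
#define NTHREADS 4           /* pretend the CPU has 4 cores */

static double data[N];

struct slice { int start, end; double sum; };

static void *partial_sum(void *arg) {
    struct slice *s = arg;
    s->sum = 0.0;
    for (int i = s->start; i < s->end; i++)
        s->sum += data[i];
    return NULL;
}

int main(void) {
    for (int i = 0; i < N; i++) data[i] = 1.0;

    /* Single-core style workload: one execution path does everything. */
    struct slice whole = { 0, N, 0.0 };
    partial_sum(&whole);

    /* All-core style workload: split the array, one thread per core. */
    pthread_t tid[NTHREADS];
    struct slice part[NTHREADS];
    double total = 0.0;
    for (int t = 0; t < NTHREADS; t++) {
        part[t].start = t * (N / NTHREADS);
        part[t].end   = (t + 1) * (N / NTHREADS);
        pthread_create(&tid[t], NULL, partial_sum, &part[t]);
    }
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += part[t].sum;
    }

    printf("single-thread sum = %.0f, multi-thread sum = %.0f\n",
           whole.sum, total);
    return 0;
}
```

On a many-core chip the second version finishes roughly NTHREADS times faster, provided the work splits cleanly; software that is not written this way leaves the extra cores idle.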

Edge compute
The opposite of cloud compute. First, many tasks will move to the cloud; then, the expectation is that there will be an outward movement to ‘edge’ devices.


Market Segments

Why does this make such a good discussion? Because there is a window of change opening right now, thanks to Intel’s misses and growing cloud needs.

With it come companies that can gain market share and grow. Right in the mix are two companies – NVDA and AMD. We are in the early stages, and the stock prices are reflecting this change. How much is priced in? That needs an analysis of the current + future market + the companies’ balance sheets.

PLAYERS

NVDA

Predominantly manufactures GPUs. Has been trying its hand at CPUs for quite some time (https://en.wikipedia.org/wiki/Project_Denver) but has had a difficult time. Recently announced it is acquiring ARM. ARM licenses its designs/ISA to companies; 95% of smartphones in the world have ARM-designed CPUs. I will talk about ARM later when I discuss the datacenter. Also, NVDA is at the forefront of artificial intelligence technology.

They have been able to beat the middleman OEM ecosystem through:

  • Great Leadership - Jensen Huang - Founder + CEO
  • Better marketing
  • Providing a software stack that makes customers want to come to them – this has customers telling the OEM to get them NVDA GPUs, so OEMs cannot play NVDA and AMD against each other on price.
  • The open-source community tends to be unhappy with them – they make you pay for everything (and their code changes to open source are along similar lines) - https://www.youtube.com/watch?v=IVpOyKCNZYw
  • They acquired ARM recently – this gives them the humongous market that ARM caters to. ARM has 95% mobile and 90% embedded market share. You will hardly hear about it because ARM does not manufacture processors – it designs them and sells IP. This gives a huge market to NVDA’s edge-compute plans; NVDA could tag along with ARM processors – how they do it remains to be seen.
  • The acquisition of ARM gives them the option to ship an entire package to the datacenter. When it comes to the datacenter, compute is the real deal and the GPU does it; the CPU is mostly shifting data in and out of the GPU.
  • They think very far ahead – in ~2009 they already wanted compute in edge devices/cellphones – and now they have acquired ARM.

AMD

Ever since they acquired ATI (the GPU maker), AMD has been the sole company that actually makes both solid GPUs and solid CPUs. Nope, not Intel. Not NVIDIA (not until it acquired ARM two weeks back). AMD makes both, competing with Intel and NVDA from the low end to the high end.

  1. Historically, has been competing with big brother Intel
    a. Beaten black and blue by Intel + its own execution mistakes
    b. Intel’s bad business practices, for which Intel had to pay a penalty - https://arstechnica.com/information-technology/2009/11/intel-and-amd-bury-the-hatchet-under-125-billion-in-cash/
    c. A couple of major bugs in its processors + it is very difficult to compete with Intel; Intel will sell you out of the game – cannot find the link now, an old one from the 2008-2009 timeframe.
  2. A huge turnaround from nowhere in 2016 – they are now at 20% market share in desktop and 20% in notebooks
    https://www.pcworld.com/article/3569437/amds-notebook-pc-share-climbs-to-an-all-time-high-of-nearly-20-percent.html
  3. Came back under Lisa Su’s leadership (an MIT PhD) – and thoroughly beat Intel in 2020 - https://www.youtube.com/watch?v=-5-k5l8aaKY
  4. The foundation for its latest tech was laid somewhere around 2013-2014 (around the time Jim Keller was there), thanks to which they have been able to scale up the number of cores in a CPU. A ground-up change in how their architecture is designed.
  5. Usually the one with the best price/performance
  6. A ‘not so great software’ reputation – could also be because AMD chips were usually used in budget laptops with cheap peripherals
  7. They already have CPU + GPU for server racks - what are they doing? Will NVDA beat them to it?

INTEL

You know about intel, so no intro. I will mention points relevant to our discussion

  1. Intel had been milking the desktop space as a monopoly for the 5 years up to 2019, because AMD was doing nothing to dent its leadership – they would literally dictate how much perf you got in a given year.
  2. 98% market share in the datacenter – this is the cash cow for Intel. Stability commands a premium. Intel grew the ecosystem and they dominate it. AMD was getting something here and there, but Intel is the boss.
  3. Historically – Their fabs have had the best yield

What has happened in 2020?

  1. AMD whacked Intel in performance and power (watts) with its CES 2020 announcements in desktop and mobile – YouTube has videos.
    It was so bad that a mobile AMD chip was being compared to an Intel desktop chip. At CES 2020, Intel was busy showcasing third-party products. https://www.youtube.com/watch?v=-5-k5l8aaKY
    • What this does: earlier, OEMs would pair AMD CPUs with second-grade/budget laptops and non-premium parts. For the first time in history (maybe the second – some 10 years ago may have been the first), AMD is in many high-end laptops. This means customers will form a better impression of AMD. https://www.youtube.com/watch?v=4V3uB12mRrU
    • Linus Torvalds moves to AMD threadripper - https://www.overclock3d.net/news/cpu_mainboard/linux_creator_linus_torvalds_moves_to_amd_after_15_years_of_intel_systems
    • They are throwing cores at customers – a lot of software is not yet ready to make use of this. It is a challenge thrown to SW developers. Personal view: the last time I saw something like this was when the GPU challenged SW developers to use its full power for compute, some 13 years ago. 64 cores/128 threads in desktops!!
  2. AMD’s fab partner TSMC came first with the 7nm node. They have announced 5nm for 2021 and are moving on to 3nm.
  3. Intel fumbled around with their process node and finally gave an answer a few weeks back – but they still cannot beat AMD in price/perf, and AMD has quite a few high-end CPUs from last year’s Zen 2 up there with Intel’s current top end - https://www.tomshardware.com/uk/features/amd-vs-intel-cpus
  4. AMD Zen 3 is going to be announced in October – it is expected to beat current Intel parts until the middle of next year (when Intel comes back with an answer) - https://www.youtube.com/watch?v=vm3VvEYyPrw
  5. NVDA acquired ARM
  6. In September, NVDA announced GPUs matching the perf of the previous high-end GPU at 1/3 the price. Also, the new product announcement is usually in October; this time they know AMD can compete, so they are creating an atmosphere of low supply + giving rebates to OEMs. The GPU war is on. (https://www.youtube.com/watch?v=SxtfNcm45xk)
  7. The current fastest supercomputer uses the ARM (now NVDA) ISA - https://www.top500.org/lists/top500/2020/06/ - so ARM is capable not just in power-starved mobile phones but also in high-performance applications. They have recently started a new range of application processors that focuses on performance.
  8. Amazon is making progress with its ARM-based Graviton in the datacenter – very good perf/$ numbers, and it is available in EC2 instances https://www.anandtech.com/show/15578/cloud-clash-amazon-graviton2-arm-against-intel-and-amd/9
  9. Datacenter has not seen such incursions in decades

Outlook

Cloud growth https://www.marketsandmarkets.com/Market-Reports/cloud-computing-market-234.html
Humongous cloud gaming market growth expected https://www.marketsandmarkets.com/Market-Reports/cloud-gaming-market-62740366.html

CPU

AMD

  • AMD has better performance at or below the same price
  • It is coming out with Zen 3 now – some perf estimates here https://www.youtube.com/watch?v=vm3VvEYyPrw – I cite this same channel again and again because his reasoning makes sense to me given what I know, and his predictions check out in the ‘future’.
  • Thanks to TSMC, AMD will come out with 5nm processors in 2021 – before Intel’s equivalent arrives in 2022.
  • Intel’s fabs have lagged – Serious management issues. They struggled to get 7nm equivalent out. Will they make it to 5nm equivalent in 2022?
  • Will AMD’s market gains in desktop and notebook translate into datacenter market growth? This is the growth area and this is the window for AMD – there are some signs https://www.datacenterknowledge.com/amd/amd-gains-server-chip-market-share-intel-s-expense

NVDA

  • Amazon is deploying ARM-based CPUs (ARM is now owned by NVDA).
    “ Amazon was able to deliver on its promise of 40% better performance per dollar, and it’s a massive shakeup for the AWS and EC2 ecosystem.” - https://www.anandtech.com/show/15578/cloud-clash-amazon-graviton2-arm-against-intel-and-amd/9
  • An ARM-based machine is the fastest supercomputer at this point in time.
  • Apple is moving to its own ARM-ISA-based silicon – this proves that ARM designs can grow into laptops. How about desktops (stretching here)? This disruption can happen; ARM already has a strong SW ecosystem.

INTEL

  • Defending against AMD in CPU
  • Customers do not switch fast in datacenters

GPU

NVDA

AMD

  • AMD is matching NVDA in GPU hardware. There is pressure on NVDA
  • But the GPU war is an old one; there will be ups and downs here. Traditionally NVDA has had the upper hand in sales because of its software stack quality + leadership. We must monitor the AMD resurgence.

INTEL

  • Hardly any GPU footprint in datacenters

Personal banter

My bet is on Intel losing more and more of the market + the market itself outgrowing Intel (the secular growth possibility). Why? Because Intel, historically, had great fabs (do read up about fab yield) and managed to stay ahead through unfair business practices time and again that went unnoticed earlier (https://www.youtube.com/watch?v=VL1RjwVAnzY).

In the tech space, if you make your competitor miss the bus by one year, they will lag for years. That is what Intel did. When Intel filed a lawsuit targeting NVDA’s chipset business, NVDA just left that business. https://www.cnet.com/news/intel-takes-chipset-dispute-with-nvidia-to-court/

But now, we have very big players against it

  • Amazon-backed Annapurna Labs
  • NVDA is not small anymore
  • The regulatory environment is much ‘better’ now (from the Mike Bruzzone interview)
  • A ballooning cloud market.
  • Leadership
    • NVDA and AMD have stellar leaders from semiconductor industry.
    • NVDA - Jensen is the founder, with an electrical engineering degree from Stanford. These guys changed the compute landscape and brought the supercomputer to the desktop.
    • AMD - Lisa Su - MIT PhD (electrical engineering all the way)
    • Intel’s Bob Swan - MBA. Not belittling MBAs, but he will need some trustworthy, highly capable lieutenants with less ego to make the technical calls. In a high-tech industry like semiconductors, this is a problem. There is some good discussion about such leadership in this video https://www.youtube.com/watch?v=O4DgXtxkZNg

Intel has money though. Truckloads of it. They are currently waging a price war to match AMD’s perf advantage – this is playing defense. And they are only now at 7nm (they call it 10nm++ or something). If they do not come up with a 5nm equivalent in 2022, then there is no offense till 2024. They may have to skip a node to come back and compete? Until then, use the existing entrenched ecosystem and survive. The lack of leadership and a huge org are not helping either.

https://www.ft.com/content/bb1069c6-a273-4d43-9cf6-0340ca88f711

Amazon is trying out ARM-based CPUs; it acquired Annapurna Labs for this purpose. Amazon has money and it runs its own datacenters. This is a combination not seen before in datacenters (running the datacenter + designing the CPUs). So, if Ampere can deliver, it is a win for NVDA (Ampere licenses ARM designs) and a smaller market for Intel.

Interesting times ahead with few players.
A few questions that one can dig into regarding an investment:

  1. What is the market share + growth potential?
  2. How healthy are the balance sheets of AMD & NVDA?
  3. Should one instead just buy AMD + Intel + NVDA in proportion to the bet – the two big guys (CPU & GPU) in the datacenter and the underdog (AMD)?
  4. Datacenter change will take years because of its inherent nature – 5-10 years to topple Intel (if that happens at all). So any datacenter-based stock bet has to have a long horizon. Maybe allocate more every year as the story pans out?
  5. Another proxy play is TSMC

Thanks for initiating this wonderful thread. I found these videos also quite useful in understanding ARM’s business and how NVIDIA can benefit if they can integrate ARM.


Nice effort in this thread. This is an industry I follow as some friends work in these companies. To me the most significant blindness of the market was in 2016, when the 1st-gen Zen architecture showed a lot of promise and wasn’t even completely under wraps – it was demoed at E3 in mid-2016 (the SoCs hit the market only in mid-2017) – and yet AMD was languishing under $2/share. In hindsight it looks silly, but the truth is even people who were working on the chip were not convinced enough to buy the shares.

As far as the GPU game goes, NVDA has turned the game on its head with the power and pricing of the RTX 30 series. I personally can’t wait to get one. As a macro theme, the GPU has been the compute game to be in as we do more and more AI computation going forward. We are going to bypass mother nature and grow our cerebral cortex inorganically in the datacenter. The valuation now, though, isn’t cheap. Btw, I don’t see any mention of Google’s TPU in this thread - that’s definitely worth looking into. We have been using a Jetson Nano for something recently which could probably be done better with Google’s Coral Edge TPU (no availability).


Thank you all for the great insights here.

Do we currently have any Indian companies listed in this space that we feel positively about? I don’t know if Sasken would fall under this domain particularly.

This should be a major structural story as we move ahead.

AFAIK, only IBM makes server CPUs other than the trio I mentioned – IBM’s POWER CPUs (non-x86).

Yes, the TPU is an area where the competition is different. I did not bother about it, thinking companies like Google are yet to make enough of a mark to dislodge the GPU from datacenters. I need to read up on the current state of datacenters.

Also, regarding NVDA RTX30, wait for RDNA2 if you can. It is only a month away. Don’t fall for marketing :slight_smile:

To give a perspective on how NVDA was milking customers due to the lack of competition before AMD upped the game this year: some smart folks sold their 2080 Ti two months back at $1000. Even after the RTX 30 release, some less aware folks were buying at that price. NVDA could very well be creating a fake shortage of high-end cards to spike ‘demand’ (scarcity).

https://www.youtube.com/watch?v=2upZSyQHwNA

Don’t mistake me, I would have bought NVDA shares if I was market aware then.

Note that just in September the yield was bad. But now, are they stuffing the channel as fast as they can?

https://twitter.com/Avery78/status/1312820491833364480

Edit: for everyone’s information, SW is a differentiator. Those who like CUDA will go for the RTX 3000 series.

Another perspective: their marketing is great (really, they are good). A long time back, they actually marketed this GPU to people and came back to make a spoof video about it afterwards. Customers are not that tolerant nowadays, I think mostly due to the cost that a GPU commands and the more serious applications today compared to the earlier days? I need to talk to some very old GPU folks.

Zen 3 Ryzen revealed (Ryzen is the desktop CPU line)
A 19% increase in IPC (instructions per clock – google it) over Zen 2 at the same socket power and the same TDP. That means better perf per watt, which is very important for the datacenter market. Waiting on the EPYC Milan server CPUs.
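A back-of-the-envelope reading of that claim (only the 19% figure comes from the announcement; the clock and wattage below are made-up placeholders):

```c
/* Back-of-the-envelope: performance is roughly IPC x clock frequency.
 * If IPC rises 19% at the same clock and the same power draw,
 * perf/watt rises by the same 19%. Baseline numbers are made up. */
#include <stdio.h>

int main(void) {
    double clock_ghz = 4.0;    /* hypothetical, unchanged across generations */
    double watts     = 105.0;  /* same socket power / TDP                    */
    double ipc_zen2  = 1.00;   /* normalised baseline                        */
    double ipc_zen3  = 1.19;   /* the +19% IPC claim                         */

    double perf_zen2 = ipc_zen2 * clock_ghz;
    double perf_zen3 = ipc_zen3 * clock_ghz;

    printf("perf gain      : %.0f%%\n", (perf_zen3 / perf_zen2 - 1.0) * 100.0);
    printf("perf/watt gain : %.0f%%\n",
           ((perf_zen3 / watts) / (perf_zen2 / watts) - 1.0) * 100.0);
    return 0;
}
```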

What matters for us: AMD is finally pricing its products above Intel’s. They are making a statement. I have never seen this happen, at least in the 10 years I have been aware of this space. They are pretty much saying that the argument is over.

Also, there is a tease of what is coming your way with Radeon.
One can compare the fps numbers with what is presented here. Please remember, the numbers AMD teased could have been with Ryzen 5000, which could spike the fps numbers a little. Summary: we could be getting something that competes with the RTX 3080 but sits some 15% below the RTX 3090.

@phreakv6 You touched on something I have been asking myself for a long time. I have pondered why I did not notice AMD as undervalued in mid-2017 when Zen 1 came in.

The answer I can come up with is: Bulldozer/history. There were a lot of expectations from the Bulldozer architecture, but it did not really pan out. Architecture is one thing, but the sales numbers on the ground are a mix of performance + execution by marketing & sales + how smooth the new architecture’s adoption is – there were BIOS and other issues during Zen 1. I personally think that for those in the semiconductor world, late 2018 was the time they should have definitely noticed, looking at the improving stability. Even then, one could not have foreseen Intel’s blunders in the fab + the Zen architecture scaling so well in the same power envelope (with help from TSMC). Who would have thought that the guys who gave us Moore’s law would sit on it, trying to disprove it by poor execution?

I personally feel it would be a crime now to not notice the duopoly of AMD & NVDA at play here in GPUs.

Another incursion into the datacenter - FPGAs
Xilinx CTO talking about FPGAs in datacenters in the future

Also, there are rumours of AMD being in advanced talks to acquire Xilinx!

Regarding TPUs: I found the answer unexpectedly while researching the AMD-Xilinx rumours.

I think FPGA is the winner here. You can adapt your hardware fast to your software.
An old article detailing the benefits

For those who do not know FPGAs: think of one like a processor, but where you can program the design of the processor on the fly to meet your needs. Example: if all you want to do is addition, why do you need a processor that, say, also does division? ASICs (processors designed for a specific purpose) need to go through the whole design cycle, then the fab, then production. With an FPGA, you can simply program a large ‘addition’ array and get on with it! This also helps tailor your HW to changing neural-net structures.
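A rough way to picture it in code (a conceptual sketch, not real HLS or Verilog): on a CPU the loop below runs one instruction at a time on a core that also carries dividers, caches and much else; an FPGA flow can turn the same description into 64 physical adders and nothing more, and the fabric can be re-programmed later when the problem changes.

```c
/* Conceptual sketch only. On a CPU this loop runs as a sequence of
 * instructions on a general-purpose core full of hardware (dividers,
 * FPUs, big caches) that this job never touches. An FPGA toolchain
 * (e.g. high-level synthesis) can map the same loop onto a fabric
 * configured as nothing but 64 parallel adders, and the fabric can be
 * re-programmed later, say for a new neural-net layer shape. */
#define LANES 64

void add_array(const int a[LANES], const int b[LANES], int out[LANES]) {
    for (int i = 0; i < LANES; i++)   /* CPU: ~64 sequential additions       */
        out[i] = a[i] + b[i];         /* FPGA: 64 adders firing in parallel  */
}
```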

Let us see what happens with the acquisition rumour. This is getting even more interesting.


A good video that touches on multiple areas related to process node + production ease + yield. I don’t like the snapshot of the video; I will remove it if anyone objects – it somehow appears wacky for an investment forum. The content is good though.


@kenshin: The market is pricing in this transition now. A nice overview of how Intel is losing this game:


Two CEOs together. ARM and NVDA.
This expanded my view on the impact of the acquisition.
