9/23/24

greedy semiconductors, anthropomorphized --ar 3:2 --sref 3185620147 --v 6.1ca

Note: What follows is my personal point of view, not an official opinion of my employer, Tower Research, its affiliates, or any of their private investments.

A (somewhat) odd thing I noticed recently is that a number of semiconductor companies building for AI/ML are getting directly into the business of hosting open-source model inference. Cerebras has launched an inference API for open-source models like Llama 3.1, following the lead of Groq, and SambaNova released an API of its own as of 9/11/24.

I can’t think of many times a hardware/semi company has moved up the stack to vertically integrate someone else’s software into its business. Is this a harbinger of a structural change in how chip companies monetize long term? Or just competitive strategy/demand gen to stand out against better-known firms? It’s hard to say who benefits long term or loses bigly, but let’s speculate.

Visualizing the Value Chain

In a “normal” market, you could roughly generalize the flow of a service from the semi company up to the developer building an application for an end user as the following chain:

Old Chain:

[Diagram: the old value chain]

By contrast, companies like Groq and Cerebras seem to be restructuring the chain to look like this:

New Chain:

[Diagram: the new value chain]
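
To make the new chain concrete: the developer now calls the chip vendor’s hosted endpoint directly, rather than going through a separate cloud or inference provider. Here’s a minimal sketch, assuming an OpenAI-compatible endpoint (which Groq and Cerebras both advertise); the base URL, model identifier, and environment variable name are illustrative assumptions, not verified against current vendor docs:

```python
# Minimal sketch: a developer calling a chip vendor's hosted inference API directly.
# Assumes an OpenAI-compatible endpoint; the base URL, model name, and env var
# below are illustrative, not taken from this post or verified vendor docs.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",   # assumed vendor endpoint
    api_key=os.environ["CEREBRAS_API_KEY"],  # hypothetical env var name
)

resp = client.chat.completions.create(
    model="llama3.1-8b",  # assumed identifier for a hosted Llama 3.1 model
    messages=[{"role": "user", "content": "Summarize the AI inference value chain."}],
)
print(resp.choices[0].message.content)
```

The notable part isn’t the code itself, which looks like any other hosted-LLM call; it’s who is on the other end of the request: the company that designed the silicon, not a hyperscaler reselling capacity on top of it.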