A Full Analysis of Sandisk ($SNDK)

Discussing a corner of the AI market that could be underappreciated...

Hi everyone,

Welcome back to another edition of GRIT Alpha. Today we’re going to talk about a portion of the AI market that’s often overlooked. Let’s dive right in!

Stock Pick: Sandisk (SNDK-US, $30B MCAP)

The entire market has AI fever. This is no surprise.

But we’ve seen a big pullback here in tech names, and it could be time to sharpen your pencil and start to buy up some of these names at better prices.

Let’s zoom out.

Megacap tech earnings told us that companies are not slowing down their capex any time soon, and it’s still full steam ahead (for now). The goalposts keep pushing out in terms of just how much capital is being spent on this expansion.

At the same time, adoption is still early when it comes to integrating AI into workflows at the application layer. Recent headlines now point to nearly $3T in infrastructure spend in the coming years.

While investors are well familiar with the sexy Nvidia play, I like corners of the market that are much quieter.

This pick is in a really boring space - but one that is going to see a ton more investment. Memory chips.

Let’s dig in!

  • Why now? 👉 AI-Fueled Second Act

  • Overview 👉 What Does Sandisk Do?

  • Role in Ecosystem 👉 How Memory Fits into AI Infrastructure

  • How Do They Win? 👉 Value Proposition

  • Business Units 👉 Segment Breakdown

  • How Do They Make Money? 👉 Revenue Model

  • Momentum 👉 Recent Key Contracts and Partnerships

  • By The Numbers 👉 Key Metrics

  • Competition and Outlook 👉 Navigating the Memory Cycle

  • Risks 👉 Potential Pitfalls

But what can you actually DO about the proclaimed ‘AI bubble’? Billionaires know an alternative…

Sure, if you had held your stocks since the dot-com bubble, you would’ve been up—eventually. But three years after the dot-com bust, the S&P 500 was still far below its peak. So, how else can you invest when almost every market is tied to stocks?

Lo and behold, billionaires have an alternative way to diversify: allocate to a physical asset class that outpaced the S&P by 15% from 1995 to 2025, with almost no correlation to equities. It’s part of a massive global market, long leveraged by the ultra-wealthy (Bezos, Gates, the Rockefellers, etc.).

Contemporary and post-war art.

Masterworks lets you invest in multimillion-dollar artworks featuring legends like Banksy, Basquiat, and Picasso—without needing millions. Over 70,000 members have together invested more than $1.2 billion across over 500 artworks. So far, 23 sales have delivered net annualized returns like 17.6%, 17.8%, and 21.5%.*

Want access?

Investing involves risk. Past performance not indicative of future returns. Reg A disclosures at masterworks.com/cd

Why now? 👉 AI-Fueled Second Act

Sandisk has flipped from a sleepy memory brand to a core AI infrastructure supplier. Since the February 2025 spin-off, shares have jumped from under $30 to above $200 as the company rode a sharp turn from industry oversupply to undersupply. What changed is AI. Training and inference need fast, affordable non-volatile storage at scale, and flash now sits on the critical path between data lakes and accelerators. Street sentiment followed the numbers, with upgrades and price targets moving rapidly as guidance reset higher.

The investment case today pairs cyclical tailwinds with cleaner execution as a pure-play NAND (a type of memory) company. Supply discipline across the oligopoly, faster node transitions, and richer mix into enterprise strengthen pricing and margins. Risks remain tied to the cycle and rivals, but the setup leans favorable while hyperscaler qualifications convert. Memory is no longer just a commodity input. It is a gating resource for AI throughput, and Sandisk’s timing, balance sheet improvement, and product cadence give it real leverage to this buildout.

Key acronym guide:

  • DRAM: Dynamic Random-Access Memory

  • HDD: Hard Disk Drive

  • SSD: Solid-State Drive

  • SLC: Single-Level Cell

  • QLC: Quad-Level Cell

  • NVMe: Non-Volatile Memory Express

  • NAND: Not AND (as in NAND flash memory)

  • OEM: Original Equipment Manufacturer

  • ASP: Average Selling Price

Overview 👉 What Does Sandisk Do?

Sandisk designs, builds, and sells NAND flash and flash-based storage spanning data center SSDs, client SSDs, embedded mobile and automotive, and branded retail products. Acquired by Western Digital in 2016, Sandisk returned to public markets in early 2025 as a focused flash business with the SanDisk consumer brand and deep OEM channels.

Technology depth is the core asset. The company jointly operates high-volume fabs with Kioxia, shipping advanced 3D NAND such as BiCS8 at 218 layers and ramping QLC for very high capacities. Vertical integration across NAND, controllers, and firmware lets Sandisk tune reliability, endurance, and cost per bit for each workload. That enables distinct offerings, from PCIe Gen5 NVMe drives for servers to UFS for smartphones and rugged modules for vehicles. Global scale, a broad tech stack, and long-standing device maker relationships keep Sandisk positioned in both consumer and enterprise supply chains. I know the space can be a bit of an alphabet soup, but keep this in mind: the company turns wafers into storage systems that prioritize density, cost, and predictable performance for high-volume use.

Role in Ecosystem 👉 How Memory Fits into AI Infrastructure

AI is throughput hungry. Datasets, checkpoints, embeddings, and retrieval all must move quickly from storage to GPUs. Flash is the middle tier that prevents idle accelerators, bridging cold object stores and hot DRAM. In training, petabytes stream from all-flash arrays to GPU clusters; in inference, model weights and vector indexes often sit on NVMe SSDs that feed accelerators at low latency.

As data sizes scale faster than DRAM budgets, flash becomes the elasticity valve for performance per dollar. Sandisk’s portfolio targets that bottleneck, from high-endurance NVMe to ultra-capacity QLC for model repositories. The company is also pushing High Bandwidth Flash, a concept that places flash closer to compute to shrink access time for large models. At the edge, phones, cars, cameras, and IoT devices rely on embedded flash to hold models and logs. In the cloud, hyperscalers are shifting to larger all-flash pools to standardize performance for AI pipelines. Storage is not the headline, but it is the enabler.
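To make that tiering concrete, here is a minimal Python sketch (my own illustration, not anything from Sandisk or a specific vendor) of the access pattern described above: model weights sit in a file on a flash tier, and only the pages actually read get pulled into DRAM on demand via memory mapping. The file name and array size are hypothetical.

    import numpy as np

    # Hypothetical stand-in for a model checkpoint sitting on an NVMe SSD mount.
    weights = np.random.rand(1_000_000).astype(np.float32)
    np.save("weights.npy", weights)

    # Memory-map the file: the OS pages data in from flash only as it is touched,
    # instead of copying the whole array into DRAM up front.
    mapped = np.load("weights.npy", mmap_mode="r")

    # Read just a slice -- roughly how an inference server pulls the layers it
    # needs, with flash acting as the capacity tier behind a smaller DRAM cache.
    chunk = np.asarray(mapped[:4096])
    print(chunk.shape, chunk.dtype)

The takeaway for the thesis: as models and datasets outgrow DRAM budgets, more of that working set lands on flash, which is exactly the demand Sandisk sells into.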

How Do They Win? 👉 Value Proposition

Subscribe to GritAlpha Premium to read the rest.

Become a paying subscriber of Premium to get access to this post and other subscriber-only content.
A premium subscription gets you:

  • Three (3) Deep-Dive Stock Analysis Newsletters Each Month