Tuesday, January 6, 2026

A Warning About the Future of AI



You are receiving this email because you are subscribed to Behind the Markets. If you no longer wish to receive these emails, please unsubscribe here.


Dear Reader,

Happy Tuesday.

Today is Tuesday, January 6th, and I want to talk to you about something fascinating I read over the weekend.

It really stuck out to me, because it completely turns one of the biggest assumptions people are making right now about AI on its head.

There's a popular belief out there that the future of artificial intelligence is basically one giant arms race of data centers. 

Bigger campuses. More GPUs. More power. More cooling. 

Trillions of dollars poured into centralized infrastructure to run increasingly massive models…

And then Aravind Srinivas, the CEO of Perplexity, comes along and makes us think: what if that whole assumption is wrong?

Srinivas warns, "The biggest threat to a data center is if the intelligence can be packed locally on a chip... and there's no need to inference all of it on one centralized data center."

Meaning, if AI models become efficient enough to run directly on devices — laptops, phones, tablets, wearables — then the need to constantly ping massive cloud-based data centers starts to shrink.

And that matters, because right now we're watching companies commit staggering amounts of capital to centralized compute. 

Entire strategies are being built on the idea that intelligence has to live in giant server farms, far away from the end user.

What he's really pointing out is that there are two very different possible paths for the future of AI.

One path says: AI keeps getting bigger, more centralized, more power-hungry, and more dependent on massive infrastructure.

The other path says: AI gets smaller, more efficient, more local, and starts living closer to the user.

If that second path accelerates faster than expected, a lot of assumptions people are making today will need to be revisited.

Now, to be clear, this doesn't mean data centers suddenly become useless. Training large models still requires enormous compute. Certain workloads will always live in the cloud. Enterprises will still rely on centralized infrastructure.

But inference — the everyday use of AI — is where things get interesting.

If your iPhone can run an advanced model locally, you get lower latency, better privacy, and far less reliance on constant connectivity. You're not shipping every query off to a remote server and waiting for an answer to come back.

That's a fundamentally different experience.

And it's not theoretical. You can already see the direction this is moving. Chips are being designed specifically for AI workloads at the edge. Software is being optimized to do more with fewer parameters. Companies are working hard to compress models without destroying performance.

This is the same pattern we've seen before in computing.

Mainframes gave way to personal computers. 

PCs gave way to smartphones. 

Centralized systems tend to dominate early, then gradually give ground to more distributed, user-level power as the technology matures.

What makes this moment tricky is timing.

Right now, the world is in the middle of a massive buildout based on today's architecture. 

Power infrastructure is being designed around the assumption that AI demand will keep flowing into centralized hubs.

If on-device AI matures faster than expected, that doesn't mean those investments vanish — but it does mean returns, timelines, and bottlenecks may look very different than people are modeling today.

That's the real takeaway here.

This isn't about one CEO predicting doom for data centers. It's about recognizing that technology almost never moves in a straight line. The biggest surprises usually don't come from the obvious trends everyone's watching — they come from the second-order shifts most people aren't paying attention to yet.

Everyone sees the AI boom. Everyone sees the GPU shortages. Everyone sees the power constraints.

Far fewer people are thinking seriously about what happens if AI starts migrating to personal devices.

And this is exactly the kind of thing we like to pay attention to early — not because it gives you an immediate answer, but because it forces you to ask better questions.

Questions like: where does value really accrue if AI moves closer to the device? Is it still in raw compute, or does it shift toward silicon design, software efficiency, and integration?

That's why for a while now we've been talking about the one AI stock we feel is set to profit from this more than any other.


Because this under-the-radar AI play will benefit no matter which direction AI moves.


Because they aren't investing billions in infrastructure and data centers… 


Instead, they are quietly working with companies like NVIDIA that are.


They also design the chip architecture behind almost 100% of the mobile market.


Every time Apple, Samsung, or Google sells a device capable of running this new "local AI," they have to pay this supplier a royalty.


That's billions in royalties.


We don't know which way the AI market will move. But whether it stays in big centralized data centers or moves to your smartphone… this company's profits are set to soar.


Get the name of the "Secret Supplier" owning the future of AI here.

Now don't get me wrong.

None of this means the current buildout is "wrong." It just means it may not be the final form.

Technology rarely ends up where the first wave of capital assumes it will.

So when I see someone inside the AI ecosystem openly talking about this tension — centralized versus distributed intelligence — I take note. 

Not because it invalidates what's happening today, but because it highlights how fragile consensus thinking can be.

The smartest shifts don't usually come from outside the system. They come from people inside it asking uncomfortable questions.

This is one of those questions.

And it's one we'll keep coming back to, because how AI is delivered matters just as much as how powerful it becomes.

Talk to you tomorrow.

All the best,

Simmy Adelman, Editor-in-Chief
Behind the Markets




Our mailing address is:
Behind the Markets, LLC
4260 NW 1st Avenue, Suite 55
Boca Raton, FL 33431


Copyright © 2024 Behind the Markets, LLC, All rights reserved.
You're receiving this email as part of your subscription to Behind the Markets. For more information about our privacy practices, please review our Privacy Policy or our Legal Notices.

Behind the Markets


Unsubscribe

