The Application Layer

Artificial intelligence is a stack: energy, silicon, cloud, models, and applications. Each layer has its own economics, competitive dynamics, and challenges. Mistaking one layer for the whole industry causes confusion, misrepresentation, bad decisions, and misallocated capital. The infrastructure builders enable the platform; the application builders capture the value. The question now is: what value does all this deliver? Energy, silicon, cloud, and models exist only to deliver that value. There is a robust argument that we are at the beginning of an unprecedented value-creation curve. Built on the infrastructure and services provided by the other layers of the stack, the AI application layer will be globally transformative and disruptive. The constraints are imagination, execution, and the willingness to rebuild how work is done.

Capturing AI

AI models produce raw intelligence. They generate tokens. But tokens are an intermediate good, not a finished product. What customers actually pay for is legal work completed, code shipped, claims processed, research synthesized, and decisions supported.

They pay for refined output.

Attention has focused on the infrastructure layer — the frontier labs, the compute stack, and the data centers. That attention is not misplaced, but it overlooks a structural shift already underway. Once you understand the model as an intermediate good rather than the end product, the center of gravity shifts. The decisive question is no longer who can produce intelligence, but who can turn it into something usable, trusted, repeatable, and economically defensible.

In other words, who can refine it into a usable product?

At the base of the chain sit the token producers — OpenAI, Anthropic, Google DeepMind, Meta, DeepSeek, and Qwen. They produce raw capability. This layer is expensive to build, technically formidable, and still moving fast. But crude oil is not gasoline.

Enterprises and consumers pay for gasoline.

Distributed Machine Learning Can Bring Healthcare Breakthroughs

Over the last decade, the dramatic rise of deep learning has led to stunning transformations in dozens of industries. It has powered our pursuit of self-driving cars, fundamentally changed the way we interact with our devices, and reinvented our approach to cybersecurity.

In healthcare, however, despite many studies showing its promise for detecting and diagnosing diseases, progress in using deep learning to help real patients has been agonizingly slow. All this could change with distributed learning.