Navigating the AI Stack: Innovation, Dependence, and the Startup Challenge
Imagine a team of young entrepreneurs working on an ambitious idea: an AI tutor designed for students in regional languages. The potential is huge. A tool like this could bring personalized learning to millions who are underserved by existing digital products. But turning this vision into reality requires climbing the full “AI stack” — the layered architecture that powers artificial intelligence.
In this example, we illustrate how the structure of the AI stack directly shapes the business realities of a startup — from costs and dependencies to long-term sustainability. Each layer offers opportunities, but each also creates points of control.
Infrastructure: The Cost of Survival
At the base of the AI stack lies infrastructure: the hardware and cloud services that provide computing power. Training or fine-tuning models demands powerful GPUs, supplied by a small number of hyperscale providers. For any founder, this layer largely determines the cost of survival before innovation can even reach the market.
Foundation Models: Leasing the Core Intelligence
The next layer is the foundation model: the large-scale language or vision model that provides the “intelligence” of an AI product. Building one from scratch is out of reach for most startups, since it requires enormous investments in data, chips, and expertise. Instead, companies typically license models from established providers. But those providers control which use cases are permissible, and they set the pricing. In effect, startups rent the very core of their product rather than owning it.
Fine-Tuning and Middleware: Innovation Shaped by Lock-In
For our AI tutor to serve classrooms effectively, it needs careful fine-tuning for local curricula and languages such as Hindi, Tamil, and Bengali. This is often done through fine-tuning APIs offered by model providers. But vertical integration can create challenges. A startup that chooses a specific cloud service provider for its infrastructure may find that the provider’s integrated AI models are more convenient to use. Accessing rival models or independent alternatives could then become harder. Innovation at this stage is shaped not only by product needs but also by the architecture of specific ecosystems.
Applications: The Fragile Tip of the Stack
At the top of the stack lies the application layer: the AI tutor itself, the product that reaches parents and students. This is where ecosystems are most vibrant, with a wide range of startups in edtech, healthtech, and fintech. Yet it is also the most fragile layer: an application’s success depends on choices made at every layer below it, from infrastructure costs to model licensing terms and pricing.
A Global Debate on Control and Contestability
These dynamics are not unique to one country. Around the world, regulators are grappling with questions of concentration and control within the AI stack. For example, the European Union’s AI Act introduces detailed requirements for high-risk systems, and in the United States, agencies such as the Federal Trade Commission and Department of Justice are probing the market power of cloud and AI providers.
The Competition Commission of India is also expected to publish an AI market study soon, examining similar questions of dependency and contestability. This is an opportunity to learn from global debates while tailoring solutions to local realities.
In the upcoming posts, we will look more closely at each layer of the AI stack, and consider what India’s regulatory response should look like.