The 'Brownie Recipe Problem': Why AI Needs More Than Just Smarts for Real-Time Success
07 Feb, 2026
Artificial Intelligence
Large Language Models (LLMs) are undeniably impressive, capable of complex reasoning and generating human-like text. However, as Instacart CTO Anirban Kundu aptly describes, they often stumble when it comes to the messy, real-time world of contextual understanding – a challenge he terms the "brownie recipe problem." It's a fascinating insight into the gap between AI's theoretical power and its practical application, especially in fast-paced environments like grocery delivery.
Beyond the Basic Recipe: The "Brownie Recipe Problem" Explained
Imagine asking an LLM for a brownie recipe. Simple enough, right? But in a real-world scenario like Instacart, it's far more nuanced. The AI doesn't just need to know the ingredients; it needs to know what's available in your specific market, factor in your preferences (organic vs. conventional eggs, perhaps?), and crucially, consider logistics like delivery time to prevent spoilage. This multi-layered context is precisely where current LLMs, despite their reasoning prowess, can falter. Kundu highlights that for services requiring instant responses, like grocery ordering, a 15-second reasoning time is a non-starter – users will simply abandon the service.
The Balancing Act: Reasoning, State, and Personalization
Kundu explains that effective AI in this domain requires understanding two distinct "worlds": the world of reasoning (the logic and decision-making) and the world of state (the real-time, actual conditions of inventory, location, etc.). Merging these with individual user preferences presents a significant hurdle. Simply dumping all of a user's history into a monolithic LLM would create an unmanageable beast.
Instacart's ingenious solution involves a modular approach:
Foundational Model: A large model first tackles broad intent and categorizes products.
Small Language Models (SLMs): The processed data is then routed to specialized SLMs. These are designed for two critical tasks:
Catalog Context: Understanding which products complement each other and, crucially, identifying suitable substitutions when items are out of stock. Kundu notes that Instacart faces out-of-stock situations in a significant percentage of orders, making robust substitution capabilities vital.
Semantic Understanding: Grasping the meaning behind user requests. For instance, understanding what constitutes a "healthy snack for an 8-year-old" and then finding relevant products, including alternatives if the primary choices aren't available locally.
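The two-stage flow described above can be sketched in a few lines. This is an illustrative mock, not Instacart's actual system: the model calls, the `Intent` type, and the substitution table are all hypothetical stand-ins for what a foundational model and a catalog SLM would do.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    category: str   # broad intent, e.g. "baking"
    query: str      # the raw user request

def foundational_model(query: str) -> Intent:
    """Stand-in for the large model: classify broad intent."""
    category = "baking" if "brownie" in query.lower() else "general"
    return Intent(category=category, query=query)

def catalog_slm(intent: Intent, in_stock: set) -> list:
    """Stand-in for the catalog SLM: pick items, substituting
    a comparable product when the primary item is out of stock."""
    if intent.category != "baking":
        return []
    wanted = {"cocoa powder": "dark chocolate", "eggs": "egg substitute"}
    basket = []
    for item, substitute in wanted.items():
        if item in in_stock:
            basket.append(item)
        elif substitute in in_stock:
            basket.append(substitute)   # out-of-stock fallback
    return basket

def route(query: str, in_stock: set) -> list:
    """Foundational model first, then hand off to the specialized SLM."""
    intent = foundational_model(query)
    return catalog_slm(intent, in_stock)

print(route("brownie recipe", {"dark chocolate", "eggs"}))
# → ['dark chocolate', 'eggs']  (cocoa powder substituted, eggs in stock)
```

The point of the split is that the expensive, slow reasoning happens once up front, while the latency-sensitive catalog and substitution decisions run in small, fast models.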
Logistics: The Cold, Hard Truths
Beyond product availability and user preference, the logistical element is paramount. The AI must consider factors like the perishability of items. An order including ice cream and frozen vegetables requires calculating a delivery window that prevents melting or thawing. This real-time logistical awareness is a complex but essential layer of context.
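A minimal sketch of that constraint: the most perishable item in an order bounds the allowable delivery window. The items and time limits below are illustrative values, not Instacart's real data.

```python
# Hypothetical perishability limits, in minutes, before quality risk.
PERISHABILITY_LIMIT_MIN = {
    "ice cream": 45,
    "frozen vegetables": 90,
    "milk": 120,
    "flour": 24 * 60,
}

def max_delivery_window(order: list, default: int = 240) -> int:
    """Return the tightest delivery window (minutes) an order allows.

    The window is the minimum limit across all items, so a single
    pint of ice cream constrains the whole delivery.
    """
    return min(
        (PERISHABILITY_LIMIT_MIN.get(item, default) for item in order),
        default=default,
    )

print(max_delivery_window(["ice cream", "frozen vegetables", "flour"]))  # → 45
```

Real logistics would fold in driver location, traffic, and insulation of the delivery bag, but the shape of the constraint is the same: one perishable item tightens the window for everything else.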
Microagents Over Monoliths: A Modular Future for AI
Echoing the principles of good software design, Instacart is moving away from "monolithic" AI agents that attempt to do everything. Instead, they favor a system of microagents, each focused on a specific task. This approach offers several advantages:
Modularity: Similar to the Unix philosophy, smaller, specialized tools are easier to manage and update.
Robustness: Different agents can handle the varying reliability and update cycles of third-party platforms (like point-of-sale systems and catalog databases) more effectively.
Efficiency: By segmenting tasks, the overall processing can be more streamlined.
To orchestrate these microagents, Instacart leverages standards like Anthropic's Model Context Protocol (MCP) for connecting models to data sources and Google's Universal Commerce Protocol (UCP) for interacting with merchant systems. However, Kundu cautions that the real challenge isn't just integration, but ensuring the reliability and understandability of these connections. Much of their development time is spent troubleshooting failure modes and latency issues across these varied integrations.
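One common way to contain those failure modes, sketched below under assumed requirements rather than as Instacart's actual code, is to give every external call a hard deadline and a fallback, so one slow point-of-sale or catalog system cannot stall the whole response.

```python
import concurrent.futures
import time

def call_with_deadline(fn, timeout_s: float, fallback):
    """Run fn() in a worker thread; on timeout or error, return fallback."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(fn)
    try:
        return future.result(timeout=timeout_s)
    except Exception:
        # Covers both TimeoutError and any exception raised inside fn().
        return fallback
    finally:
        pool.shutdown(wait=False)   # don't block on the straggling call

# Example: a flaky catalog lookup degrades to a cached/unknown answer
# instead of holding up the user's order.
def slow_catalog_lookup():
    time.sleep(2)                   # simulates a slow third-party system
    return {"eggs": 12}

result = call_with_deadline(slow_catalog_lookup, timeout_s=0.1,
                            fallback={"eggs": None})
print(result)  # → {'eggs': None}
```

The same wrapper pattern works per microagent: each one gets its own deadline tuned to the reliability of the third-party platform behind it, which is exactly the kind of varying update cycle and latency profile the article describes.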
The Road Ahead: Context is King
The "brownie recipe problem" is a powerful metaphor for the challenges facing AI in delivering truly seamless, real-time user experiences. It underscores that raw reasoning power isn't enough. For AI to move beyond impressive demonstrations and become truly indispensable in our daily lives, it needs to master the art of understanding and integrating fine-grained, real-world context. Instacart's modular approach, combining foundational models with specialized SLMs and microagents, offers a compelling blueprint for tackling this complex challenge.