AI & Machine Learning
Instacart's context problem: why LLMs can't nail grocery delivery in under a second
Real-time ordering demands sub-second responses, but LLMs struggle with the context needed for personalized grocery delivery. Instacart CTO Anirban Kundu explains why chunking data and deploying microagents beat monolithic systems when latency matters.