Designing Algolia's AI Agent Platform: From beta to scalable ecosystem

Shaped Algolia’s AI agent platform by unifying internal use cases, customer-facing AI agents, and core services like Search and Recommend into a single foundation. Now it powers AI experiences across the ecosystem.

Scope
  • Defined system architecture and UX patterns

  • Built a cross-org AI design system and patterns

  • Aligned strategy and delivery with ML engineers, PMs, legal, and business stakeholders

  • Enabled internal product teams and unblocked dependencies

Strategic impact
  • Enabled a new usage-based monetization stream

  • Reduced agent creation time by 40%

  • Created AI design patterns adopted by all AI initiatives

  • Influenced 2025/26 AI design roadmap

  • Established the internal platform layer

Problems

The beta product wasn't landing

Algolia had invested in AI experiences leveraging RAG, but the Agent Studio beta wasn't gaining traction.

The beta was technically functional but lacked market traction. I led a cross-org discovery effort (combining user interviews, UX audits, and business feedback loops) that surfaced three core issues blocking adoption:

🔥

High user friction: API-driven UX over user-driven flow

The UX mirrored backend logic instead of user goals, creating a technical, unintuitive experience and long setup times.

🔥

Low adoption: Blank-canvas onboarding

Users had to start from scratch with no guidance. Even common scenarios (e.g., a shopping assistant) required manual setup.

🔥

Weak ROI: Unsustainable business model

  • Product was limited to premium-tier customers → limited reach

  • Pricing was subscription-only → no usage-based growth

  • High LLM costs → scaling increased losses instead of value

A big decision

The pivot decision: Start from zero

The Agent Studio beta had proven technical feasibility, but we had hit its limits. I partnered with the PM and engineering teams to pivot: scrap the beta and rebuild a platform that was scalable and monetizable.

✅ Rebuild 0→1

  • 8 weeks to MVP

  • Scalable foundation for entire AI suite

  • Monetization model tied directly to API usage

  • Shared infrastructure supporting both internal and external AI products

❌ Fix the existing beta

  • Months of incremental improvements

  • Revenue ceiling

  • UX improvements constrained by the existing API design

Approach #1

Building user empathy through vision hero story

The team was highly technical, but we lacked a shared understanding of who we were building for and why. I led a vision storyboarding workshop, mapping a "hero journey" that visualized the user's goals, motivations, and barriers.

This narrative became our north star, guiding design and technical decisions.

Approach #2

UX-driven API design

This time, before any technical work began, I designed a vision user journey and information architecture, which then became the baseline for the API design. This created a clean backend-to-frontend mapping, increasing development speed while ensuring an intuitive user experience.

1

Designed new information architecture

Built vision user flows and IA around user goals and needs.

2

Defined API schema based on the new IA

Partnered with engineers to translate user-facing logic into technical object relationships and endpoints (see the sketch after these steps).

3

Backend implementation + UX work in parallel

Both UX and API evolved around a shared conceptual model, which also enabled synchronized delivery.
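
To make the mapping concrete, here is a minimal sketch of how the IA concepts could translate into API objects and endpoints. The names (Agent, Tool, MemoryConfig) and routes are illustrative assumptions, not Algolia's actual schema.

```typescript
// Hypothetical sketch: the user-facing concepts from the IA map one-to-one
// onto API resources, so the UI and the backend share a single mental model.
// Names and endpoints are illustrative, not Algolia's actual schema.

interface Tool {
  id: string;
  type: "search" | "recommend" | "custom";
  indexName?: string; // which index the tool queries, when applicable
}

interface MemoryConfig {
  enabled: boolean;
  retentionDays: number; // user-controlled retention window
}

interface Agent {
  id: string;
  name: string;
  instructions: string; // the prompt the user edits in the UI
  tools: Tool[];
  memory: MemoryConfig;
}

// Illustrative REST surface mirroring the same objects the UI exposes:
//   POST   /agents            create an agent (from a template or from scratch)
//   PATCH  /agents/{id}       update instructions, tools, or memory settings
//   POST   /agents/{id}/chat  run a conversation turn against the agent
```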

Core platform features

Building a platform that can scale across products and teams

The goal was to create a scalable foundation that connects with Algolia's broader ecosystem. Agent Studio was designed not just for individual agent builders, but as a platform layer that other teams and external developers could build on top of.

1️⃣ Prefilled templates and canvas

The Problem: Blank canvas paralysis. Users didn't know where to start.

The Solution: Pre-built templates (e.g., shopping assistant, support agent, search companion) with optimized prompts and tools pre-selected.
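
Conceptually, a template can be thought of as a pre-populated agent configuration that the canvas loads instead of an empty one. The sketch below is hypothetical and reuses the illustrative Agent and Tool types from the earlier API sketch.

```typescript
// Hypothetical template: a pre-populated configuration the canvas loads when
// the user picks "Shopping assistant" instead of starting from a blank canvas.
// Reuses the illustrative Agent/Tool types sketched earlier.
const shoppingAssistantTemplate: Agent = {
  id: "template_shopping_assistant",
  name: "Shopping assistant",
  instructions:
    "Help shoppers find products, compare options, and suggest alternatives " +
    "when an item is out of stock.",
  tools: [
    { id: "product_search", type: "search", indexName: "products" },
    { id: "related_items", type: "recommend", indexName: "products" },
  ],
  memory: { enabled: true, retentionDays: 30 },
};
```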

2️⃣ Configure and test rapidly

The Problem: AI configuration felt abstract. Users couldn't see how changes affected agent behavior on the same page, making iteration slow.

The Solution: A side-by-side configuration panel and live playground. Users adjust instructions, tools, and memory settings on the left, and instantly validate the agent's response on the right. Technical users especially appreciated that testing became intuitive and significantly faster.
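
Under the hood, the playground can be thought of as running the current draft configuration against a test message. The endpoint and helper below are hypothetical, included only to illustrate the interaction; they are not Agent Studio's actual API.

```typescript
// Hypothetical playground call: the right-hand panel sends the current draft
// configuration plus a test message and renders the agent's reply, so every
// tweak on the left can be validated immediately on the right.
async function testDraftAgent(draft: Agent, message: string): Promise<string> {
  const response = await fetch("/agents/playground", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ config: draft, message }),
  });
  const { reply } = await response.json();
  return reply;
}
```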

3️⃣ Ability to monitor your agent and iterate

The Problem: No visibility into agent health or usage patterns.

The Solution: A built-in monitoring dashboard showing key health metrics such as response time, search calls, and user engagement trends. Users can also review individual conversations when they need to investigate further.

4️⃣ Agent memory layer

The Problem: Agents lacked context. It was like chatting with someone who forgets you every time.

The Solution: Exposed memory as a first-class feature. Users control retention windows, see what was remembered, and understand why agents respond as they do.
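
A rough sketch of what exposing memory as a first-class feature could look like at the API level; the entry shape and endpoints below are assumptions for illustration, not Algolia's actual interface.

```typescript
// Hypothetical shape of the memory surface exposed to users: they control the
// retention window (see MemoryConfig above) and can inspect or delete what the
// agent has remembered.
interface MemoryEntry {
  conversationId: string;
  summary: string;   // what the agent remembered, in plain language
  createdAt: string; // ISO timestamp; entries expire after retentionDays
}

// Illustrative endpoints:
//   GET    /agents/{id}/memory            list remembered entries
//   DELETE /agents/{id}/memory/{entryId}  remove a specific memory
//   PATCH  /agents/{id}                   adjust memory.retentionDays
```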

5️⃣ Tighter ecosystem integration → direct revenue impact

The Problem: Agent Studio was isolated from Algolia's core products.

The Solution: Agent Studio is now integrated with Algolia's Search and Recommend products. Each agent run generates Search API usage, which contributes directly to revenue.
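
To illustrate the integration point: when an agent invokes its search tool, the query goes through the standard Algolia Search API, so each conversation turn is metered like any other Search call. The sketch below uses the public algoliasearch JavaScript client (v4-style); the actual wiring inside Agent Studio is simplified here.

```typescript
import algoliasearch from "algoliasearch";

// Sketch of an agent's search tool: it issues a regular Algolia Search query,
// so agent conversations generate billable Search API usage. Credentials and
// index name are placeholders.
const client = algoliasearch("YOUR_APP_ID", "YOUR_API_KEY");
const index = client.initIndex("products");

async function runSearchTool(query: string) {
  const { hits } = await index.search(query, { hitsPerPage: 5 });
  // The hits are returned to the LLM as grounding context (RAG), and the
  // request is counted like any other Search API call.
  return hits;
}
```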

The platform has evolved from a tool to a strategic revenue driver within Algolia’s ecosystem.

Outcome and impact

4 months post-launch

  • MVP shipped in under 2 months

  • 800+ customer-built agents

  • CSAT 4.7 out of 5

  • Generated 80k+ Search API calls, directly boosting revenue

  • Adopted by internal product teams, powering the AI Assist experience

  • Established a platform layer that enables ecosystem integrations

Reflections

What I learned

✨ What worked

  • Early API co-design enabled us to build an intuitive user experience and move faster.

  • The vision storyboarding workshop aligned the team around user value.

  • Thinking about the ecosystem early on: stepping back to consider how this platform creates synergy with the rest of the product.

🪄 What I'd do differently

  • Quantitative discovery: We lacked baseline metrics. I would invest two weeks upfront in quantitative analysis and baseline setting before launching a new version.

  • Proactive research: Post-launch iteration was reactive at first. I would implement continuous research earlier.