Ibex AI Service

The ibex-ai-service is the primary gateway for all natural-language and AI interactions on the platform. It manages session state and chat history so that the stateless LLM engine does not have to.

Architectural Role

Built with Python and FastAPI, the service runs in the AWS ecosystem on port 8010. It is exposed publicly through Traefik at the paths /api/chat and /api/conversations, with a routing priority of 90.
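Traefik's priority setting decides which router wins when more than one path rule could match a request. The sketch below illustrates that selection logic in plain Python; the competing catch-all router and the function name are assumptions for illustration only, not the actual Traefik configuration.

```python
# Illustration of Traefik's router selection: among all routers whose
# rule matches the request path, the highest-priority one wins.
# Only the ai-service paths and priority 90 come from this document;
# the catch-all router is a hypothetical default for contrast.
ROUTERS = [
    # (name, path prefixes, priority)
    ("ibex-ai-service", ("/api/chat", "/api/conversations"), 90),
    ("catch-all", ("/",), 1),  # hypothetical default router
]

def resolve_router(path: str) -> str:
    """Return the name of the highest-priority router matching `path`."""
    matches = [
        (priority, name)
        for name, prefixes, priority in ROUTERS
        if any(path.startswith(p) for p in prefixes)
    ]
    if not matches:
        raise LookupError(f"no router matches {path!r}")
    return max(matches)[1]

print(resolve_router("/api/chat"))           # ibex-ai-service
print(resolve_router("/api/conversations"))  # ibex-ai-service
print(resolve_router("/healthz"))            # catch-all
```

Because the ai-service routers carry priority 90, requests to its two paths are routed to it even though the broader catch-all rule also matches.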

High-Level Working

When a user interacts with the AI chat in the BI UI:
  1. Authentication: The service intercepts the POST /api/chat request and validates the user’s JWT by calling the ibex-identity-service.
  2. State Management: It maintains conversation history (CRUD operations), ensuring that long-running threads retain context.
  3. Orchestration: It forwards the user’s prompt, along with the stored conversation context, to the internal ibex-agent-engine (POST /api/agents/{name}/query).
  4. Streaming: As the Node.js agent engine and OpenRouter LLMs process the query and execute SQL, the service relays the output back to the browser via Server-Sent Events (SSE) for a real-time typing effect.
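The state-management and orchestration steps (2 and 3) amount to appending each turn to a stored history and sending that history, plus the new prompt, to the agent engine. A minimal in-memory sketch, assuming hypothetical function and field names (the real service would persist conversations rather than hold them in a dict):

```python
from collections import defaultdict

# In-memory stand-in for the service's conversation store; the real
# service persists history. All names here are illustrative assumptions.
_conversations: dict[str, list[dict]] = defaultdict(list)

def record_message(conversation_id: str, role: str, content: str) -> None:
    """CRUD 'create': append one turn to a conversation's history."""
    _conversations[conversation_id].append({"role": role, "content": content})

def build_agent_payload(conversation_id: str, prompt: str) -> dict:
    """Assemble a body for POST /api/agents/{name}/query, combining the
    stored context with the new user prompt (field names are assumed)."""
    record_message(conversation_id, "user", prompt)
    return {"messages": list(_conversations[conversation_id])}

# Simulate two user turns with an assistant reply in between: the second
# payload carries the full three-message context to the agent engine.
payload = build_agent_payload("conv-1", "Show revenue by region")
record_message("conv-1", "assistant", "Here is the breakdown...")
payload2 = build_agent_payload("conv-1", "Now only for 2024")
print(len(payload2["messages"]))  # 3
```

Because the context travels with every request, the agent engine itself can stay stateless, as described above.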
This decouples the heavy state persistence and authentication mechanisms from the core agent execution layer.
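The SSE relay in step 4 comes down to wrapping each chunk received from the agent engine in the `data:` wire format before flushing it to the browser. A sketch of that framing, independent of FastAPI (the token list is a stand-in for the agent engine's stream):

```python
from typing import Iterable, Iterator

def sse_frames(chunks: Iterable[str]) -> Iterator[str]:
    """Wrap each text chunk in the Server-Sent Events wire format.
    Per the SSE spec, an event is one or more `data:` lines followed
    by a blank line; multi-line chunks become multiple data lines."""
    for chunk in chunks:
        for line in chunk.splitlines() or [""]:
            yield f"data: {line}\n"
        yield "\n"  # blank line terminates the event

# Stand-in for the token stream coming back from ibex-agent-engine.
tokens = ["SELECT", " region,", " SUM(revenue)"]
wire = "".join(sse_frames(tokens))
print(wire)
```

In the actual service this generator would feed a streaming HTTP response, so the browser's EventSource receives each token as soon as the engine emits it rather than waiting for the full answer.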