⚙ Kubernetes-Native · Event-Driven · Multi-Agent

SAIL
Serverless AI Layer

Where AI Agents Meet Kubernetes Infrastructure

Define collaborative AI agents as Kubernetes custom resources. SAIL wires them into event-driven pipelines — no routing code, no service discovery, no boilerplate. Just prompts and Kubernetes.

Why SAIL?

Building multi-agent AI systems is complex. Routing, discovery, scaling, retries — all of that is infrastructure work, not AI work. SAIL absorbs it.

📜

Prompts, Not Plumbing

Write a YAML manifest with a system prompt and a user prompt. SAIL's operator automatically creates the Knative Service, Trigger, and Redis registry entry. You never write routing code.

🔗

Agents Discover Each Other

At runtime, each agent queries Redis and learns the names and descriptions of every other deployed agent. The LLM decides who to call next — just like a team of engineers in Slack.

Serverless Scale-to-Zero

Built on Knative Serving. Idle agents consume zero resources. Messages queue in Kafka; agents spin up on demand. Pay only for what you use, even at enterprise scale.
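
Scale-to-zero comes from standard Knative Serving behavior. As a sketch of what the operator provisions on your behalf (the service name is illustrative, and the annotation shown is the stock Knative autoscaling knob, not a SAIL-specific setting):

```yaml
# Illustrative Knative Service with explicit scale-to-zero.
# SAIL's operator creates Services like this for you; shown only
# so the mechanism is visible.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: creative-writer-agent   # illustrative name
  namespace: sail
spec:
  template:
    metadata:
      annotations:
        # Standard Knative annotation: allow this revision to scale to zero pods
        autoscaling.knative.dev/min-scale: "0"
```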

🌊

Event-Driven by Default

Every message is a CloudEvent routed through the Knative Broker. Human inputs, agent outputs, and system triggers are all the same format. Add new agents without touching existing ones.
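
As a rough sketch, the attributes on an agent-bound CloudEvent might look like the following. Only the `targetagent` extension attribute is described elsewhere in this document; the `type` and `source` values here are assumptions for illustration:

```yaml
# Illustrative CloudEvent attributes (values other than "targetagent"
# are assumptions, not SAIL's actual event metadata).
specversion: "1.0"
type: org.sebi.sail.message        # assumed event type
source: /sail/mcp-server           # assumed source URI
id: a1b2c3d4-0000-0000-0000-000000000000
datacontenttype: application/json
targetagent: creative-writer-agent # extension attribute the Broker routes on
```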

🔒

Kubernetes-Native Security

No secrets in code. Kubernetes Secrets inject the OpenAI key, Redis host, and Kafka TLS certificates. RBAC, namespaces, and network policies work exactly as you expect.

🛠

Bring Your Own Agent

Set agentClassName to extend the base runtime with custom Java logic. SAIL derives the container image automatically from the class name. Custom ≠ complex.
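
For illustration, a custom agent CR might look like this. The `agentClassName` field comes from this document; the agent name, class, and prompts below are hypothetical, and the exact class-to-image naming convention is not shown here:

```yaml
apiVersion: sebi.org/v1
kind: GenericAgent
metadata:
  name: sentiment-agent                        # hypothetical agent
  namespace: sail
spec:
  description: "Scores text sentiment with custom Java logic"
  agentClassName: org.example.SentimentAgent   # hypothetical class; SAIL
                                               # derives the image from it
  systemMessage: |
    You are a sentiment analyst called 'sentiment-agent'.
  userMessage: |
    Score the sentiment of: {sailMessage.inputs['sentiment-agent']}
```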

How It Works

SAIL orchestrates three independent components and a cloud-native messaging backbone. Messages flow from Kafka through Knative into agent pods and back — fully automated.

STEP 01

Operator provisions

Watches GenericAgent CRs. Creates Knative Service, Trigger, and Redis entry.

STEP 02

Message arrives

JSON posted to Kafka is consumed by the KafkaSource and transformed into a CloudEvent.

STEP 03

Trigger routes

Knative Broker checks the targetagent attribute and delivers to the right service.
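
A Trigger like the one the operator provisions can be sketched with the standard Knative Eventing API. The `targetagent` filter attribute comes from this document; the broker and service names are assumptions:

```yaml
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: creative-writer-agent-trigger   # illustrative name
  namespace: sail
spec:
  broker: default                       # assumed broker name
  filter:
    attributes:
      # Deliver only events addressed to this agent
      targetagent: creative-writer-agent
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: creative-writer-agent       # the agent's Knative Service
```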

STEP 04

Agent thinks

Renders prompt, discovers peers from Redis, calls OpenAI via LangChain4j.

STEP 05

Loop continues

LLM calls sendMessage MCP tool → MCP server publishes to Kafka → next agent activates.

Three Independent Components

Each subproject is fully standalone — no shared modules, no parent POM. Deploy only what you need.

sail-operator

Kubernetes Operator

Watches GenericAgent custom resources (apiVersion sebi.org/v1) and provisions a Knative Service + Trigger for each agent using server-side apply. Stores agent metadata in Redis for peer discovery. Uses a finalizer to clean up Redis keys on deletion.
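
The exact Redis schema is internal to SAIL, but conceptually each registry entry pairs an agent's name with its description so peers can discover it. A hypothetical sketch (the key layout is an assumption):

```yaml
# Hypothetical shape of one registry entry.
# key: agents:creative-writer-agent   (assumed key naming)
name: creative-writer-agent
description: "Generates short creative story drafts"
```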

sail-mcp-server

MCP Tool Server

Stateless HTTP/SSE server exposing a single sendMessage tool via the Model Context Protocol. All agents call this shared endpoint to publish SailMessage objects to the agents-messages Kafka topic.

sail-base-openid

Agent Runtime

Receives CloudEvents from Knative, renders Qute prompt templates with SailMessage context, queries Redis for available agents, then calls OpenAI via LangChain4j. The LLM uses MCP tools to route messages onward.

Built On Technologies You Know

  • Quarkus — Fast JVM / native runtime
  • Knative Serving + Eventing
  • Apache Kafka — Message backbone
  • LangChain4j — LLM abstraction
  • OpenAI API
  • Redis — Agent registry
  • MCP — Model Context Protocol
  • Java 21 — Virtual threads ready
  • JOSDK — Java Operator SDK
  • Qute — Prompt templating
  • Strimzi — In-cluster Kafka
  • Kubernetes 1.28+

~10 Lines of YAML. That's an Agent.

No code required for standard agents. Write a prompt, give it a name, apply it to Kubernetes — SAIL handles the rest.

apiVersion: sebi.org/v1
kind: GenericAgent
metadata:
  name: creative-writer-agent
  namespace: sail
spec:
  description: "Generates short creative story drafts"
  systemMessage: |
    You are a creative writer called
    'creative-writer-agent'.
  userMessage: |
    Generate a 3-sentence story about:
    {sailMessage.inputs['creative-writer-agent']}

    Send the result to an audience editor agent
    using the sendMessage tool. Include the
    original inputs {sailMessage.inputs}.

What Each Field Does

  • description Stored in Redis. Other agents read this to learn what this agent does.
  • systemMessage The LLM's persona and role. Plain text — no templating needed here.
  • userMessage A Qute template. References sailMessage — the incoming message, inputs, and sender name.
  • inputs['key'] Each agent reads its own named input from the shared inputs map passed through the pipeline.
  • sendMessage tool The LLM is instructed to call the MCP tool when it's ready to pass results downstream.
Optional: Set agentClassName to a fully-qualified Java class to use a custom container image. SAIL derives the image name automatically.
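
To make the template references concrete, here is an illustrative incoming message, rendered as YAML for readability (the wire format on the agents-messages topic is JSON). The `inputs` map and sender name are referenced by the templates above; any other field names, and the sample values, are assumptions:

```yaml
# Illustrative message for creative-writer-agent.
# "inputs" and the sender name are what the Qute template reads;
# the sample content is invented.
sender: human
inputs:
  creative-writer-agent: "a lighthouse keeper who finds a message in a bottle"
```

When the Qute template renders, `{sailMessage.inputs['creative-writer-agent']}` resolves to the agent's own entry in that shared map, while `{sailMessage.inputs}` passes the whole map downstream.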

Up and Running in 4 Steps

From zero to a running 3-agent pipeline in under 10 minutes (assuming you have a Kubernetes cluster).

01

Install the cluster stack

One script sets up minikube with Knative Serving, Knative Eventing, Strimzi Kafka, and Redis — fully automated for local dev.

./scripts/install-knative-dev.sh
# Then in a separate terminal:
minikube tunnel --profile sail
02

Install SAIL resources

Applies the CRD, operator, MCP server, and all eventing resources in the correct order, waiting for each to become ready.

./scripts/install-sail-resources.sh \
  --skip-tls \
  sail-kafka-kafka-bootstrap.kafka.svc.cluster.local:9092
03

Create your first agents

Apply the sample 3-agent pipeline: creative writer → audience editor → style editor.

cd sail-operator && make agents-add
# Or apply individually:
kubectl apply -f sail-operator/sample/agents/creative-writer-agent.yaml
04

Trigger the pipeline

Publish a message to the creative-writer-agent via Kafka. Watch all three agents collaborate automatically.

kafka-console-producer.sh \
  --bootstrap-server <bootstrap-server> \
  --topic agents-messages \
  --property parse.headers=true \
  --property headers=content-type:application/json \
  < sail-operator/sample/sail-messages/human-to-creative-writer.json
Read the Full Quickstart Guide →

Ship Your First AI Pipeline Today

SAIL is open source, Kubernetes-native, and designed for engineers who want powerful multi-agent AI without the infrastructure overhead.