What You Need
For the automated local dev path (recommended), you need the tools below. For an existing cluster, see the Knative Installation Guide.
| Tool | Version | Notes |
|---|---|---|
| minikube | latest | Local Kubernetes cluster |
| kubectl | 1.28+ | Kubernetes CLI |
| Java | 21+ | Required to build from source |
| OpenAI API key | — | For LLM inference in agents |
| 4 CPUs + 8 GB RAM | — | Minimum for the minikube VM |
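As a quick preflight, the snippet below reports whether the required CLI tools are on your `PATH`. This is a minimal sketch only; it does not check minimum versions.

```shell
# Preflight check: report which required tools are installed.
# (Sketch only; does not verify minimum versions.)
for tool in minikube kubectl java; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```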
Install the Cluster Stack
The provided script automates a complete local development environment using minikube. It installs Knative Serving, Knative Eventing, Strimzi Kafka, Redis, and all the required Knative Kafka components.
```shell
chmod +x scripts/install-knative-dev.sh
./scripts/install-knative-dev.sh
```
The script performs these steps (no interaction required):
- Creates a minikube cluster named `sail` (4 CPUs, 8 GB RAM)
- Installs Knative Serving CRDs and core (`v1.21.1`)
- Installs Kourier as the networking layer
- Configures `sslip.io` magic DNS
- Installs Knative Eventing CRDs and core
- Installs the Kafka controller (`eventing-kafka-broker`)
- Installs Kafka data planes (Broker, KafkaSource, KafkaSink, KafkaChannel)
- Installs the Strimzi Kafka operator and creates a single-node cluster
- Installs a single-pod Redis and creates the `sail-redis-host` secret
- Verifies all components are ready
After the script completes, open a new terminal and keep the tunnel running:
```shell
minikube tunnel --profile sail
```
Install SAIL Resources
The `install-sail-resources.sh` script installs all SAIL components onto your cluster in the correct order and waits for each to become ready. It applies: the CRD, the operator deployment, the MCP server, and all Knative eventing resources (KafkaSource → EventTransform → Broker → Trigger).
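To make the eventing chain concrete, here is a hedged sketch of what a per-agent Trigger can look like, built from the broker name (`agent-broker`), filter attribute (`targetagent`), and service naming (`<agent-name>-svc`) described elsewhere in this guide; the shipped manifests may differ in detail.

```yaml
# Illustrative sketch only; field values inferred from the message-flow
# description in this guide, not copied from the shipped manifests.
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: creative-writer-agent
  namespace: sail
spec:
  broker: agent-broker
  filter:
    attributes:
      targetagent: creative-writer-agent
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: creative-writer-agent-svc
```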
Local dev (in-cluster Strimzi, no TLS)
```shell
chmod +x scripts/install-sail-resources.sh
./scripts/install-sail-resources.sh \
  --skip-tls \
  sail-kafka-kafka-bootstrap.kafka.svc.cluster.local:9092
```
External Kafka with mTLS (e.g. Aiven)
```shell
# First create the TLS secret:
kubectl create secret generic kafka-tls \
  --namespace sail \
  --from-file=ca.pem=ca.pem \
  --from-file=service.cert=service.cert \
  --from-file=service.key=service.key

# Then install SAIL:
./scripts/install-sail-resources.sh \
  my-kafka.aivencloud.com:18981
```
Create the required secrets
SAIL needs two Kubernetes secrets in the sail namespace.
Create them before or after running the install script — the operator
will pick them up when it starts.
```shell
# OpenAI API key
kubectl create secret generic openai-api-key \
  --namespace sail \
  --from-literal=OPENAI_API_KEY=sk-...

# Redis host (auto-created by install-knative-dev.sh for local dev)
kubectl create secret generic sail-redis-host \
  --namespace sail \
  --from-literal=QUARKUS_REDIS_HOSTS=redis://<host>:6379
```
The `install-knative-dev.sh` script creates the `sail-redis-host` secret automatically for local dev, so you only need to create `openai-api-key` manually.
Create Your First Agents
SAIL ships with a sample 3-agent creative writing pipeline:
- `creative-writer-agent` — Generates a short story draft
- `audience-editor-agent` — Adapts the story for a target audience
- `style-editor-agent` — Applies a style tone to the final draft
Apply all three agents at once
```shell
cd sail-operator
make agents-add
```
Or apply agents individually
```shell
kubectl apply -f sail-operator/sample/agents/creative-writer-agent.yaml
kubectl apply -f sail-operator/sample/agents/audience-editor-agent.yaml
kubectl apply -f sail-operator/sample/agents/style-editor-agent.yaml
```
Verify agents are running
```shell
kubectl get genericagents -n sail
# NAME                    AGE
# audience-editor-agent   30s
# creative-writer-agent   30s
# style-editor-agent      30s

kubectl get ksvc -n sail
# Each agent gets a Knative Service named <agent-name>-svc
```
What an agent YAML looks like
```yaml
apiVersion: sebi.org/v1
kind: GenericAgent
metadata:
  name: creative-writer-agent
  namespace: sail
spec:
  description: "Generates short creative story drafts"
  systemMessage: You are a creative writer called 'creative-writer-agent'.
  userMessage: |
    Generate a draft of a story no more than 3 sentences long
    around the topic: {sailMessage.inputs['creative-writer-agent']}
    Send the story to an audience editor agent using the sendMessage tool.
    Include the original inputs {sailMessage.inputs} in the SailMessage.
```
| Field | Required | Description |
|---|---|---|
| `description` | Recommended | Stored in Redis; used by the LLM to discover this agent |
| `systemMessage` | Yes | System prompt for the LLM |
| `userMessage` | Yes | Qute template; receives `sailMessage` as context |
| `agentClassName` | No | Fully-qualified Java class for a custom image |
| `useMemory` | No | Retain conversation history across calls |
Trigger the Pipeline
Send a message to `creative-writer-agent` by publishing to the
`agents-messages` Kafka topic. The sample payload includes inputs
for all three agents — each agent reads its own entry by name.
Sample message payload
```json
{
  "to": "creative-writer-agent",
  "from": "human",
  "inputs": {
    "creative-writer-agent": "landing on the moon",
    "audience-editor-agent": "kids",
    "style-editor-agent": "serious"
  }
}
```
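Before publishing, you can sanity-check the payload locally. A minimal sketch, assuming `python3` is available; it writes the sample message to a temp file so the check is self-contained:

```shell
# Write the sample payload to a temp file and verify the routing field.
# (Local sanity check only; python3 assumed to be on PATH.)
cat > /tmp/sail-message.json <<'EOF'
{
  "to": "creative-writer-agent",
  "from": "human",
  "inputs": {
    "creative-writer-agent": "landing on the moon",
    "audience-editor-agent": "kids",
    "style-editor-agent": "serious"
  }
}
EOF

# Print the routing target the EventTransform will extract (data.to):
python3 -c 'import json; print(json.load(open("/tmp/sail-message.json"))["to"])'
# → creative-writer-agent
```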
Publish via kafka-console-producer
```shell
kafka-console-producer.sh \
  --bootstrap-server <bootstrap-server> \
  --topic agents-messages \
  --producer.config client.properties \
  --property parse.headers=true \
  --property headers=content-type:application/json \
  < sail-operator/sample/sail-messages/human-to-creative-writer.json
```
For local dev with in-cluster Strimzi, use the bootstrap address from your
minikube cluster: `sail-kafka-kafka-bootstrap.kafka.svc.cluster.local:9092`
(or port-forward it to localhost).
Watch the pipeline run
```shell
# Watch all agent pods activate
kubectl get pods -n sail -w

# Stream logs from a specific agent
kubectl logs -n sail -l serving.knative.dev/service=creative-writer-agent-svc -f
```
Understanding the Message Flow
Every message follows this exact path through SAIL's infrastructure. Understanding it will help you debug, extend, and optimize your pipelines.
```
human-to-creative-writer.json
       │
       ▼  (produced to Kafka topic "agents-messages")
┌──────────────┐
│ KafkaSource  │  Converts Kafka record to CloudEvent
└──────┬───────┘
       │
       ▼
┌──────────────────┐
│ EventTransform   │  JSONata extracts data.to
│                  │  Sets CE attribute: targetagent=creative-writer-agent
└────────┬─────────┘
         │
         ▼
┌──────────────────┐
│ Knative Broker   │  agent-broker
└────────┬─────────┘
         │  Routes to matching Trigger
         ▼
┌──────────────────────────────────────────────┐
│ Trigger: creative-writer-agent               │
│   filter: targetagent=creative-writer-agent  │
└──────────────────┬───────────────────────────┘
                   │
                   ▼
┌────────────────────────────────────────────┐
│ Knative Service: creative-writer-agent-svc │
│                                            │
│ 1. Receives CloudEvent                     │
│ 2. Deserializes SailMessage                │
│ 3. Renders Qute prompt template            │
│ 4. Queries Redis: GET genericagent:*       │
│    → learns about audience-editor,         │
│      style-editor                          │
│ 5. Calls OpenAI via LangChain4j            │
│ 6. LLM calls sendMessage(                  │
│      to="audience-editor-agent",           │
│      payload="Once upon a time..."         │
│    )                                       │
└──────────────────┬─────────────────────────┘
                   │
                   ▼
┌───────────────────┐
│ sail-mcp-server   │  Publishes new SailMessage to Kafka
└─────────┬─────────┘
          │
          ▼  (loop repeats for audience-editor, then style-editor)
agents-messages Kafka topic
```
Prompt Templating with Qute
The `userMessage` field is a Qute template, evaluated at runtime with the incoming `SailMessage` as context:

```
{sailMessage.payload}              → the message text or story content
{sailMessage.from}                 → sender name (e.g. "creative-writer-agent")
{sailMessage.to}                   → this agent's name
{sailMessage.inputs['some-key']}   → a named input value
{sailMessage.inputs}               → all inputs as a map (passed through pipeline)
```
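To see how these expressions resolve, the sketch below evaluates them by hand against the sample message used to trigger the pipeline. This is illustrative only; the real rendering is done by Qute inside the agent runtime.

```shell
# Resolve the template expressions by hand against the sample message.
# (Illustrative; Qute performs the actual rendering in the agent.)
python3 - <<'EOF'
sail_message = {
    "to": "creative-writer-agent",
    "from": "human",
    "inputs": {"creative-writer-agent": "landing on the moon"},
}
# {sailMessage.to} and {sailMessage.from}
print(sail_message["to"])    # creative-writer-agent
print(sail_message["from"])  # human
# {sailMessage.inputs['creative-writer-agent']}
print(sail_message["inputs"]["creative-writer-agent"])  # landing on the moon
EOF
```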
Agent Discovery via Redis
When the operator reconciles a GenericAgent CR, it writes:
```
KEY:   genericagent:sail:creative-writer-agent
VALUE: {"name":"creative-writer-agent","description":"Generates short creative story drafts"}
```
At runtime, each agent reads all `genericagent*` keys from Redis and
appends the list to its prompt. The LLM sees which agents are available and decides
where to route next — no hardcoded routing, no service mesh, no static routing configuration.
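The key/value convention can be sketched in plain shell. This is illustrative only; the operator writes these entries to Redis, not to stdout.

```shell
# Build the registry key and value for an agent, following the
# genericagent:<namespace>:<name> convention shown above.
NAMESPACE=sail
NAME=creative-writer-agent
DESCRIPTION="Generates short creative story drafts"

KEY="genericagent:${NAMESPACE}:${NAME}"
VALUE="{\"name\":\"${NAME}\",\"description\":\"${DESCRIPTION}\"}"

echo "$KEY"    # → genericagent:sail:creative-writer-agent
echo "$VALUE"
```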
What to Build Next
Write your own agent
Copy any sample YAML, change the `name`, `description`,
`systemMessage`, and `userMessage` fields, and apply it.
Your agent will automatically appear in the Redis registry and be discoverable
by every other agent in the namespace.
Build a custom agent image
Extend the base runtime with your own Java class. Set `agentClassName`
in the CR and SAIL will derive the container image name automatically:

```yaml
spec:
  agentClassName: org.sebi.MyCustomAgent
  # SAIL derives image: ghcr.io/sebi/mycustomagent:latest
```
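Assuming the derivation is simply the class's simple name lowercased (an inference from the sample comment above, not verified against the operator), the rule can be reproduced in shell:

```shell
# Derive the container image name from agentClassName, assuming the rule
# is: simple class name, lowercased, under ghcr.io/sebi.
# (Inferred from the sample comment; not verified against the operator.)
AGENT_CLASS="org.sebi.MyCustomAgent"
SIMPLE_NAME="${AGENT_CLASS##*.}"   # strip the package prefix → MyCustomAgent
IMAGE="ghcr.io/sebi/$(printf '%s' "$SIMPLE_NAME" | tr '[:upper:]' '[:lower:]'):latest"
echo "$IMAGE"   # → ghcr.io/sebi/mycustomagent:latest
```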
Add memory to an agent
Enable conversation history retention across invocations:
```yaml
spec:
  useMemory: true
  systemMessage: You are a research assistant with memory.
  userMessage: "Answer: {sailMessage.payload}"
```
Use an external Kafka cluster
See the Kafka SSL/TLS Configuration guide for connecting SAIL to a managed Kafka service (e.g. Aiven, Confluent).
Deploy to production
The only changes for production are pointing at your real cluster, creating the secrets with production credentials, and applying the SAIL resources. Knative handles auto-scaling; Kafka handles durability and backpressure.
- Knative Installation Guide — minikube, YAML-based, or Quickstart plugin
- SAIL Resources Deep-Dive — every Kubernetes resource explained
- Full README — complete reference documentation
Secrets Reference
All credentials are supplied via Kubernetes Secrets — never hardcoded in manifests or code.
| Secret Name | Namespace | Key(s) | Used by |
|---|---|---|---|
| `openai-api-key` | `sail` | `OPENAI_API_KEY` | Agent runtime (injected by operator) |
| `sail-redis-host` | `sail` | `QUARKUS_REDIS_HOSTS` | Operator + agent runtime |
| `kafka-tls` | `sail` | `ca.pem`, `service.cert`, `service.key` | KafkaSource (TLS) |
| `broker-secret` | `knative-eventing` | `protocol`, `ca.crt`, `user.crt`, `user.key` | Knative Kafka Broker |
| `agents-kafka-secrets` | `sail` | `keystore`, `trustore` | Agent runtime (Kafka mTLS) |