
Overview

Every agent automatically exposes two streaming API endpoints:
  • OpenAI-compatible endpoint (/chat/completions) - Works with any OpenAI SDK
  • Cycls Protocol endpoint (/chat/cycls) - Native protocol with rich UI components

API Endpoints

Your agent exposes these endpoints:

Local Development (app.local()):
POST http://localhost:8080/chat/completions  # OpenAI-compatible
POST http://localhost:8080/chat/cycls        # Cycls Protocol
Cloud Deployment (app.deploy()):
POST https://<app-name>.cycls.ai/chat/completions  # OpenAI-compatible
POST https://<app-name>.cycls.ai/chat/cycls        # Cycls Protocol

OpenAI-Compatible API

The /chat/completions endpoint follows the standard OpenAI chat completion format, making it compatible with any OpenAI SDK or client.

Request Format

{
  "model": "app",
  "messages": [
    {"role": "user", "content": "Hello, how are you?"}
  ],
  "stream": true
}

Using cURL

curl -X POST http://localhost:8080/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "app",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ],
    "stream": true
  }'

Using the OpenAI Python SDK

from openai import OpenAI

client = OpenAI(
    api_key="not-needed",  # Use your api_token if auth=True
    base_url="http://localhost:8080"
)

response = client.chat.completions.create(
    model="app",
    messages=[
        {"role": "user", "content": "Write a poem about AI"}
    ],
    stream=True
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Using the OpenAI JavaScript SDK

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'not-needed',
  baseURL: 'http://localhost:8080'
});

const stream = await openai.chat.completions.create({
  model: 'app',
  messages: [
    { role: 'user', content: 'Hello from JavaScript!' }
  ],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}

Cycls Protocol

The /chat/cycls endpoint uses Server-Sent Events (SSE) to stream rich UI components including thinking bubbles, code blocks, tables, and more.

Request Format

{
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}

Response Format

The response streams SSE events with JSON payloads:
data: {"type": "text", "text": "Hello! "}
data: {"type": "text", "text": "How can I help?"}
data: {"type": "thinking", "thinking": "Processing..."}
data: {"type": "code", "code": "print('hello')", "language": "python"}
data: [DONE]
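As a sketch of consuming this stream (assuming the `requests` library and the local endpoint shown above), a client reads the response line by line, strips the `data: ` prefix, and JSON-decodes each payload until the `[DONE]` sentinel:

```python
import json

def parse_sse_line(line: str):
    """Decode one SSE data line into a dict; return None for [DONE] or non-data lines."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload == "[DONE]":
        return None
    return json.loads(payload)

def stream_cycls(prompt: str, url: str = "http://localhost:8080/chat/cycls"):
    """Yield decoded event dicts from a running agent's Cycls Protocol endpoint."""
    import requests  # assumed HTTP client: pip install requests

    body = {"messages": [{"role": "user", "content": prompt}]}
    with requests.post(url, json=body, stream=True) as resp:
        for raw in resp.iter_lines(decode_unicode=True):
            event = parse_sse_line(raw or "")
            if event is not None:
                yield event
```

Each yielded event carries a `type` field, which a client can dispatch on to render the matching UI component.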

Message Structure

Assistant responses contain a parts array:
{
  "role": "assistant",
  "parts": [
    {"type": "text", "text": "Here's the answer:"},
    {"type": "thinking", "thinking": "Let me explain..."},
    {"type": "code", "code": "x = 1", "language": "python"}
  ]
}
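A minimal sketch of walking this `parts` array — for example, to flatten an assistant message into plain text for a terminal client (handling of types other than `text`, `thinking`, and `code` is an assumption here):

```python
def render_parts(message: dict) -> str:
    """Flatten an assistant message's parts array into plain text (sketch)."""
    out = []
    for part in message.get("parts", []):
        kind = part.get("type")
        if kind == "text":
            out.append(part["text"])
        elif kind == "thinking":
            out.append(f"[thinking] {part['thinking']}")
        elif kind == "code":
            out.append(f"```{part.get('language', '')}\n{part['code']}\n```")
        else:
            out.append(f"[{kind}]")  # fall back to a type placeholder
    return "\n".join(out)

msg = {
    "role": "assistant",
    "parts": [
        {"type": "text", "text": "Here's the answer:"},
        {"type": "thinking", "thinking": "Let me explain..."},
        {"type": "code", "code": "x = 1", "language": "python"},
    ],
}
print(render_parts(msg))
```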

Supported Component Types

Type       Fields                   Description
text       text                     Markdown text
thinking   thinking                 Reasoning bubble
code       code, language           Code block
table      headers, rows            Data table
callout    callout, style, title    Alert box
image      src, alt, caption        Image
status     status                   Progress indicator
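For example, a table component event uses the `headers` and `rows` fields (the exact value shapes shown here are an assumption for illustration):

```json
{"type": "table", "headers": ["Name", "Score"], "rows": [["Ada", "95"], ["Linus", "88"]]}
```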

Authentication

Public Access (auth=False)

If your agent is public, API endpoints are open:
@cycls.app(auth=False)
async def app(context):
    yield "Hello!"

Protected Access (auth=True)

If your agent requires auth, include a Bearer token:
@cycls.app(auth=True, api_token="sk-your-token")
async def app(context):
    yield f"Hello, {context.user.name}!"

Then pass the token in the Authorization header:
curl -X POST http://localhost:8080/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-token" \
  -d '{
    "model": "app",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
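The same token works with the OpenAI SDKs: pass it as the API key, and the client sends it as the Bearer token. A sketch using the Python SDK against a local protected agent (requires the `openai` package and a running agent, so the call is wrapped in a function):

```python
def bearer_headers(api_token: str) -> dict:
    """The Authorization header the SDK sends when api_key is set to your api_token."""
    return {"Authorization": f"Bearer {api_token}"}

def stream_protected(prompt: str, token: str = "sk-your-token",
                     base_url: str = "http://localhost:8080"):
    """Stream a completion from a protected agent (needs a live agent to run)."""
    from openai import OpenAI  # pip install openai

    client = OpenAI(api_key=token, base_url=base_url)
    response = client.chat.completions.create(
        model="app",
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    for chunk in response:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="")
```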

Next Steps

User Authentication

Learn how to secure your agent with built-in user authentication.