## Documentation Index

Fetch the complete documentation index at: https://docs.cycls.com/llms.txt

Use this file to discover all available pages before exploring further.
## Overview

Every agent automatically exposes two streaming API endpoints:

- **OpenAI-compatible endpoint** (`/chat/completions`) - works with any OpenAI SDK
- **Cycls Protocol endpoint** (`/chat/cycls`) - native protocol with rich UI components
## API Endpoints

Your agent exposes these endpoints:

Local Development (`app.local()`):

```
POST http://localhost:8080/chat/completions   # OpenAI-compatible
POST http://localhost:8080/chat/cycls         # Cycls Protocol
```

Cloud Deployment (`app.deploy()`):

```
POST https://<app-name>.cycls.ai/chat/completions   # OpenAI-compatible
POST https://<app-name>.cycls.ai/chat/cycls         # Cycls Protocol
```
## OpenAI-Compatible API

The `/chat/completions` endpoint follows the standard OpenAI chat completion format, making it compatible with any OpenAI SDK or client.

Request body:

```json
{
  "model": "app",
  "messages": [
    {"role": "user", "content": "Hello, how are you?"}
  ],
  "stream": true
}
```
### Using cURL

```bash
curl -X POST http://localhost:8080/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "app",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ],
    "stream": true
  }'
```
### Using the OpenAI Python SDK

```python
from openai import OpenAI

client = OpenAI(
    api_key="not-needed",  # use your api_token if auth=True
    base_url="http://localhost:8080"
)

response = client.chat.completions.create(
    model="app",
    messages=[
        {"role": "user", "content": "Write a poem about AI"}
    ],
    stream=True
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
### Using the OpenAI JavaScript SDK

```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'not-needed',
  baseURL: 'http://localhost:8080'
});

const stream = await openai.chat.completions.create({
  model: 'app',
  messages: [
    { role: 'user', content: 'Hello from JavaScript!' }
  ],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```
## Cycls Protocol

The `/chat/cycls` endpoint uses Server-Sent Events (SSE) to stream rich UI components, including thinking bubbles, code blocks, tables, and more.

Request body:

```json
{
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}
```
The response streams SSE events with JSON payloads:

```
data: {"type": "text", "text": "Hello! "}
data: {"type": "text", "text": "How can I help?"}
data: {"type": "thinking", "thinking": "Processing..."}
data: {"type": "code", "code": "print('hello')", "language": "python"}
data: [DONE]
```
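The framing above can be consumed with a small line parser. A minimal sketch, assuming only the `data:` prefix and `[DONE]` sentinel shown in the example stream (the commented-out HTTP wiring uses the third-party `requests` library and is illustrative, not part of the Cycls SDK):

```python
import json

def parse_sse_line(line):
    """Parse one line of a Cycls Protocol SSE stream.

    Returns the decoded event dict, or None for blank lines,
    comment/keep-alive lines, and the final [DONE] sentinel.
    """
    line = line.strip()
    if not line.startswith("data:"):
        return None
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return None
    return json.loads(payload)

# Wiring it to the endpoint (e.g. with `requests`):
#
# with requests.post("http://localhost:8080/chat/cycls",
#                    json={"messages": [{"role": "user", "content": "Hello!"}]},
#                    stream=True) as resp:
#     for line in resp.iter_lines(decode_unicode=True):
#         event = parse_sse_line(line or "")
#         if event and event["type"] == "text":
#             print(event["text"], end="")
```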
### Message Structure

Assistant responses contain a `parts` array:

```json
{
  "role": "assistant",
  "parts": [
    {"type": "text", "text": "Here's the answer:"},
    {"type": "thinking", "thinking": "Let me explain..."},
    {"type": "code", "code": "x = 1", "language": "python"}
  ]
}
```
### Supported Component Types

| Type | Fields | Description |
|---|---|---|
| `text` | `text` | Markdown text |
| `thinking` | `thinking` | Reasoning bubble |
| `code` | `code`, `language` | Code block |
| `table` | `headers`, `rows` | Data table |
| `callout` | `callout`, `style`, `title` | Alert box |
| `image` | `src`, `alt`, `caption` | Image |
| `status` | `status` | Progress indicator |
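To make the schema concrete, here is a hypothetical assistant message mixing several of the component types above. The field names come straight from the table; the part contents (cities, temperatures) are invented for illustration:

```python
# A sample assistant message using the parts schema from the table above.
message = {
    "role": "assistant",
    "parts": [
        {"type": "text", "text": "Here are the results:"},
        {"type": "table",
         "headers": ["City", "Temp (°C)"],
         "rows": [["Riyadh", 41], ["Jeddah", 36]]},
        {"type": "callout",
         "callout": "Temperatures are daily highs.",
         "style": "info",
         "title": "Note"},
        {"type": "status", "status": "Done"},
    ],
}
```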
## Authentication

### Public Access (auth=False)

If your agent is public, the API endpoints are open:

```python
@cycls.app(auth=False)
async def app(context):
    yield "Hello!"
```

### Protected Access (auth=True)

If your agent requires auth, include a Bearer token with every request:

```python
@cycls.app(auth=True, api_token="sk-your-token")
async def app(context):
    yield f"Hello, {context.user.name}!"
```
```bash
curl -X POST http://localhost:8080/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-token" \
  -d '{
    "model": "app",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```
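From Python, the same Bearer header can be attached programmatically. A minimal stdlib sketch, using the placeholder URL and `sk-your-token` values from above (`build_request` is a hypothetical helper for this page, not part of the Cycls SDK):

```python
import json
import urllib.request

def build_request(url, token, messages):
    """Build an authenticated chat-completions request with a Bearer token."""
    body = json.dumps({"model": "app", "messages": messages, "stream": True})
    return urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# req = build_request("http://localhost:8080/chat/completions",
#                     "sk-your-token",
#                     [{"role": "user", "content": "Hello!"}])
# with urllib.request.urlopen(req) as resp:  # streams the SSE body
#     for line in resp:
#         print(line.decode("utf-8"), end="")
```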
## Next Steps

- **User Authentication** - learn how to secure your agent with built-in user authentication.