## Documentation Index

Fetch the complete documentation index at https://docs.cycls.com/llms.txt and use it to discover all available pages before exploring further.
This guide shows you how to integrate the `google-genai` SDK into your Cycls agent to access Gemini models. You will learn how to:
- Define dependencies for the Docker environment.
- Configure the Gemini API client inside the agent handler.
- Stream responses from Gemini back to the user.
## Prerequisites

- Python 3.9+
- The `cycls` package installed
- A Google AI Studio API key
- Docker installed (for local testing)
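The Python prerequisites above can be checked and installed from a terminal (package names are taken from the imports used later in this guide):

```shell
# Verify the Python version prerequisite (3.9+)
python3 -c 'import sys; assert sys.version_info >= (3, 9), sys.version'

# Install the cycls and google-genai packages
pip install cycls google-genai
```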
## Step 1: Create the Agent

Create a new file called `app.py`:
```python
import cycls

@cycls.app(pip=["google-genai"], copy=[".env"])
async def app(context):
    import os
    from google import genai

    # Convert messages to Gemini format
    contents = [
        {
            'role': 'model' if m['role'] == 'assistant' else 'user',
            'parts': [{'text': m['content']}]
        }
        for m in context.messages if m['role'] != 'system'
    ]

    # Initialize Gemini client
    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))

    # Stream the response
    async for chunk in await client.aio.models.generate_content_stream(
        model="gemini-2.5-flash",
        contents=contents,
        config={'system_instruction': "You are a helpful AI assistant."}
    ):
        if chunk.text:
            yield chunk.text

app.local()
```
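The role-mapping logic in the handler can be checked in isolation. This sketch uses a hypothetical conversation in the `{'role': ..., 'content': ...}` shape the handler reads from `context.messages`:

```python
# Sample conversation in the shape the handler reads from
# context.messages (hypothetical data for illustration)
messages = [
    {'role': 'system', 'content': 'You are a helpful AI assistant.'},
    {'role': 'user', 'content': 'Hi!'},
    {'role': 'assistant', 'content': 'Hello, how can I help?'},
]

# Same conversion as in the agent: drop system messages,
# rename 'assistant' to 'model', and wrap text in 'parts'
contents = [
    {
        'role': 'model' if m['role'] == 'assistant' else 'user',
        'parts': [{'text': m['content']}]
    }
    for m in messages if m['role'] != 'system'
]

print(contents)
# → [{'role': 'user', 'parts': [{'text': 'Hi!'}]},
#    {'role': 'model', 'parts': [{'text': 'Hello, how can I help?'}]}]
```

The system message is excluded here because it is passed separately through `system_instruction` in the request config.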
## Step 2: Add Your API Key

Create a `.env` file in the same directory with your API key:

```
GEMINI_API_KEY=your_api_key_here
```
## Step 3: Run the Agent

Execute your agent script:

```shell
python app.py
```

The agent will start locally and provide an endpoint for interaction.
## Full Code

Here is the complete `app.py` file:
```python
import cycls

@cycls.app(pip=["google-genai"], copy=[".env"])
async def app(context):
    import os
    from google import genai

    # Convert messages to Gemini format
    contents = [
        {
            'role': 'model' if m['role'] == 'assistant' else 'user',
            'parts': [{'text': m['content']}]
        }
        for m in context.messages if m['role'] != 'system'
    ]

    # Initialize Gemini client
    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))

    # Stream the response
    async for chunk in await client.aio.models.generate_content_stream(
        model="gemini-2.5-flash",
        contents=contents,
        config={'system_instruction': "You are a helpful AI assistant."}
    ):
        if chunk.text:
            yield chunk.text

app.local()
```
## Deploy to Cloud

To deploy to production:
```python
import cycls
import os

cycls.api_key = os.getenv("CYCLS_API_KEY")

@cycls.app(pip=["google-genai"], copy=[".env"])
async def app(context):
    import os  # re-imported inside the handler, matching the local example
    from google import genai

    contents = [
        {
            'role': 'model' if m['role'] == 'assistant' else 'user',
            'parts': [{'text': m['content']}]
        }
        for m in context.messages if m['role'] != 'system'
    ]

    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))

    async for chunk in await client.aio.models.generate_content_stream(
        model="gemini-2.5-flash",
        contents=contents,
        config={'system_instruction': "You are a helpful AI assistant."}
    ):
        if chunk.text:
            yield chunk.text

app.deploy()
```
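A typical deployment run might look like the following sketch. The environment variable names come from the code above; the key values themselves must come from your Cycls account and Google AI Studio:

```shell
# The deploy script reads your Cycls key from the environment
export CYCLS_API_KEY=your_cycls_key_here

# GEMINI_API_KEY still travels with the copied .env file;
# running the script builds and deploys the agent
python app.py
```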