Getting started
Cycls SDK turns your AI-based code into instant web apps.
Your apps are streamed directly from your infrastructure, giving you full control over your data and deployment. Apps can seamlessly interact with each other, acting as both clients and servers in the network.
With app streaming, you can:
• Turn existing code into instant web apps
• Generate UIs on-the-fly with LLMs
• Allow apps to call and utilize each other
• Integrate with any model, framework, or infrastructure
The streaming architecture radically simplifies and accelerates the development of AI applications. Learn more in What is Cycls?.
Creating an App
In this example, the `@spark` app simply echoes the user's input by reading the `message.content` string and returning it:
```python
from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
def app(message):
    return message.content

cycls.push()
```
The `@cycls(handle)` decorator registers the app function under the unique handle `@spark`. Calling `cycls.push()` publishes the app at cycls.com/@spark:dev in development mode.
Asynchronous Apps
For improved performance, the app function can be made asynchronous:
```python
...

@cycls("@spark")
async def app(message):
    return message.content

...
```
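The async app above returns a complete string, but the LLM example later on this page returns an async generator so output can stream chunk by chunk. Stripped of the SDK, that pattern is just the following minimal sketch (the word-splitting stream and the `collect` helper are illustrative, not part of Cycls):

```python
import asyncio

# An async generator that yields output in chunks, the same shape
# the LLM example on this page returns from its handler.
async def stream_words(text):
    for word in text.split():
        yield word + " "

# Drains the generator into one string, standing in for the framework.
async def collect(gen):
    return "".join([chunk async for chunk in gen])

print(asyncio.run(collect(stream_words("hello world"))))  # prints "hello world "
```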
App State
Developing LLM-based apps often requires session-specific details, such as the session ID and the message history, which can be accessed as follows:
```python
from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(message):
    print(message.id)       # session id
    print(message.history)  # message history
    return message.content

cycls.push()
```
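These two fields are enough for simple per-session state. A minimal sketch, assuming an in-process dict keyed by the session id (the `SESSIONS` store and `record_turn` helper are illustrative, not part of the SDK):

```python
# Illustrative in-memory store, keyed by the session id from message.id.
SESSIONS = {}

def record_turn(session_id, content):
    # Append this turn and return how many turns the session has seen.
    SESSIONS.setdefault(session_id, []).append(content)
    return len(SESSIONS[session_id])

print(record_turn("abc123", "hello"))  # prints 1
print(record_turn("abc123", "again"))  # prints 2
```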
LLM Example
Here is a working implementation of Meta's open-source `llama3-70b` model (running on Groq) as a Cycls app:
```python
from cycls import Cycls
from groq import AsyncGroq

cycls = Cycls(api_key="API_KEY")
groq = AsyncGroq(api_key="API_KEY")

async def groq_llm(x):
    stream = await groq.chat.completions.create(
        messages=x,
        model="llama3-70b-8192",
        temperature=0.5,
        max_tokens=1024,
        top_p=1,
        stop=None,
        stream=True,
    )

    async def event_stream():
        async for chunk in stream:
            content = chunk.choices[0].delta.content
            if content:
                yield content

    return event_stream()

@cycls("@spark")
async def groq_app(message):
    history = [{"role": "system", "content": "you are a helpful assistant."}]
    history += message.history
    history += [{"role": "user", "content": message.content}]
    return await groq_llm(history)

cycls.push()
```
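The history assembly inside `groq_app` can be exercised on its own. A minimal sketch, using a hypothetical `Message` stand-in that only mimics the fields the SDK exposes (`content`, `history`, `id`):

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for the SDK's message object (not the real class);
# it only mimics the fields used above.
@dataclass
class Message:
    content: str
    history: list = field(default_factory=list)
    id: str = "session-1"

def build_history(message):
    # Same assembly as in groq_app: system prompt, prior turns, new user turn.
    history = [{"role": "system", "content": "you are a helpful assistant."}]
    history += message.history
    history += [{"role": "user", "content": message.content}]
    return history

msg = Message(content="hello",
              history=[{"role": "user", "content": "hi"},
                       {"role": "assistant", "content": "hey"}])
print(len(build_history(msg)))  # prints 4: system + 2 prior turns + new user turn
```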
Try it live at cycls.com/@spark.
Visit the cookbook for more examples.
Check out this Replit video to publish a chat app in under 60 seconds.