The Cycls SDK turns your AI-based code into instant web apps. Your apps are streamed directly from your infrastructure, giving you full control over your data and deployment.

Apps can also seamlessly interact with each other as agents.

With app streaming, you can:

• Turn existing code into instant web apps

• Generate UIs on-the-fly with LLMs

• Enable agency by letting apps talk to each other

• Integrate with any model, framework, or infrastructure

The streaming approach radically simplifies and accelerates the development of AI products.


Installation

pip install cycls

Creating an App

In this example, the @spark app simply echoes the user’s input by reading the message.content string and returning it:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
def app(message):
    return message.content

cycls.push()

The @cycls(handle) decorator registers the app function with the unique handle @spark.

The cycls.push() command publishes the app at cycls.com/@spark:dev in development mode.

Pick a unique handle, as Cycls maintains a global namespace for handle names.

Asynchronous Apps

For improved performance, the app function can be made asynchronous; a complete sketch follows the snippet below.

...
@cycls("@spark")
async def app(message):
    return message.content + "from spark"
...
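
For illustration, here is a complete sketch of an asynchronous app, with asyncio.sleep standing in for any awaitable work such as a model or API call; while the function awaits, the server can handle other sessions concurrently.

from cycls import Cycls
import asyncio

cycls = Cycls()

@cycls("@spark")
async def app(message):
    # Stand-in for non-blocking work (e.g. an LLM or API call)
    await asyncio.sleep(0.1)
    return message.content + " from spark"

cycls.push()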

App State

Developing LLM-based apps often requires session-specific details, such as the session ID and message history, which can be accessed as follows:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(message):
    print(message.id)      # session id
    print(message.history) # message history
    return message.content + "from spark"

cycls.push()
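
As a small usage sketch, these session details can shape the reply. Here message.history is assumed to be a list of {"role", "content"} dicts, as in the LLM example below, so its length gives the number of prior turns:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(message):
    # Assumes message.history is a list of {"role", "content"} dicts
    turn = len(message.history) + 1
    return f"[session {message.id}] turn {turn}: {message.content}"

cycls.push()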

LLM Example

Here is a working implementation of Meta’s open-source llama3-70b model (running on Groq) as a Cycls app.

from cycls import Cycls
from groq import AsyncGroq

cycls = Cycls()
groq = AsyncGroq(api_key="API_KEY")  # replace with your Groq API key

async def groq_llm(x):
    # Request a streaming completion from Groq
    stream = await groq.chat.completions.create(
        messages=x,
        model="llama3-70b-8192",
        temperature=0.5, max_tokens=1024, top_p=1, stop=None,
        stream=True,
    )
    # Yield tokens as they arrive so Cycls can stream them to the client
    async def event_stream():
        async for chunk in stream:
            content = chunk.choices[0].delta.content
            if content:
                yield content
    return event_stream()

@cycls("@groq")
async def groq_app(message):
    history = [{"role": "system", "content": "you are a helpful assistant."}]
    history +=  message.history
    history += [{"role": "user", "content": message.content}]
    return await groq_llm(history)
  
cycls.push()

Try it live at cycls.com/@groq

Visit the cookbook for more examples.

Agents (beta)

The Cycls SDK allows you to call any public app as an agent from within your own app. This interoperability expands your app’s capabilities by integrating functionality from other apps. To discover more apps, see the explore page.

In this example, we’ll create an app that invokes another public app, @groq:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(message):
    return cycls.call("@groq", message.content)

cycls.push()
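
Building on the same cycls.call signature, here is a sketch of a slightly richer pattern in which the calling app rewrites the user’s message into a prompt before delegating to @groq; the prompt wording is illustrative only.

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(message):
    # Illustrative prompt rewrite before delegating to the public @groq app
    prompt = f"Answer concisely: {message.content}"
    return cycls.call("@groq", prompt)

cycls.push()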

Try it live