Cycls SDK turns your code into AI apps with one simple function. Your apps are streamed directly from your infrastructure, giving you full control over your data and deployment.

These apps can also be called as agents within other apps, offering native interoperability.

With streaming, you can:

• Access and share apps online

• Generate intuitive user interfaces

• Call apps as agents within your code

• Use any model, framework, or infrastructure

The streaming approach radically simplifies and accelerates the development cycles of AI apps and agents.


Creating an App

In this example, the app simply responds with the user’s input followed by “from spark”. This is achieved with just one function:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark") # unique handle name
def app(x):
    return x.content + " from spark"

cycls.push()

The cycls.push() command publishes the app in development mode as @spark:dev at cycls.com/@spark:dev. Remember to choose a unique handle, as Cycls maintains a global namespace for handles.

Asynchronous Apps

For improved performance, the function can be made asynchronous.

...
@cycls("@spark")
async def app(x):
    return x.content + " from spark"
...

By using async, the app can handle many requests concurrently, which is essential under heavy load.
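To see why this matters (independent of Cycls), the sketch below uses a hypothetical handler with simulated I/O latency: three awaited requests complete in roughly the time of one, because the event loop interleaves them while each is waiting.

```python
import asyncio
import time

# Hypothetical handler simulating a slow I/O call (e.g. a model API).
async def handle(request: str) -> str:
    await asyncio.sleep(0.1)  # stand-in for network latency
    return request + " from spark"

async def main() -> list[str]:
    start = time.perf_counter()
    # Three requests run concurrently: total time is ~0.1s, not ~0.3s.
    results = await asyncio.gather(*(handle(f"req{i}") for i in range(3)))
    assert time.perf_counter() - start < 0.3
    return results

print(asyncio.run(main()))
```

A synchronous handler would serve these requests one after another; the async version lets a single process keep many sessions in flight.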

App State

AI apps often need session-specific details, such as the session ID and message history, which can be accessed as follows:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(x):
    print(x.id)      # session id
    print(x.history) # message history
    return x.content + " from spark"

cycls.push()
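As an illustration (not part of the SDK), x.history can drive simple session logic. The sketch below assumes the history is a list of role/content dicts, as in the LLM example that follows; user_turns is a hypothetical helper.

```python
# Hypothetical helper: count prior user turns in a history given as
# a list of {"role": ..., "content": ...} dicts.
def user_turns(history):
    return sum(1 for m in history if m.get("role") == "user")

history = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "hello"},
    {"role": "user", "content": "how are you?"},
]
print(user_turns(history))  # → 2
```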

LLM Example

Here is a complete example of Meta’s open-source Llama 3 70B model running on Groq as a Cycls app.

from cycls import Cycls
from groq import AsyncGroq

cycls = Cycls()
groq = AsyncGroq(api_key="API_KEY")

async def groq_llm(x):
    stream = await groq.chat.completions.create(
        messages=x,
        model="llama3-70b-8192",
        temperature=0.5, max_tokens=1024, top_p=1, stop=None, 
        stream=True,
    )
    async def event_stream():
        async for chunk in stream:
            content = chunk.choices[0].delta.content
            if content:
                yield content
    return event_stream()

@cycls("@groq")
async def groq_app(message):
    history = [{"role": "system", "content": "you are a helpful assistant."}]
    history += message.history
    history += [{"role": "user", "content": message.content}]
    return await groq_llm(history)
  
cycls.push()
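Note that groq_llm returns an async generator, which yields the response token by token. Independent of Cycls, such a stream is consumed with async for; the minimal sketch below uses a stand-in event_stream that just yields fixed chunks.

```python
import asyncio

# Stand-in for the token stream that groq_llm returns.
async def event_stream():
    for token in ["hello", " ", "world"]:
        yield token

async def collect() -> str:
    chunks = []
    # Consume the stream chunk by chunk, as the Cycls runtime would.
    async for chunk in event_stream():
        chunks.append(chunk)
    return "".join(chunks)

print(asyncio.run(collect()))  # → hello world
```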

Try it live at cycls.com/@groq

Visit the cookbook for more examples.

Agents (WIP)

Cycls SDK allows you to call any public app as an agent within your own app. This interoperability expands your app’s capabilities by integrating functionality from other apps. To discover more apps, see the explore page.

In this example, we’ll create an app that invokes another public app, @groq:

from cycls import Cycls

cycls = Cycls()

@cycls("@spark")
async def app(x):
    return cycls.call("@groq", x.content)

cycls.push()

Try it live