This guide walks you through integrating OpenAI with Cycls to create an asynchronous application that can leverage OpenAI’s powerful language models for tasks such as text generation, summarization, and more.

Prerequisites

To use OpenAI's API, you will need an API key. You can get one by creating an account on the OpenAI platform and generating a key from the API keys page in your dashboard. Once you have a key, add it to a .env file:

OPENAI_API_KEY=<your_openai_api_key>
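
The app in this guide also reads a Cycls secret key from the same file (see step 3), so add it alongside your OpenAI key:

CYCLS_SECRET_KEY=<your_cycls_secret_key>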

Integrating with OpenAI

By following these steps, you can integrate OpenAI with Cycls to create an asynchronous application that processes and responds to user messages using OpenAI’s API.

1. Set Up Your Environment

First, install the necessary dependencies. Make sure you have the python-dotenv package to load environment variables and the openai package for interacting with the OpenAI API.

pip install openai python-dotenv
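
This guide also assumes the Cycls SDK is installed. If it isn't, it is typically installed with pip as well (the package name below is an assumption; check the Cycls documentation if it differs):

pip install cycls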

2. Import Required Modules

Import the necessary modules: AsyncApp from cycls, load_dotenv from dotenv, and the OpenAI client class from openai. Also import os to access environment variables.

from cycls import AsyncApp
from dotenv import load_dotenv
from openai import OpenAI
import os

3. Load Environment Variables

Use load_dotenv to load your environment variables from a .env file. Ensure you have CYCLS_SECRET_KEY and OPENAI_API_KEY defined in your .env file.

load_dotenv()

secret = os.getenv("CYCLS_SECRET_KEY")
openai_api_key = os.getenv("OPENAI_API_KEY")
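
As an optional safeguard (not part of the original example), you can fail fast if either variable is missing:

if not secret or not openai_api_key:
    raise RuntimeError("CYCLS_SECRET_KEY and OPENAI_API_KEY must be set in your .env file")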

4. Initialize the OpenAI Client

Create an instance of the OpenAI client using the API key from your environment variables.

client = OpenAI(api_key=openai_api_key)
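
Optionally, you can sanity-check your key with a one-off request before wiring up the app. This snippet is purely illustrative and uses the same chat completions call as the handler later in this guide:

test = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}]
)
print(test.choices[0].message.content)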

5. Initialize the AsyncApp

Create an instance of AsyncApp with the necessary parameters.

app = AsyncApp(
    secret=secret,
    handler="@your-handler",
)

6. Define the Entry Point

Define the entry point function for your application. This function will process incoming messages and use the OpenAI API to generate a response.

@app
async def entry_point(context):
    received_message = context.message.content.text

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": received_message}
        ]
    )

    await context.send.text(response.choices[0].message.content)
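
If you want to steer the model's behavior, a common pattern is to prepend a system message. The sketch below is a hypothetical variant you could use in place of the handler above; the system prompt text is just an example:

@app
async def entry_point(context):
    received_message = context.message.content.text

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a concise, helpful assistant."},
            {"role": "user", "content": received_message}
        ]
    )

    await context.send.text(response.choices[0].message.content)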

7. Publish Your App

Finally, publish your app by calling the publish method.

app.publish()

8. Run Your App

Save the code in a file named main.py and run it from that directory:

python main.py

Complete Example

Here is the complete code for integrating OpenAI with Cycls:

from cycls import AsyncApp
from dotenv import load_dotenv
from openai import OpenAI
import os

load_dotenv()

secret = os.getenv("CYCLS_SECRET_KEY")
openai_api_key = os.getenv("OPENAI_API_KEY")

client = OpenAI(api_key=openai_api_key)

app = AsyncApp(
    secret=secret,
    handler="@your-handler",
)

@app
async def entry_point(context):
    received_message = context.message.content.text

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": received_message}
        ]
    )

    await context.send.text(response.choices[0].message.content)

app.publish()
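
API calls can fail due to rate limits or network errors, so in a production app you may want to guard the OpenAI call. One option, sketched below with a hypothetical safe_completion helper, is to catch exceptions and fall back to a generic reply:

def safe_completion(prompt: str) -> str:
    # Call the chat completions API; fall back to a generic reply on failure
    try:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}]
        )
        return response.choices[0].message.content
    except Exception:
        return "Sorry, something went wrong. Please try again."

The entry point would then reply with await context.send.text(safe_completion(received_message)).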