LangChain is a framework for developing applications powered by language models, and it can be integrated with Cycls to create powerful conversational applications.

Prerequisites

To use LangChain, you will need an OpenAI API key. You can get one by creating an account on the OpenAI platform. Once you have a key, add it to a .env file:

OPENAI_API_KEY=<YOUR_API_KEY>
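For reference, load_dotenv (used later in this guide) essentially reads KEY=VALUE lines from this file into the process environment. A minimal stdlib-only sketch of that behavior (the function name load_env_file is illustrative, not part of python-dotenv):

```python
import os

def load_env_file(path=".env"):
    """Rough sketch of what python-dotenv's load_dotenv does."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blank lines and comments
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            # like load_dotenv's default, don't overwrite variables
            # that are already set in the environment
            os.environ.setdefault(key.strip(), value.strip())
```

In practice you should use python-dotenv itself, which also handles quoting, export prefixes, and multiline values.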

Integrating with LangChain

By following these steps, you can integrate LangChain with Cycls to create an asynchronous application that processes and responds to user messages using LangChain’s capabilities.

1

Set Up Your Environment

First, ensure you have the necessary packages installed. You'll need cycls, langchain-openai, and python-dotenv for managing environment variables.

pip install cycls python-dotenv langchain-openai
2

Import Required Modules

Import the necessary modules: AsyncApp from Cycls, load_dotenv from dotenv, and ChatOpenAI from langchain_openai. Also, import os to access environment variables.

from cycls import AsyncApp
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
import os
3

Load Environment Variables

Use load_dotenv to load your environment variables from a .env file. Ensure you have CYCLS_SECRET_KEY and OPENAI_API_KEY defined in your .env file.

load_dotenv()

secret = os.getenv("CYCLS_SECRET_KEY")
openai_api_key = os.getenv("OPENAI_API_KEY")
4

Initialize the Chat Model

Create an instance of the ChatOpenAI chat model using the API key from your environment variables.

llm = ChatOpenAI(api_key=openai_api_key)
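Note that ChatOpenAI's invoke and ainvoke methods return a message object rather than a plain string; the reply text lives on its content attribute, which is why the handler below forwards response.content. A hypothetical stand-in class to illustrate the shape (FakeAIMessage is not a real LangChain type):

```python
from dataclasses import dataclass

@dataclass
class FakeAIMessage:
    # stand-in for the AIMessage object that ChatOpenAI returns
    content: str

reply = FakeAIMessage(content="Hello! How can I help?")
text = reply.content  # the plain string to send back to the user
```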
5

Initialize the AsyncApp

Create an instance of AsyncApp with the necessary parameters.

app = AsyncApp(
    secret=secret,
    handler="@your-handler",
)
6

Define the Entry Point

Define the entry point function for your application. This function processes incoming messages and uses the LangChain chat model to generate a response.

@app
async def entry_point(context):
    received_message = context.message.content.text

    # use ainvoke, the async counterpart of invoke, inside the async handler
    response = await llm.ainvoke(received_message)

    # the model returns a message object; its text is in the content attribute
    await context.send.text(response.content)

7

Publish Your App

Finally, publish your app by calling the publish method.

app.publish()
8

Run Your App

Assuming your code is saved as main.py, run it from that directory:

python main.py

Complete Example

Here is the complete code for integrating LangChain with Cycls:

from cycls import AsyncApp
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
import os

load_dotenv()

secret = os.getenv("CYCLS_SECRET_KEY")
openai_api_key = os.getenv("OPENAI_API_KEY")

app = AsyncApp(
    secret=secret,
    handler="@your-handler",
)

llm = ChatOpenAI(api_key=openai_api_key)

@app
async def entry_point(context):
    received_message = context.message.content.text

    response = await llm.ainvoke(received_message)

    await context.send.text(response.content)

app.publish()