LangChain is a powerful framework for developing applications powered by language models. It simplifies the process of chaining together components such as LLMs, prompts, and memory. This guide shows you how to combine LangChain's flexibility with Cycls' easy deployment to create robust AI agents.

## Documentation Index

Fetch the complete documentation index at: https://docs.cycls.com/llms.txt and use it to discover all available pages before exploring further.
## Prerequisites

- Python 3.9+
- `cycls` package installed
- Docker installed (for local testing)
- OpenAI API key
Note: This guide uses OpenAI, but LangChain and Cycls support many providers including Anthropic, Google Gemini, Mistral, Cohere, and more. Simply swap the pip dependency and model name.
## Step 1: Create the Agent

Create a new file called `app.py`:
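A minimal sketch of the LangChain half of the agent. The `langchain-openai` and `langchain-core` calls below are real APIs; the prompt wording and the choice of a simple LCEL chain are illustrative assumptions, not the only way to structure the agent.

```python
# app.py — LangChain portion (sketch)
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

load_dotenv()  # reads OPENAI_API_KEY from the .env file created in Step 2

# Compose prompt -> model -> plain-string output with LCEL's pipe operator
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}"),
])
chain = prompt | ChatOpenAI(model="gpt-4o") | StrOutputParser()
```

The chain can then be invoked with `chain.invoke({"input": "..."})` (or `ainvoke` in async code); the Cycls wiring is shown in the Full Code section below.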
## Step 2: Configure Environment

Create a `.env` file in the same directory to store your API key:
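The file holds one line per secret; replace the placeholder with your real key:

```
OPENAI_API_KEY=sk-...
```

`load_dotenv()` in `app.py` reads this file at startup, so the key never needs to be hard-coded.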
## Step 3: Run the Agent

Execute your agent:
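Assuming `app.py` is the entry point as created in Step 1, run it directly; the expectation that Cycls builds and serves the agent locally (with Docker running) is based on the prerequisites above:

```shell
python app.py
```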
## Full Code

Here is the complete `app.py` file:
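A sketch of the complete file. The LangChain imports and chain are real APIs; the Cycls parts (`cycls.Agent`, the decorator, `message.content`, `app.run()`) are assumptions for illustration — confirm the exact interface against the documentation index above.

```python
# app.py — complete file (sketch)
import cycls  # Cycls calls below are assumed shapes, not verified signatures
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

load_dotenv()  # loads OPENAI_API_KEY from .env

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}"),
])
chain = prompt | ChatOpenAI(model="gpt-4o") | StrOutputParser()

app = cycls.Agent()  # assumed constructor

@app
async def handler(message):
    # Run the chain on the incoming user message and return the reply text
    return await chain.ainvoke({"input": message.content})

if __name__ == "__main__":
    app.run()  # assumed: serves the agent locally via Docker
```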
## Using Other LLM Providers

Swap the dependency and model name to use a different provider:

| Provider | Pip Package | Model Example |
|---|---|---|
| OpenAI | langchain-openai | gpt-4o |
| Anthropic | langchain-anthropic | claude-sonnet-4-5-20250929 |
| Google | langchain-google-genai | gemini-3.0-pro |
| Mistral | langchain-mistralai | mistral-large-latest |
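For example, switching the chain above to Anthropic changes only the dependency and the model constructor; everything else in the chain stays the same. `ChatAnthropic` is the real class name in `langchain-anthropic`, and it reads `ANTHROPIC_API_KEY` from the environment:

```python
# pip install langchain-anthropic
from langchain_anthropic import ChatAnthropic

# Drop-in replacement for ChatOpenAI in the chain
llm = ChatAnthropic(model="claude-sonnet-4-5-20250929")
```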
## Deploy to Cloud

To deploy to production, set your Cycls API key and call `app.deploy()`:
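A sketch of the deployment step. The `CYCLS_API_KEY` environment-variable name and the behavior of `deploy()` are assumptions based on the text above — verify both against the Cycls documentation.

```python
import os

# Assumed variable name — set it in your shell or .env rather than in code
os.environ.setdefault("CYCLS_API_KEY", "your-cycls-api-key")

app.deploy()  # assumed: pushes the agent from the local project to Cycls' cloud
```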