Getting Started
- Self-host Dify using Docker Compose by running docker compose up -d from the cloned repository (see the setup sketch after this list), or sign up for Dify Cloud.
- Configure your LLM provider API keys in the settings panel to connect Claude, GPT-4, or other models.
- Create a new app using the visual workflow builder, choosing from chatbot, text generator, or agent templates.
- Add knowledge bases for RAG by uploading documents, and publish your app with a shareable URL or API endpoint.
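A minimal self-hosting sketch, assuming the usual layout of the Dify repository (a docker/ directory holding docker-compose.yaml and an .env.example); check the repository README for the current steps:

```bash
# Clone the repository and enter the Docker setup directory
git clone https://github.com/langgenius/dify.git
cd dify/docker

# Copy the example environment file and adjust values as needed
cp .env.example .env

# Start all services in the background
docker compose up -d
```

Once the containers are up, the web console is typically reachable on the host's port 80; the exposed port is an assumption here and is configured in .env.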
Key Features
- Visual workflow builder enables drag-and-drop construction of complex LLM pipelines with branching and iteration.
- Built-in RAG engine with document ingestion, chunking, embedding, and retrieval for knowledge-grounded responses.
- Multi-model support connects to hundreds of LLMs across dozens of providers and lets you switch models without changing app logic.
- Agent mode builds autonomous agents with tool access, code execution, and web browsing capabilities.
- One-click deployment publishes apps as web interfaces, chat widgets, or API endpoints instantly (see the API call sketch after this list).
- Self-hostable with full Docker support and enterprise features for data privacy and access control.
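For the API endpoint route, a hedged sketch of calling a published chat app over HTTP. The path and body fields follow Dify's chat-messages API; the base URL and the app-... API key are placeholders you would copy from the app's API access panel:

```bash
# Replace the base URL and API key with the values shown in your app's API access panel
curl -X POST 'https://api.dify.ai/v1/chat-messages' \
  -H 'Authorization: Bearer app-YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
        "inputs": {},
        "query": "Summarize our refund policy.",
        "response_mode": "blocking",
        "user": "demo-user"
      }'
```

For a self-hosted deployment, the base URL points at your own instance rather than api.dify.ai.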
Related Tools
- CrewAI (AI / Agents & Automation): Framework for orchestrating autonomous AI agent teams (open source)
- LangChain (AI / Agents & Automation): Framework for building LLM-powered applications and agents (open source)
- n8n (AI / Agents & Automation): Open-source workflow automation platform with AI agent capabilities