Getting Started
- Download Jan from jan.ai for macOS, Windows, or Linux and install it.
- Open the model hub within Jan and download a recommended model suited to your hardware capabilities.
- Start a new conversation and begin chatting with fully private, offline AI — no data leaves your device.
- Explore the extension system to add features like RAG, API connectivity, and remote model support.
Key Features
- 100% offline operation runs entirely on your local machine with no internet connection required after model download.
- Privacy-first design ensures no conversation data is ever sent to external servers or telemetry services.
- Built-in model hub provides one-click downloads of popular models optimized for various hardware configurations.
- Extension system adds functionality through plugins for RAG, remote models, TensorRT acceleration, and more.
- OpenAI-compatible API serves your local models through a familiar HTTP interface for application integration (a minimal sketch follows this list).
- Cross-platform and open source, available on all major desktop platforms, with the full source code released under AGPLv3.
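Because the local server speaks the OpenAI wire format, it can be driven with the official openai Python client pointed at Jan instead of api.openai.com. The snippet below is a minimal sketch, not Jan's documented setup: the base URL http://localhost:1337/v1, the placeholder API key, and the model ID llama3.2-3b-instruct are assumptions; substitute the host, port, and model ID shown in your own Jan settings and model hub.

```python
# Minimal sketch: chatting with a model served by Jan's local
# OpenAI-compatible API via the official openai Python client.
# Assumes the local API server is enabled in Jan and a model has
# already been downloaded from the model hub.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1337/v1",  # assumed default; check the port in Jan's settings
    api_key="not-needed-locally",         # placeholder; a local server may not require a real key
)

response = client.chat.completions.create(
    model="llama3.2-3b-instruct",  # hypothetical model ID; use one installed in your model hub
    messages=[
        {"role": "user", "content": "Summarize why local inference helps privacy."}
    ],
)

print(response.choices[0].message.content)
```

Since the interface mirrors the OpenAI API, existing integrations typically only need their base URL changed to target the local server.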
Related Tools
- LM Studio (AI / Local & Self-Hosted, free): Desktop app for running local LLMs with a clean GUI
- Ollama (AI / Local & Self-Hosted, open source): Run large language models locally with a simple CLI
- Aider (AI / AI Coding Tools, open source): Terminal-based AI pair programmer that edits code in your git repo