Jan runs AI locally for offline, private conversations. Privatemode extends Jan with frontier cloud models while keeping the same privacy guarantee through end-to-end encryption and confidential computing.
Introduction
Jan is designed for users who want full control over their AI conversations. But local models are limited by your hardware. Privatemode lets you access frontier cloud models while maintaining the privacy-first approach Jan users value.
Every message is encrypted end-to-end before leaving your machine. The Privatemode proxy handles encryption and remote attestation locally, so the AI provider never sees your conversations in plaintext. Privatemode never stores or learns from your data.
Jan has built-in support for custom OpenAI-compatible remote endpoints via the 'Any OpenAI Compatible API' option. Add Privatemode in the model settings and your chat interface works instantly with encrypted cloud models, no extensions needed.
Benefits
Integrating Privatemode into Jan brings cloud-AI capabilities to your desktop chat while keeping your conversations confidential. Every prompt and response is protected by end-to-end encryption and confidential computing. Privatemode never learns from your data.
Local models are constrained by your device's memory and GPU. With Privatemode, Jan users can access state-of-the-art LLMs that far exceed what local hardware can run, while keeping the same privacy guarantees.
Jan natively supports connecting to OpenAI-compatible remote endpoints. Add Privatemode in the model settings and all chat features work instantly. Your local models stay available for offline use alongside Privatemode.
How to get started
If you don't have a Privatemode API key yet, you can generate one for free on the Privatemode website. Then start the Privatemode proxy locally with Docker:
docker run -p 8080:8080 ghcr.io/edgelesssys/privatemode/privatemode-proxy:latest --apiKey <your-api-key>
The proxy verifies the integrity of the Privatemode service using confidential computing-based remote attestation. The proxy also encrypts all data before sending and decrypts data it receives.
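With the proxy running, any OpenAI-style client can talk to it over localhost while the proxy handles encryption and attestation transparently. The sketch below, using only the Python standard library, builds a chat completion request against the proxy; the /v1 path and the placeholder model name are assumptions based on the proxy exposing a standard OpenAI-compatible API, so check your deployment's documentation for the exact values.

```python
import json
from urllib import request

# The Privatemode proxy listens on localhost; encryption and remote
# attestation happen inside the proxy, so this client only ever sends
# plaintext to the local process, never to the cloud.
PROXY_URL = "http://localhost:8080/v1/chat/completions"  # assumed path


def build_payload(prompt: str, model: str = "latest") -> dict:
    """Build an OpenAI-style chat completion request body.

    "latest" is a placeholder model name, not a confirmed
    Privatemode model ID.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(prompt: str) -> str:
    """Send a prompt through the local proxy and return the reply text."""
    req = request.Request(
        PROXY_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the proxy speaks the OpenAI wire format, this same shape of request is what Jan itself sends when you point it at the proxy as a custom endpoint.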
FAQ
How much do I need to change in Jan to use Privatemode?
Minimal changes. Jan natively supports custom OpenAI-compatible endpoints via the 'Any OpenAI Compatible API' remote model option. You add Privatemode as a remote provider, set the endpoint URL and model name, and the chat interface works as usual with encrypted cloud models.
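As a sketch, the relevant Jan settings map roughly as follows. The URL assumes the proxy runs locally on its default port and exposes the standard OpenAI /v1 path; exact field labels may differ between Jan versions, and the model ID should be taken from the Privatemode documentation rather than the placeholder shown here.

API Base URL: http://localhost:8080/v1
API Key: any placeholder value (authentication is handled by the local proxy, which already holds your real key)
Model ID: a model name offered by the Privatemode service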