Integrating Privatemode AI with Jan

Jan runs AI locally for offline, private conversations. Privatemode extends Jan with frontier cloud models while keeping the same privacy guarantee through end-to-end encryption and confidential computing.


Introduction

Use cloud AI in Jan without compromising your privacy


Offline-first setup, now with cloud power

Jan is designed for users who want full control over their AI conversations. But local models are limited by your hardware. Privatemode lets you access frontier cloud models while maintaining the privacy-first approach Jan users value.

Your conversations stay encrypted

Every message is encrypted end-to-end before leaving your machine. The Privatemode proxy handles encryption and remote attestation locally, so the AI provider never sees your conversations in plaintext. Privatemode never stores or learns from your data.

Native remote model support

Jan has built-in support for custom OpenAI-compatible remote endpoints via the 'Any OpenAI Compatible API' option. Add Privatemode in the model settings and your chat interface works instantly with encrypted cloud models; no extensions are needed.

Benefits

Why use Privatemode AI with Jan?

End-to-end encrypted AI conversations

Integrating Privatemode into Jan brings cloud-AI capabilities to your desktop chat while keeping your conversations confidential. Every prompt and response is protected by end-to-end encryption and confidential computing. Privatemode never learns from your data.

Access frontier models beyond local limits

Local models are constrained by your device's memory and GPU. With Privatemode, Jan users can access state-of-the-art LLMs that far exceed what local hardware can run, while keeping the same privacy guarantees.

Built-in remote model support

Jan natively supports connecting to OpenAI-compatible remote endpoints. Add Privatemode in the model settings and all chat features work instantly. Your local models stay available for offline use alongside Privatemode.

How to get started

How to set up Privatemode in Jan


Get your API key


If you don't have a Privatemode API key yet, you can generate one for free on the Privatemode website.

Run the Privatemode proxy

docker run -p 8080:8080 ghcr.io/edgelesssys/privatemode/privatemode-proxy:latest --apiKey <your-api-key>

The proxy verifies the integrity of the Privatemode service using confidential computing-based remote attestation. It also encrypts all data before sending requests and decrypts the responses it receives.
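Once the proxy is running, you can sanity-check it with any OpenAI-compatible client before pointing Jan at it. A minimal sketch using only Python's standard library (the endpoint path follows the OpenAI API convention; the model name "latest" is a placeholder, check the Privatemode docs for the models actually available):

```python
import json
import urllib.request

# Assumes the Privatemode proxy from the previous step is listening on
# localhost:8080. "latest" is a placeholder model name; see the
# Privatemode docs for available models.
PROXY_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "latest",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the proxy is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this returns a completion, the proxy is attested, encrypting traffic, and ready for Jan to use.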

Configure Jan to use Privatemode
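The exact UI flow may vary between Jan versions, but based on the proxy setup above, the remote provider settings likely look like this (the base URL and model name are assumptions to adapt to your setup):

```
Provider:  Any OpenAI Compatible API
Base URL:  http://localhost:8080/v1
API key:   any placeholder value (authentication is handled by the
           proxy's --apiKey flag, not by Jan)
Model:     a model name from the Privatemode docs
```

Your local models remain available for offline use alongside the new Privatemode provider.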

FAQ

Frequently asked questions about using Privatemode with Jan

What changes are needed in Jan to use Privatemode?

Minimal changes. Jan natively supports custom OpenAI-compatible endpoints via the 'Any OpenAI Compatible API' remote model option. You add Privatemode as a remote provider, set the endpoint URL and model name, and the chat interface works as usual with encrypted cloud models.

Integrations

View more

Explore other Privatemode integrations

Integration

n8n

Add cloud-grade AI to your n8n workflows while keeping all data encrypted end-to-end through confidential computing.


Integration

PrivateGPT

Use Privatemode's encrypted AI API with PrivateGPT to chat with your documents without exposing sensitive files to any cloud provider.


Talk to an expert

Have a question about Privatemode? Let's talk!
