AnythingLLM turns documents into AI-powered workspaces, but cloud LLM providers see every query in plaintext. Privatemode encrypts all workspace data end-to-end while giving you access to state-of-the-art models.

Introduction
AnythingLLM lets you build custom AI workspaces for document Q&A, agents, and RAG. Cloud LLM providers deliver the best results, but they process your documents and queries in plaintext. For sensitive business data, that's a dealbreaker.
Privatemode provides an OpenAI-compatible endpoint that AnythingLLM connects to via the Generic OpenAI provider. Every document chunk, embedding, and chat message is encrypted end-to-end. The AI provider never sees your workspace data in plaintext.
Privatemode is built on confidential computing: it processes your data inside a hardware-enforced enclave. You can verify this through remote attestation and open-source code, a guarantee that goes beyond standard cloud providers' privacy policies.
Benefits
Integrating Privatemode into AnythingLLM brings cloud-AI capabilities to your document workspaces while keeping your data confidential. Every query, document chunk, and agent action is protected by end-to-end encryption and confidential computing, and your data is never used for training.
With Privatemode, you can choose from state-of-the-art LLMs to power AnythingLLM's chat, RAG, and agent features, all running confidentially inside hardware-enforced enclaves.
AnythingLLM has built-in support for custom OpenAI-compatible providers via the Generic OpenAI option. Set the Base URL to your Privatemode proxy and all workspace features work out of the box; no plugins needed.
How to get started
If you don't have a Privatemode API key yet, you can generate one for free here. Then start the Privatemode proxy locally with Docker:
docker run -p 8080:8080 ghcr.io/edgelesssys/privatemode/privatemode-proxy:latest --apiKey <your-api-key>
The proxy verifies the integrity of the Privatemode service using confidential computing-based remote attestation. The proxy also encrypts all data before sending and decrypts data it receives.
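Once the proxy is running, you can send a quick test request to its OpenAI-compatible API before wiring up AnythingLLM. The endpoint paths follow the OpenAI API convention, and the model name below is a placeholder; query the models route to see what is actually available:

```shell
# The proxy exposes an OpenAI-compatible API on localhost:8080 (port mapped above).
BASE_URL="http://localhost:8080/v1"

# List available models (use one of these names instead of the placeholder below).
curl -s "$BASE_URL/models" || echo "proxy not reachable"

# Minimal chat completion request; the proxy encrypts it end-to-end before forwarding.
curl -s "$BASE_URL/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{"model": "<model-name>", "messages": [{"role": "user", "content": "Hello"}]}' \
  || echo "proxy not reachable"
```

If both calls succeed, the same Base URL is what you will enter in AnythingLLM.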

Download AnythingLLM from the official website, install it, and open the application. In the settings, choose Generic OpenAI as the LLM provider, set the Base URL to your Privatemode proxy endpoint, and enter the model name.
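If you deploy AnythingLLM via Docker instead of the desktop app, the same Generic OpenAI settings can be supplied through environment variables. The variable names below are assumptions based on common AnythingLLM .env conventions; verify them against the .env reference of your AnythingLLM version:

```shell
# Hypothetical .env fragment for AnythingLLM's Generic OpenAI provider;
# check the exact variable names in your AnythingLLM version's documentation.
LLM_PROVIDER='generic-openai'
GENERIC_OPEN_AI_BASE_PATH='http://localhost:8080/v1'   # your Privatemode proxy
GENERIC_OPEN_AI_MODEL_PREF='<model-name>'              # model served by Privatemode
GENERIC_OPEN_AI_API_KEY='dummy'                        # the proxy holds the real key
GENERIC_OPEN_AI_MODEL_TOKEN_LIMIT=4096                 # adjust to the model's context size
```

Because the proxy already carries your Privatemode API key (passed via --apiKey above), the API key field in AnythingLLM does not need the real credential.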
FAQ
Do I need a plugin or custom connector to use Privatemode with AnythingLLM?
No. AnythingLLM supports custom OpenAI-compatible endpoints natively via the Generic OpenAI provider. You set the Base URL to your Privatemode proxy endpoint and enter the model name. All workspace features, including chat, RAG, and agents, work automatically.
