Integrating Privatemode AI with AnythingLLM

AnythingLLM turns documents into AI-powered workspaces, but cloud LLM providers see every query in plaintext. Privatemode encrypts all workspace data end-to-end while giving you access to state-of-the-art models.


Introduction

Use cloud AI in AnythingLLM without exposing your workspace data


AI workspaces need powerful models

AnythingLLM lets you build custom AI workspaces for document Q&A, agents, and RAG. Cloud LLM providers deliver the best results, but they process your documents and queries in plaintext. For sensitive business data, that's a dealbreaker.

Encrypted document interaction

Privatemode provides an OpenAI-compatible endpoint that AnythingLLM connects to via the Generic OpenAI provider. Every document chunk, embedding, and chat message is encrypted end-to-end. The AI provider never sees your workspace data in plaintext.

Hardware-enforced, not just policy-enforced

Based on confidential computing, Privatemode processes your data inside a hardware-enforced enclave. This is verifiable through remote attestation and open-source code, going beyond standard cloud providers' privacy policies.

Benefits

Why use Privatemode AI with AnythingLLM?

End-to-end encrypted workspace AI

Integrating Privatemode into AnythingLLM brings cloud-AI capabilities to your document workspaces while keeping your data confidential. Every query, document chunk, and agent action is protected by end-to-end encryption and confidential computing. Privatemode never learns from your data.

State-of-the-art model selection

With Privatemode, you can choose from state-of-the-art LLMs to power AnythingLLM's chat, RAG, and agent features, all running confidentially inside hardware-enforced enclaves.

Native Generic OpenAI support

AnythingLLM has built-in support for custom OpenAI-compatible providers via the Generic OpenAI option. Set the Base URL to your Privatemode proxy and all workspace features work instantly, no plugins needed.
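Because the proxy exposes a standard OpenAI-compatible API, you can also call it directly to see what AnythingLLM will send through it. The sketch below assumes the proxy is running locally on port 8080; `<model-name>` is a placeholder for a model offered by Privatemode (check the Privatemode docs for the current model list).

```shell
# Send a chat completion request through the local Privatemode proxy.
# The proxy encrypts the request before it leaves your machine and
# decrypts the response it gets back.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<model-name>",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

This is the same request shape AnythingLLM produces once its Base URL points at the proxy, which is why no plugin is needed.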

How to get started

How to set up Privatemode in AnythingLLM


Get your API key


If you don't have a Privatemode API key yet, you can generate one for free on the Privatemode website.

Run the Privatemode proxy


docker run -p 8080:8080 ghcr.io/edgelesssys/privatemode/privatemode-proxy:latest --apiKey <your-api-key>

The proxy verifies the integrity of the Privatemode service using remote attestation based on confidential computing. It also encrypts all data before sending it and decrypts every response it receives.
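To confirm the proxy is up, you can query its OpenAI-compatible model list (a quick check, assuming the default port 8080 from the command above):

```shell
# List the models available through the Privatemode proxy.
# A JSON model list in the response means the proxy is running
# and has successfully attested the Privatemode service.
curl http://localhost:8080/v1/models
```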


Set up AnythingLLM


Download AnythingLLM from the official website and open the application on your computer.

Configure AnythingLLM to use Privatemode


In AnythingLLM, open the settings and select the Generic OpenAI LLM provider. Set the Base URL to your local Privatemode proxy (typically http://localhost:8080/v1), enter a model name offered by Privatemode, and save. All workspace chat now runs through the encrypted proxy.
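If you run AnythingLLM via Docker instead of the desktop app, the provider settings can also be supplied as environment variables. The variable names below follow AnythingLLM's .env conventions for the Generic OpenAI provider; treat them as an assumption and verify them against your AnythingLLM version.

```shell
# Hypothetical .env fragment for AnythingLLM's Generic OpenAI provider.
LLM_PROVIDER='generic-openai'
GENERIC_OPEN_AI_BASE_PATH='http://localhost:8080/v1'  # local Privatemode proxy
GENERIC_OPEN_AI_MODEL_PREF='<model-name>'             # a model offered by Privatemode
GENERIC_OPEN_AI_API_KEY='not-needed'                  # the real key lives in the proxy
```

The API key field can hold a dummy value because authentication to Privatemode happens in the proxy, which you started with your actual key.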

FAQ

Frequently asked questions about using Privatemode with AnythingLLM

Do I need a plugin or custom integration to use Privatemode with AnythingLLM?

No. AnythingLLM supports custom OpenAI-compatible endpoints natively via the Generic OpenAI provider. You set the Base URL to your Privatemode proxy endpoint and enter the model name. All workspace features, including chat, RAG, and agents, work automatically.

Integrations

View more

Explore other Privatemode integrations

Integration

n8n

Add cloud-grade AI to your n8n workflows while keeping all data encrypted end-to-end through confidential computing.

Read guide

Integration

PrivateGPT

Use Privatemode's encrypted AI API with PrivateGPT to chat with your documents without exposing sensitive files to any cloud provider.

Read guide

Talk to an expert

Have a question about Privatemode? Let's talk!

Want to look for yourself?