With Privatemode, you can bring state-of-the-art cloud-based AI to your n8n workflows while keeping full control over your data.
Many users run n8n locally to maintain control over their data. These users face a dilemma when they want to add AI to their workflows: use the power of cloud-based AI and give up privacy, or use limited self-hosted AI and retain privacy.
With Privatemode, users no longer have to choose. Based on confidential computing, Privatemode gives n8n users the strength of cloud-based AI while keeping their data always encrypted and protected—even during processing.
Integrating Privatemode into n8n brings powerful AI capabilities to your workflows while keeping your data fully confidential through end-to-end encryption and confidential computing. Plus, Privatemode is designed to never learn from your data.
With Privatemode, you can choose from state-of-the-art LLMs to power your workflows.
You can integrate Privatemode into your workflow in just minutes. The setup is straightforward and fully compatible with all native AI nodes in n8n.
If you don’t have a Privatemode API key yet, you can get a free trial key here.
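# Start the Privatemode proxy locally; it exposes an OpenAI-compatible API on port 8080.
# Replace <your-api-token> with your Privatemode API key.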
docker run -p 8080:8080 \
  ghcr.io/edgelesssys/continuum/continuum-proxy:latest \
  --apiKey <your-api-token>
The proxy verifies the integrity of the Privatemode service using confidential computing-based remote attestation. It also encrypts all data before sending it and decrypts all data it receives.
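Before wiring the proxy into n8n, you can quickly check that it is reachable. As a minimal sketch, assuming the proxy is running locally on port 8080 and forwards the standard OpenAI-compatible /v1/models endpoint, listing the available models also tells you which model names you can select later:

# List the models served through the Privatemode proxy
curl http://localhost:8080/v1/models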
Follow the official n8n tutorial to set up an AI agent. Since Privatemode AI adheres to the OpenAI interface specification, you can use the standard OpenAI Chat Model node as outlined in the tutorial. However, you'll need to reconfigure the credentials used in Step 5 of the tutorial. Beyond the AI Agent node, the OpenAI Chat Model node also works with several other powerful AI nodes, including:
Adjust the credentials for the OpenAI Chat Model node. This ensures that no data is sent to OpenAI; instead, all requests go to your local Privatemode proxy, which establishes an end-to-end encrypted channel to our confidential AI service.
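As a rough guide (the exact field names may differ between n8n versions), the OpenAI credential should point at your local proxy rather than at OpenAI. Since the proxy already attaches your Privatemode API key, the key field in n8n only needs a placeholder value. Assuming n8n can reach the proxy on localhost (if n8n itself runs in Docker, use the address where the proxy is reachable from that container):

Base URL: http://localhost:8080/v1
API Key:  <any placeholder value>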
Set the parameters of the OpenAI Chat Model node to use the correct credentials and select the model you want to use:
You've successfully integrated Confidential AI into your workflow.
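As an optional sanity check outside of n8n, you can send a test chat request directly to the proxy. This is a sketch only; the model name is a placeholder, so use one of the models returned by the /v1/models call above:

# Send a test chat completion through the Privatemode proxy
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "<model-name>",
    "messages": [{"role": "user", "content": "Hello from Privatemode!"}]
  }'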