Simple and secure way to interact with remote MCP Servers in Trusted Execution Environments (TEEs)
🤔 Problems Identified
⚡️ Solution
In current LLM workflows, the agent itself can already run fully secured inside a TEE (e.g. Nillion), but the LLM tooling remains the one component that isn't secure.
MCP Protocol Integration
We adopted the MCP Protocol to secure LLM tooling, and built a framework that lets any MCP tool written in TypeScript be deployed to a TEE. This closes the gap and gives end-to-end security across the entire solution.
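To make the idea concrete, here is a minimal sketch of the tool shape such a framework can work with. The names (`Tool`, `ToolRegistry`, `echo_search`) are illustrative assumptions, not our actual API:

```typescript
// Hypothetical sketch of an MCP-style tool and dispatcher.
// Names and shapes are illustrative, not the framework's real API.

type ToolResult = { content: string };

interface Tool {
  name: string;
  description: string;
  // Handler receives already-validated arguments.
  handler: (args: Record<string, unknown>) => Promise<ToolResult>;
}

class ToolRegistry {
  private tools = new Map<string, Tool>();

  register(tool: Tool): void {
    this.tools.set(tool.name, tool);
  }

  async call(name: string, args: Record<string, unknown>): Promise<ToolResult> {
    const tool = this.tools.get(name);
    if (!tool) throw new Error(`Unknown tool: ${name}`);
    return tool.handler(args);
  }
}

// Example: a trivial stand-in for a search tool.
const registry = new ToolRegistry();
registry.register({
  name: "echo_search",
  description: "Echoes the query back (stand-in for e.g. Brave Search)",
  handler: async (args) => ({ content: `results for: ${args.query}` }),
});

registry.call("echo_search", { query: "TEE" }).then((r) => console.log(r.content));
```

Because each tool is just a named handler, the same bundle can be built once and shipped into a TEE image unchanged.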
Merlin Deployment
To achieve a fully secure deployment, each service is individually deployed to TEEs using Merlin, a system optimized for secure execution environments.
Demo Implementation and Technologies
We've implemented a functional demo website showcasing integrations with four specific tools:
GitHub
Slack
Nillion DB/RAG (Retrieval-Augmented Generation)
Brave Search
The MCP server we've developed is highly versatile and can run on Claude, Windsurf, Cursor, and other platforms supporting the MCP Protocol. Configuration for these integrations is openly available on GitHub.
Smart Fixes and Notable Hacks
Cloudflare’s mcp-remote: An MCP client typically requires a local proxy, but Cloudflare’s recently released mcp-remote enables direct connections to a remote MCP server. The feature was only a few days old at implementation time, and we got it working by following a guide published just one day earlier.
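On the client side, wiring a host like Claude Desktop to a remote server via mcp-remote boils down to a config entry along these lines (the server name and URL are placeholders):

```json
{
  "mcpServers": {
    "remote-tools": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-mcp-server.example.com/sse"]
    }
  }
}
```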
Custom Dockerfiles: Given how new the mcp-remote feature was, we wrote custom Dockerfiles and ported several tool-specific functions to ensure compatibility.
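A representative Dockerfile for one of the TypeScript MCP services looks roughly like this; the paths and entrypoint are illustrative, not our exact build:

```dockerfile
# Illustrative build for a TypeScript MCP server (paths are placeholders).
FROM node:20-slim
WORKDIR /app

# Install dependencies first to keep layer caching effective.
COPY package*.json ./
RUN npm ci

# Build the TypeScript sources and run the compiled server.
COPY . .
RUN npm run build
CMD ["node", "dist/index.js"]
```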
SSL Encryption Workaround: We hit SSL encryption issues while deploying to Merlin, and worked around them by using Myzork to bypass the SSL layer.
Python Integration via FastAPI: Nillion RAG initially lacked MCP server support. We addressed this by implementing a dedicated FastAPI server in Python to act as a proxy for TypeScript integration.
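From the TypeScript side, the MCP tool only needs to POST queries to that proxy. The sketch below assumes a hypothetical `/rag/query` endpoint and payload shape; the real proxy's routes may differ:

```typescript
// Hypothetical client for the Python FastAPI proxy fronting Nillion RAG.
// Endpoint path and payload shape are illustrative assumptions.

interface RagRequest {
  url: string;
  body: { query: string; top_k: number };
}

// Build the request the TypeScript MCP tool would POST to the proxy.
function buildRagRequest(baseUrl: string, query: string, topK = 5): RagRequest {
  return {
    url: `${baseUrl}/rag/query`,
    body: { query, top_k: topK },
  };
}

async function queryRag(baseUrl: string, query: string): Promise<string> {
  const req = buildRagRequest(baseUrl, query);
  const res = await fetch(req.url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req.body),
  });
  if (!res.ok) throw new Error(`RAG proxy error: ${res.status}`);
  const data = (await res.json()) as { answer: string };
  return data.answer;
}

console.log(buildRagRequest("http://localhost:8000", "what is a TEE?").url);
```

Keeping the proxy behind a plain HTTP contract means the TypeScript tool never needs Python bindings.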
API Key Handling: For ease of demonstration, API keys are currently hardcoded across services. This was a deliberate shortcut to ship the demo quickly; a production deployment would need proper secret management.
Example Use Case
An illustrative query handled by our solution includes:
"Find issues in xx repo and post it in Slack, then react with a thumbs-up."
This demonstrates secure, end-to-end tool orchestration within the LLM workflow: every tool involved runs inside a TEE.
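That query decomposes into a short plan of sequential tool calls. The sketch below uses illustrative stand-in tool names, not the real MCP tools' identifiers:

```typescript
// Sketch of how the example query decomposes into sequential tool calls.
// Tool names and argument shapes are illustrative stand-ins.

type ToolCall = { tool: string; args: Record<string, string> };

// A plan the agent might produce for the query above.
function planQuery(repo: string, channel: string): ToolCall[] {
  return [
    { tool: "github_list_issues", args: { repo } },
    { tool: "slack_post_message", args: { channel, text: `Issues from ${repo}` } },
    { tool: "slack_add_reaction", args: { channel, emoji: "thumbsup" } },
  ];
}

const plan = planQuery("org/example-repo", "#dev");
for (const step of plan) console.log(step.tool);
```

Each step maps to one MCP tool call, so the whole chain stays inside TEE-hosted services.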