
Kosh

Simple and secure way to interact with remote MCP Servers in Trusted Execution Environments (TEEs)

Project Description

🤔 Problems Identified

  • Anthropic's Model Context Protocol (MCP) is fairly new and currently requires manual setup and some familiarity with coding.
  • Since MCP enables users to interact with their private data, it becomes increasingly important to keep the data and execution environment secure.

⚡️ Solution

  • Connect to MCP Servers in just a few clicks, with absolutely no coding required.
    1. Select the servers you need. We currently offer four servers:
      • Brave Search [Add search capabilities to your chatbot]
      • Slack [Interact with Slack]
      • GitHub [Interact with GitHub]
      • RAG (Retrieval-Augmented Generation) capabilities powered by nilRAG. We built a custom nilRAG MCP server that lets users upload any data, store it securely in nilDB, and query it using nilAI.
    2. Provide the API keys required.
    3. Start interacting with the server in real time (a client-side sketch follows this list).
  • All the MCP servers are deployed in a TEE environment powered by Marlin, meaning every interaction is private and secured at the hardware level.
  • We modified the existing community MCP servers to support remote access via Cloudflare's mcp-remote. This removes the need to run the servers locally and lets users reach them from anywhere, rather than requiring them to sit on the same machine as the client.
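
As a sketch of what step 3 looks like from the client side, the snippet below uses the official MCP TypeScript SDK to connect to a remote server over SSE, list its tools, and call one of them. The endpoint URL, tool name, and arguments are placeholders for illustration rather than Kosh's actual deployment.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Hypothetical URL of a Kosh-hosted MCP server running inside a TEE.
const SERVER_URL = new URL("https://kosh-brave-search.example.com/sse");

async function main() {
  const client = new Client(
    { name: "kosh-demo-client", version: "0.1.0" },
    { capabilities: {} }
  );

  // Connect over SSE, the transport used to reach remote MCP servers.
  await client.connect(new SSEClientTransport(SERVER_URL));

  // Discover which tools the remote server exposes.
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map((t) => t.name));

  // Call one of them (tool name and arguments are illustrative).
  const result = await client.callTool({
    name: "brave_web_search",
    arguments: { query: "Trusted Execution Environments" },
  });
  console.log(JSON.stringify(result, null, 2));
}

main().catch(console.error);
```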

How it's Made

In today's LLM workflows, the LLM agent itself can already run fully secured inside a TEE (e.g. Nillion), but the one piece that isn't secure is the LLM tooling.

MCP Protocol Integration

We adopted the MCP Protocol—a recent innovation—to secure LLM tooling. We built a comprehensive framework that allows any MCP tool written in TypeScript to be seamlessly deployed to a TEE. This capability ensures end-to-end security across the entire solution.
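
As a rough illustration, this is what one of these TypeScript MCP tools looks like before being containerized and shipped to a TEE; the tool here is a placeholder built with the official MCP TypeScript SDK, not one of the four servers we actually deployed.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Minimal MCP server exposing a single illustrative tool.
const server = new McpServer({ name: "kosh-example", version: "0.1.0" });

server.tool(
  "echo", // placeholder tool name
  { message: z.string().describe("Text to echo back") },
  async ({ message }) => ({
    content: [{ type: "text", text: `TEE says: ${message}` }],
  })
);

// stdio transport shown for brevity; the deployed servers expose a remote
// endpoint instead and are reached through mcp-remote.
await server.connect(new StdioServerTransport());
```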

Marlin Deployment

To achieve a fully secure deployment, each service is deployed individually to a TEE using Marlin, a system optimized for secure execution environments.

Demo Implementation and Technologies

We've implemented a functional demo website showcasing integrations with four specific tools:

  • GitHub
  • Slack
  • Nillion DB/RAG (Retrieval-Augmented Generation)
  • Brave Search

The MCP servers we've developed are highly versatile and work with Claude, Windsurf, Cursor, and any other client that supports the MCP protocol. Configuration for these integrations is openly available on GitHub.
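
For example, a stdio-only client such as Claude Desktop can reach one of the remote servers through a config entry that launches mcp-remote; the server URL below is a placeholder, and the real entries live in the repository.

```json
{
  "mcpServers": {
    "kosh-brave-search": {
      "command": "npx",
      "args": ["mcp-remote", "https://kosh-brave-search.example.com/sse"]
    }
  }
}
```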

Smart Fixes and Notable Hacks

Cloudflare's mcp-remote: MCP clients have typically needed the server (or a local proxy for it) running on the same machine. Cloudflare's recently released mcp-remote lets those clients connect to a remote MCP server instead. Despite its novelty (it was only a few days old at implementation time), we adopted it successfully, following a guide that was itself just one day old.

Custom Dockerfiles: Given how new the mcp-remote feature was, we wrote custom Dockerfiles and ported several tool-specific functions to ensure compatibility.

SSL Encryption Workaround: We encountered SSL encryption issues when deploying to Marlin, and addressed them by using Myzork to bypass the SSL complications.

Python Integration via FastAPI: Nillion RAG (nilRAG) initially lacked MCP server support, so we implemented a dedicated FastAPI server in Python that acts as a proxy between nilRAG and our TypeScript MCP server.
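
On the TypeScript side this is just another MCP tool that forwards queries to the FastAPI proxy over HTTP. The sketch below makes several assumptions for illustration (the /query endpoint, the payload shape, and the nilrag_query tool name) and is not the exact nilRAG API.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Assumed base URL of the Python FastAPI proxy sitting in front of nilRAG.
const NILRAG_PROXY_URL = process.env.NILRAG_PROXY_URL ?? "http://localhost:8000";

export function registerNilRagTool(server: McpServer) {
  server.tool(
    "nilrag_query", // assumed tool name
    { query: z.string().describe("Question to ask over the uploaded data") },
    async ({ query }) => {
      // Forward the query to the FastAPI proxy, which talks to nilDB/nilAI.
      const res = await fetch(`${NILRAG_PROXY_URL}/query`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ query }),
      });
      const answer = await res.json();
      return { content: [{ type: "text", text: JSON.stringify(answer) }] };
    }
  );
}
```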

API Key Handling: Currently, for ease of demonstration and interaction, we're using hardcoded API keys across services. This was a practical decision to quickly demonstrate capability during development.

Example Use Case

One illustrative query handled by our solution:

"Find issues in xx repo and post it in Slack, then react with a thumbs-up."

This demonstrates the seamless integration and secure orchestration of multiple tools within a single LLM workflow.
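
In MCP terms, that query reduces to a short chain of tool calls against the GitHub and Slack servers. The sketch below is an assumption-laden illustration: the tool names and argument shapes are modelled on the community GitHub and Slack MCP servers rather than an exact trace of our deployment.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `github` and `slack` are MCP clients already connected to the remote
// servers (see the earlier client sketch).
async function findIssuesAndNotify(github: Client, slack: Client) {
  // 1. Fetch open issues from the repository (placeholder owner/repo).
  const issues = await github.callTool({
    name: "list_issues", // assumed tool name
    arguments: { owner: "some-org", repo: "some-repo", state: "open" },
  });

  // 2. Post the issue list to a Slack channel (placeholder channel ID).
  const posted = await slack.callTool({
    name: "slack_post_message", // assumed tool name
    arguments: {
      channel_id: "C0123456789",
      text: `Open issues:\n${JSON.stringify(issues, null, 2)}`,
    },
  });

  // 3. React to the posted message with a thumbs-up. How the message
  //    timestamp comes back is server-specific; a JSON text block is assumed.
  const postedText = (posted as unknown as { content: { text: string }[] }).content[0].text;
  const ts = JSON.parse(postedText).ts as string;
  await slack.callTool({
    name: "slack_add_reaction", // assumed tool name
    arguments: { channel_id: "C0123456789", timestamp: ts, name: "thumbsup" },
  });
}
```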
