Dromeus

Decentralized, private federated ML training marketplace over Gensyn AXL's encrypted P2P mesh.

Created At: Open Agents

Winner of: Gensyn - Best Application of Agent eXchange Layer (AXL), 1st place

Project Description

DROMEUS is a decentralized compute marketplace for privacy-preserving, federated machine learning. In traditional ML workflows, training data must be centralized on a single server or cloud provider, creating significant privacy risks and data-silo bottlenecks. DROMEUS bypasses this client-server architecture entirely: all node-to-node communication runs exclusively over Gensyn's Agent eXchange Layer (AXL), so the model weight deltas transmitted across the network cannot be intercepted, decrypted, or read by intermediary nodes, ISPs, or central authorities.

How it's Made

  • The ML Application Stack: Dromeus uses a Python backend built on PyTorch for Federated Averaging (FedAvg). A Coordinator daemon orchestrates training jobs, while distributed Worker daemons execute user-provided PyTorch scripts as sandboxed subprocesses. The user-facing dashboard is built in Next.js 16 and uses Server-Sent Events (SSE) to stream live loss/accuracy charts.
  • Partner Tech (Gensyn AXL): All networking runs on Gensyn’s Agent eXchange Layer (AXL). Both Coordinator and Worker nodes run an AXL Go binary locally; our Python apps never open external sockets and talk only to their local AXL HTTP bridge.
  • Encrypted P2P Privacy: Because of Gensyn AXL, we gained out-of-the-box, end-to-end (Layer 2) payload encryption over the Yggdrasil IPv6 mesh network. This ensures that the federated model weights passing between peers remain entirely private and cannot be intercepted by network intermediaries.
  • Trustless Web3 Monetization: The Coordinator programmatically triggers USDC micropayments on the Sepolia network. The worker node independently verifies the on-chain transaction hash before it ever begins expending compute power on a training round.
  • Agentic Capabilities: To align with OpenAgents, we integrated a native MCP (Model Context Protocol) server. This allows AI agents to query the network, discover online workers, and autonomously submit/pay for ML training jobs via the A2A protocol.
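The FedAvg aggregation step from the first bullet can be sketched framework-agnostically. In the minimal sketch below, plain Python lists of floats stand in for PyTorch state-dict tensors, and weighting each worker's contribution by its local sample count is an assumption about the implementation, not a detail taken from the Dromeus codebase:

```python
def fedavg(updates, sample_counts):
    """Weighted-average worker state dicts (the FedAvg step).

    updates: list of {param_name: [float, ...]} state dicts, one per worker.
    sample_counts: local training samples per worker; a worker that trained
    on more data contributes proportionally more to the global model.
    """
    total = sum(sample_counts)
    global_state = {}
    for name in updates[0]:
        # Weighted sum of each worker's copy of this parameter vector.
        global_state[name] = [
            sum(u[name][i] * n / total for u, n in zip(updates, sample_counts))
            for i in range(len(updates[0][name]))
        ]
    return global_state

# Two workers report diverging weights; the worker with 3x the data dominates.
merged = fedavg(
    [{"w": [1.0, 2.0]}, {"w": [3.0, 4.0]}],
    sample_counts=[1, 3],
)
print(merged)  # {'w': [2.5, 3.5]}
```

In the real pipeline the same averaging would run over `model.state_dict()` tensors after each round, with only these deltas (never raw data) crossing the AXL mesh.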
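The SSE metrics stream feeding the dashboard boils down to a simple wire format. Here is a hedged sketch of how a Coordinator might serialize one training metric as an SSE frame; the event name and payload fields are illustrative, not taken from the actual codebase:

```python
import json

def sse_frame(event, payload):
    """Serialize one Server-Sent Events frame.

    Per the SSE wire format, a frame is an `event:` line plus `data:`
    lines, terminated by a blank line; the browser's EventSource API
    parses these incrementally from a long-lived HTTP response.
    """
    return f"event: {event}\ndata: {json.dumps(payload)}\n\n"

frame = sse_frame("metrics", {"round": 3, "loss": 0.42, "accuracy": 0.91})
print(frame)
```

The Next.js dashboard would then subscribe with `new EventSource(...)` and update the loss/accuracy charts on each `metrics` event.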
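The Worker-side payment check in the monetization bullet is, at its core, receipt validation before any compute is spent. A minimal sketch of that logic follows; the simplified receipt dict, field names, and confirmation-depth check are assumptions (in practice the receipt would be fetched from a Sepolia RPC node, e.g. via web3.py's `eth.get_transaction_receipt`, and the USDC amount decoded from the ERC-20 Transfer log):

```python
def verify_payment(receipt, expected_to, expected_amount,
                   min_confirmations, current_block):
    """Decide whether a training round has been paid for.

    receipt: simplified dict standing in for a decoded EVM receipt.
    expected_amount: USDC base units (USDC uses 6 decimals on-chain).
    """
    if receipt["status"] != 1:                    # transaction reverted
        return False
    if receipt["to"].lower() != expected_to.lower():
        return False                              # paid to someone else
    if receipt["amount"] < expected_amount:
        return False                              # underpaid
    # Require the payment to be buried under a few blocks before trusting it.
    confirmations = current_block - receipt["block_number"]
    return confirmations >= min_confirmations

paid = verify_payment(
    {"status": 1, "to": "0xWorkerAddr", "amount": 5_000_000, "block_number": 100},
    expected_to="0xworkeraddr",
    expected_amount=5_000_000,    # 5 USDC at 6 decimals
    min_confirmations=2,
    current_block=103,
)
print(paid)  # True
```

Only after this check passes would the Worker launch the sandboxed PyTorch subprocess for the round.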
