A privacy-preserving, decentralised, community-federated AI inference service leveraging p2p and programmable cryptography (progcrypto)
This project uses Waku for peer-to-peer communication between inference providers and inference users, thus protecting users' privacy.
The system architecture consists of three primary components: client nodes, AI inference nodes, and content topics. Client nodes generate and publish encrypted prompts to specific content topics. AI inference nodes subscribe to these topics and process the prompts using specified language models. The response delivery mechanism ensures that clients can verify the authenticity and quality of responses without compromising their anonymity.
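As a rough illustration of this architecture, here is a minimal client-side sketch assuming the js-waku light-node API from @waku/sdk (function names such as createLightNode, createEncoder and lightPush may differ between versions), with illustrative content-topic names rather than the project's final naming scheme:

```ts
// Minimal sketch of a client node, assuming the @waku/sdk light-node API.
import { createLightNode, createEncoder, createDecoder } from "@waku/sdk";

// Hypothetical content-topic layout: one prompt topic per model,
// one response topic per client session.
const PROMPT_TOPIC = "/ai-inference/1/prompts-llama3/proto";
const responseTopic = (sessionId: string) =>
  `/ai-inference/1/responses-${sessionId}/proto`;

async function publishPrompt(encryptedPrompt: Uint8Array, sessionId: string) {
  // Start a light node that relies on default bootstrap peers.
  const node = await createLightNode({ defaultBootstrap: true });
  await node.start();

  // Publish the already-encrypted prompt to the model's prompt topic.
  const encoder = createEncoder({ contentTopic: PROMPT_TOPIC });
  await node.lightPush.send(encoder, { payload: encryptedPrompt });

  // Subscribe to signed responses addressed to this session.
  const decoder = createDecoder(responseTopic(sessionId));
  await node.filter.subscribe([decoder], (msg) => {
    if (msg.payload) handleResponse(msg.payload);
  });
}

function handleResponse(payload: Uint8Array) {
  // Verify the inference node's signature and decrypt with the session key.
  console.log("received", payload.length, "bytes");
}
```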
The communication flow begins with the client generating a unique session key and encrypting the prompt with session-specific parameters. The encrypted prompt is published to the relevant content topics, where multiple AI nodes process it independently. Responses are signed by the nodes and published to response topics, allowing clients to verify node signatures and reputation while maintaining privacy.
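A sketch of the per-session crypto primitives behind this flow, using Node's built-in crypto module; the key sizes, AES-GCM/Ed25519 choices and function names here are assumptions for illustration, not the project's actual wire format:

```ts
import {
  randomBytes,
  createCipheriv,
  generateKeyPairSync,
  sign,
  verify,
} from "crypto";

// Client side: derive a fresh session key and encrypt the prompt with AES-256-GCM.
function encryptPrompt(prompt: string) {
  const sessionKey = randomBytes(32); // unique per session
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", sessionKey, iv);
  const ciphertext = Buffer.concat([cipher.update(prompt, "utf8"), cipher.final()]);
  return { sessionKey, iv, ciphertext, tag: cipher.getAuthTag() };
}

// Inference-node side: sign the response so clients can check node identity
// against its advertised public key / reputation record.
const nodeKeys = generateKeyPairSync("ed25519");

function signResponse(response: Buffer) {
  return { response, signature: sign(null, response, nodeKeys.privateKey) };
}

// Client side: verify the node's signature before trusting the response.
function verifyResponse(
  response: Buffer,
  signature: Buffer,
  nodePublicKey: typeof nodeKeys.publicKey
): boolean {
  return verify(null, response, nodePublicKey, signature);
}
```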
The client is an Electron-based desktop app bundled with nwaku. The project uses experimental functional encryption (FE) to protect against logging of PII contained in users' prompts. The hacky part of the project is using Ollama on the inference node instead of complicating the process of running an AI model locally.
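As a sketch of the Ollama integration, an inference node could hand the decrypted prompt to a local Ollama instance over its default HTTP endpoint; the model name below is illustrative and assumes the model has already been pulled locally:

```ts
// Assumes a local Ollama server on its default port (11434).
async function runInference(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.response; // generated text for non-streaming requests
}
```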