Decentralised AI agent framework on top of Fluence and Filecoin
FluenceAI replaces the traditional cloud stack for building AI agents with decentralized alternatives. Using Fluence for orchestration, agents can perform tasks such as making RPC requests, retrieving files from Filecoin/IPFS, and working with open datasets stored on Filecoin. The framework also lays the groundwork for future enhancements, including running LLMs on CPU via libraries such as llama-rs and building robust AI data pipelines.
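As a sketch of what one of these agent tasks can look like at the service level, the following Marine module wraps a mounted `ipfs` binary to fetch a file by CID, which is the standard way for a Wasm module to reach the network on Fluence. The function name `get_file`, the `/tmp` output path, and the multiaddr parameter are illustrative assumptions, not code taken from the project.

```rust
use marine_rs_sdk::{marine, module_manifest, MountedBinaryResult};

module_manifest!();

pub fn main() {}

// The host peer exposes the `ipfs` CLI to this module as a mounted binary;
// the Wasm module itself cannot open sockets, so all network access goes
// through host imports like this one.
#[marine]
#[link(wasm_import_module = "host")]
extern "C" {
    fn ipfs(cmd: Vec<String>) -> MountedBinaryResult;
}

/// Download a file by CID from an IPFS node and return the local path it was
/// written to, or the error output on failure. (Illustrative sketch.)
#[marine]
pub fn get_file(cid: String, api_multiaddr: String) -> String {
    let output_path = format!("/tmp/{}", cid);
    let result = ipfs(vec![
        "--api".to_string(),
        api_multiaddr,
        "get".to_string(),
        cid,
        "-o".to_string(),
        output_path.clone(),
    ]);
    if result.ret_code == 0 {
        output_path
    } else {
        String::from_utf8_lossy(&result.stderr).to_string()
    }
}
```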
To build this project, I started by creating a Fluence Marine service that proxies requests to the OpenAI API. Using Aqua, I orchestrated these requests to implement techniques such as self-consistency, where the same prompt is sampled several times and the most frequent answer is kept, when interacting with large language models (LLMs).
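The proxy service itself is not shown in this write-up, but a minimal Marine module along these lines could look as follows, assuming the host mounts a `curl` binary for outbound HTTPS. The function names `chat_completion` and `majority_vote` are illustrative; in the project the self-consistency fan-out is driven from Aqua, and the vote below is just a simplified stand-in for that aggregation step.

```rust
use std::collections::HashMap;

use marine_rs_sdk::{marine, module_manifest, MountedBinaryResult};

module_manifest!();

pub fn main() {}

// HTTPS calls leave the Wasm sandbox through a mounted `curl` binary
// provided by the host peer.
#[marine]
#[link(wasm_import_module = "host")]
extern "C" {
    fn curl(cmd: Vec<String>) -> MountedBinaryResult;
}

/// Forward a chat request body to the OpenAI chat completions endpoint and
/// return the raw JSON response. (Illustrative sketch.)
#[marine]
pub fn chat_completion(api_key: String, body: String) -> String {
    let response = curl(vec![
        "-s".to_string(),
        "-X".to_string(),
        "POST".to_string(),
        "https://api.openai.com/v1/chat/completions".to_string(),
        "-H".to_string(),
        format!("Authorization: Bearer {}", api_key),
        "-H".to_string(),
        "Content-Type: application/json".to_string(),
        "-d".to_string(),
        body,
    ]);
    String::from_utf8_lossy(&response.stdout).to_string()
}

/// Self-consistency helper: given several sampled answers to the same prompt,
/// return the most frequent one (majority vote).
#[marine]
pub fn majority_vote(answers: Vec<String>) -> String {
    let mut counts: HashMap<String, usize> = HashMap::new();
    for answer in &answers {
        *counts.entry(answer.clone()).or_insert(0) += 1;
    }
    counts
        .into_iter()
        .max_by_key(|(_, count)| *count)
        .map(|(answer, _)| answer)
        .unwrap_or_default()
}
```

From Aqua, the proxy can then be called several times, on one or more peers, and the collected answers reduced with a vote like the one above.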
Next, I developed a gateway that sits in front of these Marine services and gives users a single entry point for sending requests. The final step was a simple OpenAI-like chat playground for experimenting and interacting with the AI agents.
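The gateway and playground code are not included here. Purely to illustrate the shape of such a gateway, the sketch below exposes a single chat endpoint and stubs out the forwarding to the Fluence network; the use of axum, the `/chat` route, and the request/response field names are assumptions for this example, and the actual gateway may well be built on the Fluence JS client instead.

```rust
use axum::{routing::post, Json, Router};
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct ChatRequest {
    prompt: String,
}

#[derive(Serialize)]
struct ChatResponse {
    answer: String,
}

/// Handle a chat request from the playground. In a real gateway this is
/// where the prompt would be forwarded to the Marine services over the
/// Fluence network (e.g. by invoking compiled Aqua functions); here the
/// forwarding is stubbed so the sketch stays self-contained.
async fn chat(Json(req): Json<ChatRequest>) -> Json<ChatResponse> {
    let answer = format!("echo: {}", req.prompt);
    Json(ChatResponse { answer })
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/chat", post(chat));
    let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```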
Additionally, I explored using llama-rs to build a Fluence service for CPU-based inference; this is still a work in progress. I also attempted to embed a Python interpreter in a Marine service so that developers can execute Python code and build advanced data-analysis tools on top of the agents. This feature is not yet complete, but it shows how the framework's capabilities can be extended.
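For the CPU-inference experiment, a plausible Marine-facing interface might look like the sketch below. The names `complete` and `InferenceResult` and their parameters are hypothetical, and the llama-rs call itself is left as a placeholder, since wiring the inference backend into the Wasm module is exactly the part that is still in progress.

```rust
use marine_rs_sdk::{marine, module_manifest};

module_manifest!();

pub fn main() {}

#[marine]
pub struct InferenceResult {
    pub text: String,
    pub error: String,
}

/// Intended interface for CPU inference inside a Marine service: load a
/// GGML-format model from a path mounted into the module's file system and
/// generate up to `max_tokens` tokens for `prompt` via llama-rs. The body is
/// a placeholder until that integration is finished.
#[marine]
pub fn complete(model_path: String, prompt: String, max_tokens: u32) -> InferenceResult {
    // TODO: load the model with llama-rs and stream tokens until
    // `max_tokens` is reached or an end-of-text token is produced.
    let _ = (model_path, prompt, max_tokens);
    InferenceResult {
        text: String::new(),
        error: "llama-rs backend not wired up yet".to_string(),
    }
}
```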