Prophet

Multi Agent System that analyzes DAO proposals and votes on your behalf.

Created At

ETHGlobal New Delhi

Project Description

The project is a one-stop solution to the DAO participation problem: the sheer volume of data, the speed at which the space moves, and the complexity of proposals effectively centralize power in the hands of a few, defeating a DAO's purpose.

Through a combination of four on-chain uAgents built on top of ASI that expose a single endpoint our browser-based frontend can query, the project lets users analyze DAO proposals and recommends voting decisions based on both on-chain and off-chain factors. It clearly surfaces risk factors and statistics, and takes in a user-defined constitution to tailor its recommendations.

To ensure complete clarity, it lets users chat with the agents about each proposal. It combines forum discussions, historically similar proposals, statistical data, and on-chain data sourced through Subgraphs and our own deployed Substreams via The Graph, the Dune API, Etherscan, and DefiLlama, so the freshest and most comprehensive data informs the analysis. Data is streamed to our database using a Substreams SQL sink we developed.

To ensure complete transparency, it uses Hedera to log all agent reasoning and discussions on chain. Our project aims to ensure that all stakeholders can participate in deciding a DAO's future, enabling more complete decentralization.

We have also made it possible for the end user to delegate to the agent so it can vote on the user's behalf.

On a side note: we built a Substream for Uniswap governance data on The Graph and published it as well: https://substreams.dev/packages/uniswap-governance/v0.1.0

How it's Made

Technologies used:

  • Programming languages: Python, Rust, TypeScript
  • Frameworks: Flask, React
  • Web3 technologies: Hedera Consensus Service
  • Database: Postgres
  • Frontend: React + Tailwind CSS, Axios
  • Agents: uAgents, ASI:One, Gemini LLM, Flask, agent-to-agent communication

Data pipeline:

  • The Graph Subgraphs
  • Dune API

How the data works: we built a custom Substream [https://substreams.dev/packages/uniswap-governance/v0.1.0] that streams Uniswap governance data (proposals and votes). We also built a sink that streams this data into our Postgres server, which our backend then queries for proposal data.
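As a sketch of what the sink does before insertion, a normalized proposal row might look like this (the field names are illustrative, not our actual sink schema):

```python
from dataclasses import dataclass

@dataclass
class ProposalRow:
    """One governance proposal as stored by the SQL sink (hypothetical schema)."""
    proposal_id: int
    proposer: str
    description: str
    for_votes: int
    against_votes: int

def normalize_row(raw: dict) -> ProposalRow:
    """Coerce a raw substream entity (string-typed fields) into typed columns."""
    return ProposalRow(
        proposal_id=int(raw["id"]),
        proposer=raw["proposer"].lower(),  # checksummed address -> lowercase for joins
        description=raw.get("description", ""),
        for_votes=int(raw.get("for_votes", 0)),
        against_votes=int(raw.get("against_votes", 0)),
    )
```

Substream entities arrive with numeric fields as strings, so normalizing once at the sink keeps the backend queries simple.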

We take the proposal's description, extract the governance forum link from it, append .json to the URL, and pass the result to Gemini, filtering out the required fields (such as each comment and how many likes it received); this is used for sentiment analysis.

We also have a pipeline to fetch DAO metrics such as TVL, treasury balance, and others.
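The TVL leg of that pipeline could be sketched like this; the DefiLlama /protocol endpoint and its totalLiquidityUSD field reflect the public API as we understand it, and the reduction to a metrics dict is illustrative:

```python
import json
import urllib.request

LLAMA_PROTOCOL_URL = "https://api.llama.fi/protocol/{slug}"  # public DefiLlama endpoint

def parse_tvl(payload: dict) -> float:
    """Pull the latest total TVL out of a DefiLlama /protocol response.
    The 'tvl' field is a time series of {date, totalLiquidityUSD} points."""
    series = payload.get("tvl", [])
    return float(series[-1]["totalLiquidityUSD"]) if series else 0.0

def fetch_dao_metrics(slug: str = "uniswap") -> dict:
    """Network call (not exercised here): fetch and reduce to the metrics we store."""
    with urllib.request.urlopen(LLAMA_PROTOCOL_URL.format(slug=slug)) as resp:
        payload = json.load(resp)
    return {"dao": slug, "tvl_usd": parse_tvl(payload)}
```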

The frontend asks the agent to analyze a proposal by ID; the agent calls our backend, gets the DAO metrics and proposal-specific data (on-chain data from our database plus off-chain data), and then the agent pipeline takes over.

We used a script to populate a vector database by embedding all the executed proposals for a DAO (in our case, Uniswap).
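The retrieval side of that store can be illustrated with plain cosine similarity over stored embeddings (a stand-in for whatever index the vector database actually uses):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k_similar(query: list[float],
                  store: dict[int, list[float]],
                  k: int = 3) -> list[tuple[int, float]]:
    """Rank previously embedded (executed) proposals by similarity to the
    query embedding; the archivist feeds the top hits back as context."""
    scored = [(pid, cosine(query, vec)) for pid, vec in store.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]
```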

Each agent step is logged to Hedera using the Hedera Consensus Service (HCS).
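A sketch of what one logged entry might look like before submission; the JSON schema here is illustrative, and the actual submission to an HCS topic via the Hedera SDK's TopicMessageSubmitTransaction is not shown:

```python
import json
import time

def hcs_log_entry(agent: str, step: str, detail: str) -> bytes:
    """Serialize one agent step as the message body submitted to an HCS topic.
    HCS messages are opaque, size-limited bytes, so entries are kept compact;
    the SDK handles chunking for larger payloads."""
    entry = {
        "agent": agent,          # e.g. "archivist"
        "step": step,            # e.g. "embedding_lookup"
        "detail": detail[:512],  # truncate to keep single messages small
        "ts": int(time.time()),
    }
    return json.dumps(entry, separators=(",", ":")).encode()
```

Because every step lands on a public topic, anyone can replay the agents' reasoning trail after the fact.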

The concierge agent hits the backend with the proposal ID and gets the data back. The concierge passes this data to the archivist agent, which uses the Gemini embedding model to convert it into embeddings and compare them against the previous embeddings in the vector database; the resulting context is then sent back to the concierge.

Next, the concierge passes the context, proposal data, and DAO metrics to the analyst agent, which performs a neutral analysis using ASI and returns it to the concierge.

Finally, the concierge sends the analysis and the user's constitution to the strategist agent, which produces a final recommendation tailored to the user's persona and returns it to the concierge.

The concierge then returns the final recommendation, analysis, and strategy to the frontend.
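The whole round trip described above can be sketched with stubs standing in for the real uAgents (the bodies are placeholders, not the actual ASI or Gemini calls):

```python
# Stub agent steps mirroring the write-up's flow:
# backend data -> archivist -> analyst -> strategist -> concierge response.

def archivist(proposal: dict) -> str:
    """Return retrieval context from similar past proposals (stubbed)."""
    return f"context for proposal {proposal['id']}"

def analyst(context: str, proposal: dict, metrics: dict) -> str:
    """Neutral analysis over context, proposal data, and DAO metrics (stubbed)."""
    return f"neutral analysis using {context}, tvl={metrics['tvl_usd']}"

def strategist(analysis: str, constitution: str) -> str:
    """Final recommendation tailored to the user's constitution (stubbed)."""
    return f"recommendation tailored to '{constitution}': {analysis}"

def concierge(proposal: dict, metrics: dict, constitution: str) -> dict:
    """Concierge orchestration of the three downstream agents."""
    context = archivist(proposal)
    analysis = analyst(context, proposal, metrics)
    recommendation = strategist(analysis, constitution)
    return {"analysis": analysis, "recommendation": recommendation}
```

In the real system each call crosses an agent-to-agent message boundary and is logged to HCS; the function calls here just make the data flow explicit.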
