
YAAA-WHO

A Chrome extension with a decentralized AI agent that scrapes, understands and answers questions from your personalised websites, bookmarks and reading lists.


Created At

Agentic Ethereum

Project Description

Build your personal search engine, and control the reliability and authenticity of its data by choosing the specific sources it draws from.

Use Cases & Advantages:

  • No longer any need to read through complex technical documents yourself: let AI read them, digest them and give you a summary or step-by-step instructions instantly.
  • No longer any need to store all those never-to-be-revisited bookmarks and reading lists that rarely receive a second glance. Have AI read, categorise and store them on your behalf for future reference.
  • Got a new idea or found something interesting? Mark all the relevant data sources and let AI save them forever, with quick and easy retrieval that doesn't even need a search button.

Yet-Another-AI-Agent (YAAA) is a Chrome Extension with multiple features:

  • With a single click of a button, scrape any website, blog, document or wiki.
  • Build one or more custom RAG models, categorised based on your preference: say, one each for Home, Vacation, Coding, Kids, Pets etc.
  • Any agent (Agent WHO) can then be granted access to the RAG data toolset to answer the user's questions from their specific RAG datasets.
  • As a future enhancement, agents could also analyse these personalised RAG themes and patterns to suggest similar new events, products, news etc.
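The per-category RAG idea above can be sketched as follows. This is a minimal illustration: the class, the category names and the keyword-overlap scoring are stand-ins for real embedding-based retrieval, not the project's actual code.

```python
from dataclasses import dataclass, field

@dataclass
class RagCollection:
    """One themed RAG dataset, e.g. "Home" or "Coding"."""
    name: str
    documents: list = field(default_factory=list)

    def add(self, text: str) -> None:
        self.documents.append(text)

    def search(self, question: str) -> list:
        """Naive keyword-overlap retrieval (stands in for vector search)."""
        terms = set(question.lower().split())
        scored = [(len(terms & set(d.lower().split())), d) for d in self.documents]
        return [d for score, d in sorted(scored, reverse=True) if score > 0]

# One collection per user-chosen category.
collections = {name: RagCollection(name) for name in ("Home", "Vacation", "Coding")}
collections["Coding"].add("Ollama runs large language models locally")
collections["Home"].add("Smart thermostat installation guide")

# An agent answering a "Coding" question only consults that dataset.
hits = collections["Coding"].search("how to run models locally with ollama")
```

Because each category is its own collection, an agent granted the "Coding" toolset never sees "Home" documents, which is the access-scoping the bullet list describes.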

How it's Made

YAAA-WHO is built using multiple components.

Step 1: The Chrome extension is vanilla HTML, CSS and JS, integrated with APIs to interact with the agents and vector DBs. With a click of a button, the extension captures the current tab's URL and issues a command to the backend to trigger scraping and building the database.
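The backend's side of that handoff might look like the sketch below; `parse_scrape_command` and the JSON payload shape are assumptions for illustration, not the project's actual API.

```python
import json
from urllib.parse import urlparse

def parse_scrape_command(body: str) -> str:
    """Validate the JSON payload sent by the extension and return the URL to scrape.

    The extension is assumed to POST a body like {"url": "<current tab URL>"}.
    """
    payload = json.loads(body)
    url = payload.get("url", "")
    parsed = urlparse(url)
    # Only accept real web pages; reject javascript:, ftp:, chrome:// etc.
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"refusing to scrape invalid URL: {url!r}")
    return url

# What the backend would see when the user clicks the extension button:
url = parse_scrape_command('{"url": "https://example.com/docs"}')
```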

Step 2: Set up the vector DB with Postgres, backed by a local ChromaDB instance. This stores and retrieves vector data and assists the agents in processing user queries.
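A toy, in-memory version of what the vector store does (cosine-similarity retrieval over stored embeddings) can clarify its role; in the real setup Postgres or ChromaDB holds the vectors and an embedding model produces them.

```python
import math

class VectorStore:
    """Toy in-memory vector store; Postgres or ChromaDB replaces this in practice."""

    def __init__(self) -> None:
        self._rows: list[tuple[list[float], str]] = []

    def add(self, embedding: list[float], text: str) -> None:
        self._rows.append((embedding, text))

    def query(self, embedding: list[float], k: int = 1) -> list[str]:
        """Return the k stored texts most similar to the query embedding."""
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
            return dot / norm if norm else 0.0

        ranked = sorted(self._rows, key=lambda row: cosine(row[0], embedding), reverse=True)
        return [text for _, text in ranked[:k]]

store = VectorStore()
store.add([1.0, 0.0], "chunk about agents")
store.add([0.0, 1.0], "chunk about wallets")
```

When an agent processes a user query, it embeds the question and asks the store for the nearest chunks, which then become the context for its answer.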

Step 3: The AI agent was originally planned to be deployed on Autonome, but due to technical issues and limitations with deploying the agent there, we shifted to the Agno (previously PhiData) framework and a local Ollama agent instance. The agent takes the URL input from the extension, scrapes the website, splits the content into vector data and stores it in the vector database.
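"Splits it into vector data" typically means chunking the scraped page text, with some overlap between chunks, before embedding each chunk. A minimal sketch (the chunk size and overlap values are illustrative defaults, not the project's actual settings):

```python
def split_text(text: str, chunk_size: int = 200, overlap: int = 40) -> list[str]:
    """Split scraped page text into overlapping chunks ready for embedding.

    Overlap keeps sentences that straddle a boundary represented in both
    neighbouring chunks, so retrieval doesn't miss them.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, max(len(text), 1), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# Each resulting chunk is embedded and written to the vector DB from Step 2.
```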

Step 4: As the last step, to ensure that the user's data is always available, secure and private, we integrated Nillion to store all of the user's personal bookmarks and URL links privately and in a decentralised manner, so that they can be retrieved by any RAG and agent models in future. These URLs can be accessed only by the associated user's wallet ID.
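The access model described here, bookmarks retrievable only by the owning wallet ID, can be illustrated with a plain in-memory stand-in. This is not the Nillion API, only a sketch of the intended behaviour; the class and wallet IDs are hypothetical.

```python
class PrivateBookmarkStore:
    """Illustrative stand-in for the Nillion integration: each bookmark is
    stored under a wallet ID and can only be read back with that same ID."""

    def __init__(self) -> None:
        self._by_wallet: dict[str, list[str]] = {}

    def save(self, wallet_id: str, url: str) -> None:
        self._by_wallet.setdefault(wallet_id, []).append(url)

    def fetch(self, wallet_id: str) -> list[str]:
        if wallet_id not in self._by_wallet:
            raise PermissionError("no bookmarks stored for this wallet ID")
        return list(self._by_wallet[wallet_id])

bookmarks = PrivateBookmarkStore()
bookmarks.save("0xA11CE", "https://example.com/article")  # hypothetical wallet ID
```

In the real system the decentralised store, rather than this class, enforces the check, but the contract is the same: a future RAG or agent model presents the user's wallet ID to recover their saved URLs.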

Future steps: We can also integrate a way for the agents to self-pay for their RAG storage space, depending on usage, using the associated wallet.
