TARS: A decentralized network of AI agents that capture, verify, and act on social/environmental issues using EigenLayer AVS. From Ray-Ban Meta glasses to DAO proposals, creating a trustless pipeline for social impact initiatives.
TARS (Transformative Action Recognition System) revolutionizes how we address social and environmental issues by combining wearable tech, AI agents, and blockchain governance.
The problem: While people encounter numerous social issues daily, there's no automated, trustworthy system to document, verify, and act on these observations. Existing solutions are centralized, rely on manual reporting, or lack verification.
TARS solves this through a four-layer system:
- Capture: Ray-Ban Meta smart glasses record media of real-world issues.
- Verification: a custom EigenLayer AVS verifies image authenticity, with media stored on IPFS via Pinata.
- Assessment: Claude Vision and a network of TypeScript agents score each initiative's impact using weather, news, and location data.
- Governance: Arbitrum-based DAO smart contracts handle community voting and fund allocation.
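Under illustrative names (none of these are the actual TARS module APIs), the four layers can be sketched as a typed pipeline where unverifiable media never reaches governance:

```typescript
// Hypothetical sketch of the four-layer flow; types and names are illustrative.
interface CapturedMedia { imageBytes: string; takenAt: string; gps?: [number, number] }
interface VerifiedMedia extends CapturedMedia { ipfsCid: string; authentic: boolean }
interface Assessment { cid: string; impactScore: number; summary: string }
interface Proposal { cid: string; impactScore: number; onChain: boolean }

type Capture = () => CapturedMedia;
type Verify = (m: CapturedMedia) => VerifiedMedia;
type Assess = (m: VerifiedMedia) => Assessment;
type Govern = (a: Assessment) => Proposal;

// Compose the layers; each stage only runs if the previous one succeeds.
function runPipeline(capture: Capture, verify: Verify, assess: Assess, govern: Govern): Proposal | null {
  const media = capture();
  const verified = verify(media);
  if (!verified.authentic) return null; // unverified media is dropped before the DAO
  return govern(assess(verified));
}
```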
Key Features:
- Trustless media verification through a custom EigenLayer AVS
- Multi-gateway IPFS storage via Pinata for data integrity
- AI impact scoring that combines Claude Vision analysis with weather, news, and location context
- On-chain community voting and fund allocation on Arbitrum
- Dynamic Wallet integration for human accessibility
Future Vision: TARS aims to create a global network of AI agents and wearable devices that continuously monitor and address social/environmental issues, making social impact initiatives more efficient, transparent, and actionable.
I built TARS as a bridge between real-world social initiatives and Web3 governance, starting with the challenge of verifying media from Ray-Ban Meta glasses. My core innovation is a custom AVS (Actively Validated Service) on EigenLayer that verifies image authenticity, paired with Claude Vision for AI analysis. This combination allows me to reliably process media from capture to verification while maintaining data integrity through a multi-gateway IPFS storage system using Pinata.
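The authenticity check can be illustrated with a small sketch. This is not the real EigenLayer operator code; it only shows the kind of attestation an AVS operator might make by re-hashing the stored bytes and sanity-checking the extracted EXIF fields. The `Submission` shape and `operatorAttests` helper are hypothetical:

```typescript
import { createHash } from "crypto";

// Hypothetical submission shape: raw bytes, the hash the uploader claimed,
// and a few EXIF fields pulled out during metadata extraction.
interface Submission {
  bytes: Buffer;
  claimedSha256: string;
  exif: { Make?: string; DateTimeOriginal?: Date };
}

function operatorAttests(s: Submission): boolean {
  // Re-hash the stored bytes and compare against the claimed digest.
  const digest = createHash("sha256").update(s.bytes).digest("hex");
  if (digest !== s.claimedSha256) return false;   // bytes were altered in transit
  if (!s.exif.Make) return false;                 // no camera metadata at all
  if (s.exif.DateTimeOriginal && s.exif.DateTimeOriginal.getTime() > Date.now()) {
    return false;                                 // capture timestamp from the future
  }
  return true;
}
```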
The trickiest part was building the impact assessment system that powers the DAO. I developed a network of TypeScript agents using the ELIZA framework to gather context from multiple sources—weather data, local news, and location information—to score social initiatives. These agents feed into Arbitrum-based DAO smart contracts, which handle community voting and fund allocation.
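The multi-source scoring idea can be sketched as a weighted aggregate. The signal names, weights, and 0-100 scale here are illustrative assumptions, not the production ELIZA agent configuration:

```typescript
// Each agent contributes a severity estimate in [0, 1] for its data source.
interface ContextSignal { source: "weather" | "news" | "location"; severity: number }

// Aggregate the signals into a 0-100 impact score via a weighted average;
// sources that reported nothing simply don't contribute to the denominator.
function impactScore(signals: ContextSignal[]): number {
  const weights: Record<ContextSignal["source"], number> = { weather: 0.2, news: 0.5, location: 0.3 };
  let total = 0;
  let weightSum = 0;
  for (const s of signals) {
    total += weights[s.source] * s.severity;
    weightSum += weights[s.source];
  }
  return weightSum === 0 ? 0 : Math.round((total / weightSum) * 100);
}
```

A score like this is what the DAO contracts would receive alongside the proposal, so voters see one number backed by several independent context sources.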
I used the Exifr library to extract raw image metadata, EigenLayer to build the AVS, and Pinata's IPFS service to store all files and media for verification. The system integrates the Claude Vision API, Dynamic Wallet for human accessibility, and the Arbitrum Sepolia testnet for the DAO smart contracts and voting. The frontend is built in React, and, last but not least, Ray-Ban Meta smart glasses capture the media.
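The metadata-extraction step boils down to selecting a few fields from the tag object that `exifr.parse()` returns (exifr also computes decimal `latitude`/`longitude` from the GPS tags). The `Evidence` shape and `toEvidence` helper below are hypothetical internal names, sketched under that assumption:

```typescript
// Hypothetical internal type: only the fields the verification layer cares about.
interface Evidence {
  make?: string;                          // camera manufacturer (EXIF "Make")
  takenAt?: Date;                         // capture time (EXIF "DateTimeOriginal")
  gps?: { lat: number; lon: number };     // exifr's computed decimal coordinates
}

// Keep only the fields we need from the raw tag object, e.g. the result of
// `await exifr.parse(imageBytes)`; everything else is discarded before pinning.
function toEvidence(raw: Record<string, any>): Evidence {
  return {
    make: raw.Make,
    takenAt: raw.DateTimeOriginal,
    gps:
      raw.latitude != null && raw.longitude != null
        ? { lat: raw.latitude, lon: raw.longitude }
        : undefined,
  };
}
```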
The end result is a system that successfully bridges the gap between real-world social initiatives and decentralized governance while maintaining data integrity and user trust throughout the entire pipeline.