
tars

TARS: A decentralized network of AI agents that capture, verify, and act on social/environmental issues using EigenLayer AVS. From Ray-Ban Meta glasses to DAO proposals, creating a trustless pipeline for social impact initiatives.

Created At: Agentic Ethereum

Winner of: EigenLayer - Eigen Agents Prize Pool

Project Description

TARS (Transformative Action Recognition System) revolutionizes how we address social and environmental issues by combining wearable tech, AI agents, and blockchain governance.

The problem: While people encounter numerous social issues daily, there is no automated, trustworthy system to document, verify, and act on these observations. Existing solutions are centralized, manual, or unverifiable.

TARS solves this through a four-layer system:

  1. Verification Layer:
  • Custom EigenLayer AVS (Actively Validated Service) ensures media authenticity
  • Decentralized operator network verifies content from Ray-Ban Meta glasses
  • Cryptographic signatures prevent tampering and establish chain of custody
  • Multiple IPFS gateways for reliable, decentralized storage
  2. Media Analysis Layer:
  • AI agent processes verified media using Claude Vision
  • Extracts comprehensive metadata (location, time, context)
  • Aggregates local weather history and relevant news
  • Generates detailed environmental/social impact analysis
  3. Impact Assessment Layer:
  • Specialized agent evaluates issues using multi-factor scoring
  • Automatically generates DAO proposals for high-impact issues
  • Smart contract integration for transparent fund management
  • Community voting and proposal execution
  4. Agent Network Layer:
  • Scalable network for multiple AI agents and wearable devices
  • Automated coordination between verification and analysis agents
  • Cross-agent data sharing and consensus
  • Expandable framework for future agent integration

Key Features:

  • Trustless media verification
  • Automated context gathering
  • Impact-based prioritization
  • Decentralized governance
  • Transparent fund allocation
  • Real-time social issue monitoring

Future Vision: TARS aims to create a global network of AI agents and wearable devices that continuously monitor and address social/environmental issues, making social impact initiatives more efficient, transparent, and actionable.

How it's Made

I built TARS as a bridge between real-world social initiatives and Web3 governance, starting with the challenge of verifying media from Ray-Ban Meta glasses. My core innovation is a custom AVS (Actively Validated Service) on EigenLayer that verifies image authenticity, paired with Claude Vision for AI analysis. This combination allows me to reliably process media from capture to verification while maintaining data integrity through a multi-gateway IPFS storage system using Pinata.
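The multi-gateway IPFS retrieval is conceptually simple. Here is a hedged sketch: the gateway URLs are illustrative (not necessarily the ones TARS uses), and the fetcher is injected so the fallback logic stays testable:

```typescript
type Fetcher = (url: string) => Promise<Uint8Array>;

// Illustrative gateway list: Pinata first, public gateways as fallbacks.
const GATEWAYS = [
  "https://gateway.pinata.cloud/ipfs/",
  "https://ipfs.io/ipfs/",
  "https://cloudflare-ipfs.com/ipfs/",
];

// Try each gateway in order until one returns the content for the CID.
async function fetchFromIpfs(
  cid: string,
  fetcher: Fetcher,
  gateways: string[] = GATEWAYS,
): Promise<Uint8Array> {
  let lastError: unknown;
  for (const base of gateways) {
    try {
      return await fetcher(base + cid);
    } catch (err) {
      lastError = err; // gateway down or timed out; try the next one
    }
  }
  throw new Error(`all gateways failed for ${cid}: ${lastError}`);
}
```

Because content addressing ties the CID to the bytes, any gateway that responds returns the same data, so falling through the list never weakens the verification guarantees.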

The trickiest part was building the impact assessment system that powers the DAO. I developed a network of TypeScript agents using the ELIZA framework to gather context from multiple sources—weather data, local news, and location information—to score social initiatives. These agents feed into Arbitrum-based DAO smart contracts, which handle community voting and fund allocation.
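To make the multi-factor scoring concrete, here is a minimal sketch. The factor names, weights, and the 70-point proposal threshold are all hypothetical stand-ins for whatever the agents actually compute from weather, news, and location context:

```typescript
// Hypothetical factors for scoring a reported issue, each on a 0..10 scale.
interface ImpactFactors {
  severity: number; // from the vision analysis of the media
  urgency: number;  // e.g. weather-driven risk of escalation
  reach: number;    // estimated number of people affected
  evidence: number; // how many AVS operators attested the media
}

// Illustrative weights; in practice these would be tuned by the DAO.
const WEIGHTS: ImpactFactors = { severity: 0.4, urgency: 0.3, reach: 0.2, evidence: 0.1 };

// Weighted sum normalized to a 0..100 score.
function impactScore(f: ImpactFactors): number {
  const keys = Object.keys(WEIGHTS) as (keyof ImpactFactors)[];
  return 10 * keys.reduce((sum, k) => sum + WEIGHTS[k] * f[k], 0);
}

// Above the threshold, the agent drafts a DAO proposal for voting.
function shouldProposeToDao(f: ImpactFactors, threshold = 70): boolean {
  return impactScore(f) >= threshold;
}
```

Keeping the scoring pure (no I/O) like this lets the context-gathering agents and the proposal logic evolve independently: the agents just produce an `ImpactFactors` record.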

I used the Exifr library to extract raw image metadata, EigenLayer to build the AVS, and Pinata IPFS to store all files and media for verification. The system integrates the Claude Vision API, Dynamic Wallet for human accessibility, and Arbitrum Sepolia Testnet for the DAO smart contracts and voting. For the frontend, I used React JS, and last but not least, Ray-Ban Meta Smart Glasses to capture media.

The end result is a system that successfully bridges the gap between real-world social initiatives and decentralized governance while maintaining data integrity and user trust throughout the entire pipeline.

