An immersive metaverse for discovering the on-chain and off-chain stories of every artwork ever created.
Stream of Consciousness is a decentralized immersive discovery engine designed to reimagine how art is archived, valued, and remembered in the age of Web3.
In today’s world, an artwork’s value is never just in the token or the image itself — it’s in the network of meanings that surround it: the artist’s intent, past ownership, critical reception, exhibitions, tweets, and the emotional resonance it leaves behind. Yet, this contextual web is scattered — buried across blockchains, social media, and unindexed archives.
Stream of Consciousness addresses this fragmentation by building an autonomous curatorial layer on top of blockchain infrastructure. It continuously tracks NFT mints across Ethereum, gathers their associated off-chain data, and generates a living, evolving “consciousness” for each artwork — an intelligent profile that weaves together provenance, metadata, and narrative. The community is incentivised to participate actively in the curation process, as contributors are minted credit for an artwork's awakening.
The broader goal is to decentralise curation itself: unless curation is decentralised, culture gets lost, and only what centralised institutions deem important remains.
In Web3, we have the opportunity to democratize that process, allowing any artwork, regardless of fame or wealth, to be awakened, indexed, and preserved.
The project stands as an early prototype of a permanent cultural archive — one that unites blockchain verifiability with AI-powered interpretation. Over time, as more artworks are awakened, a vast decentralized network of living archives begins to emerge — a “Stream of Consciousness” for art itself, flowing across chains and cultures.
The project is built from four integrated layers — the Envio indexer, the Effects data normalisation layer, the AI agent curator, and a React frontend. The Envio indexer continuously listens for every ERC-721 mint event on Ethereum and for events from a custom “Awakening” contract, which fires whenever a new curation is generated.
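As a rough sketch of how a mint ends up in the index, a handler might look like the following. The `Artwork` entity and its fields are hypothetical placeholders rather than the project's actual schema, and the binding assumes Envio's generated TypeScript handler API:

```typescript
// EventHandlers.ts — minimal sketch of an ERC-721 mint handler (illustrative only).
import { ERC721 } from "generated";

const ZERO_ADDRESS = "0x0000000000000000000000000000000000000000";

ERC721.Transfer.handler(async ({ event, context }) => {
  // An ERC-721 mint is a Transfer whose `from` address is the zero address.
  if (event.params.from.toLowerCase() !== ZERO_ADDRESS) return;

  // Persist a hypothetical Artwork entity keyed by contract address + token id.
  context.Artwork.set({
    id: `${event.srcAddress}-${event.params.tokenId.toString()}`,
    contract: event.srcAddress,
    tokenId: event.params.tokenId,
    owner: event.params.to,
    mintedAtBlock: BigInt(event.block.number),
    awakened: false, // flipped when the Awakening contract fires for this token
  });
});
```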
The indexer uses a schema.graphql file for entity definitions and TypeScript event handlers, writing indexed data to a local database and exposing it through a GraphQL endpoint. Multiple RPC URLs are configured in the environment file to distribute load and avoid dropped connections. The Effects layer, comprising tokenURI.ts, sanitize.ts, and metaData.ts, cleans and merges NFT metadata from IPFS, Arweave, and HTTP endpoints, standardising broken or inconsistent data before it reaches the higher layers.
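A simplified version of this normalisation step might look like the sketch below; the gateway hosts and the exact shape of the normalised metadata are assumptions for illustration, not the project's actual configuration:

```typescript
// sanitize.ts-style sketch: turn a raw tokenURI into something fetchable and
// coerce metadata into one consistent shape. Gateways and field names are illustrative.
const IPFS_GATEWAY = "https://ipfs.io/ipfs/";
const ARWEAVE_GATEWAY = "https://arweave.net/";

export function resolveTokenUri(raw: string): string | null {
  const uri = raw.trim();
  if (uri.startsWith("ipfs://")) {
    // Handles both ipfs://<CID> and ipfs://ipfs/<CID> forms.
    return IPFS_GATEWAY + uri.slice("ipfs://".length).replace(/^ipfs\//, "");
  }
  if (uri.startsWith("ar://")) return ARWEAVE_GATEWAY + uri.slice("ar://".length);
  if (uri.startsWith("http://") || uri.startsWith("https://")) return uri;
  if (uri.startsWith("data:application/json")) return uri; // metadata embedded on-chain
  return null; // unknown scheme: skip rather than guess
}

export interface NormalisedMetadata {
  name: string;
  description: string;
  image: string | null;
  attributes: Array<{ trait_type: string; value: unknown }>;
}

export function sanitizeMetadata(raw: Record<string, unknown>): NormalisedMetadata {
  // Tolerate missing or malformed fields instead of failing the whole pipeline.
  return {
    name: typeof raw.name === "string" ? raw.name : "Untitled",
    description: typeof raw.description === "string" ? raw.description : "",
    image: typeof raw.image === "string" ? resolveTokenUri(raw.image) : null,
    attributes: Array.isArray(raw.attributes)
      ? (raw.attributes as Array<{ trait_type: string; value: unknown }>)
      : [],
  };
}
```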
The ASI Agent layer acts as a multistep autonomous curator: it first analyses all indexed information, then searches the web through the Bright Data API for relevant context such as exhibitions, articles, and social posts, and finally synthesises a narrative “curation file” that is linked back to the NFT’s on-chain record.
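These three steps can be read as a small pipeline. The sketch below shows the orchestration only; `queryIndexer`, `searchWeb`, `synthesise`, and `writeCuration` are hypothetical stand-ins for the real integrations (the GraphQL endpoint, the Bright Data API, the language model, and the Awakening contract call):

```typescript
// Agent pipeline sketch: the three curation steps strung together (illustrative only).
interface IndexedArtwork {
  id: string;
  contract: string;
  tokenId: string;
  metadata: { name: string; description: string };
}

interface WebContext {
  url: string;
  snippet: string;
}

async function curateArtwork(
  artworkId: string,
  deps: {
    queryIndexer: (id: string) => Promise<IndexedArtwork>;
    searchWeb: (query: string) => Promise<WebContext[]>;
    synthesise: (artwork: IndexedArtwork, context: WebContext[]) => Promise<string>;
    writeCuration: (artworkId: string, curation: string) => Promise<void>;
  },
): Promise<void> {
  // Step 1: pull everything the indexer already knows about the token.
  const artwork = await deps.queryIndexer(artworkId);

  // Step 2: gather off-chain context (exhibitions, articles, social posts).
  const query = `${artwork.metadata.name} NFT ${artwork.contract}`;
  const context = await deps.searchWeb(query);

  // Step 3: synthesise the narrative "curation file" and link it back on-chain.
  const curation = await deps.synthesise(artwork, context);
  await deps.writeCuration(artwork.id, curation);
}
```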
The React frontend turns this data into an immersive gallery where artworks appear as animated floating cards displaying both on-chain provenance and off-chain context. Each component is connected through lightweight APIs so that when an artwork is “awakened,” its indexer entry, curation link, and display layer all update in real time.
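As an indication of how a card might stay in sync with the indexer, the sketch below polls the GraphQL endpoint for a single artwork; the endpoint URL, query shape, and field names are illustrative assumptions rather than the project's actual frontend code:

```tsx
// ArtworkCard sketch: one floating card combining on-chain and off-chain data.
import { useEffect, useState } from "react";

interface ArtworkView {
  id: string;
  name: string;
  image: string | null;
  owner: string;
  curationUrl: string | null; // set once the artwork has been "awakened"
}

export function ArtworkCard({ id }: { id: string }) {
  const [artwork, setArtwork] = useState<ArtworkView | null>(null);

  useEffect(() => {
    // Poll the indexer's GraphQL endpoint so the card refreshes when an awakening lands.
    const fetchArtwork = async () => {
      const res = await fetch("http://localhost:8080/v1/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          query: `query ($id: String!) {
            Artwork_by_pk(id: $id) { id name image owner curationUrl }
          }`,
          variables: { id },
        }),
      });
      const { data } = await res.json();
      setArtwork(data?.Artwork_by_pk ?? null);
    };
    fetchArtwork();
    const timer = setInterval(fetchArtwork, 15_000);
    return () => clearInterval(timer);
  }, [id]);

  if (!artwork) return null;

  return (
    <div className="floating-card">
      {artwork.image && <img src={artwork.image} alt={artwork.name} />}
      <h3>{artwork.name}</h3>
      <p>Owner: {artwork.owner}</p>
      {artwork.curationUrl ? (
        <a href={artwork.curationUrl}>Read its consciousness</a>
      ) : (
        <p>Not yet awakened</p>
      )}
    </div>
  );
}
```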
Because the system handles large volumes of blockchain and off-chain data, it is computationally demanding; the current version runs entirely on free-tier RPC and API keys, which introduces minor reliability limitations under heavy load. In addition, many centralised institutions restrict access to their data, so supplementary search results or real-time context may occasionally be missing.

