On-chain LLM proxy with automated bill pay: we used Flare to get proxy usage data on-chain and charge for it!
Most of our business logic runs in Cadence on Flow. The frontend is built with TypeScript, HTML, and CSS. We created a Flare connector (in Cadence) that streams per-key LiteLLM usage back on-chain so the contract can govern keys. Flow’s transaction model lets us encode access controls across multiple actions and execute them atomically in a single tx.
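The connector's job described above can be sketched off-chain as a small aggregation step: fold raw per-key LiteLLM usage samples into one cumulative update per key, the shape a contract would consume in a single atomic transaction. This is an illustrative sketch, not the actual connector API; the type and function names (`UsageSample`, `aggregateUsage`) are assumptions.

```typescript
// Hypothetical shape of a per-key usage sample from the LiteLLM proxy.
type UsageSample = { apiKey: string; spendUsd: number; tokens: number };

// One cumulative update per key, ready to relay on-chain in one tx.
type OnchainUpdate = { key: string; cumulativeSpend: number };

// Fold raw proxy samples into per-key totals so the contract sees a
// single consistent update per key rather than a stream of deltas.
function aggregateUsage(samples: UsageSample[]): OnchainUpdate[] {
  const totals = new Map<string, number>();
  for (const s of samples) {
    totals.set(s.apiKey, (totals.get(s.apiKey) ?? 0) + s.spendUsd);
  }
  return [...totals.entries()].map(([key, cumulativeSpend]) => ({
    key,
    cumulativeSpend,
  }));
}
```

Batching like this keeps the on-chain footprint small: however many proxy samples arrive, each key costs one state update per relay round.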
We also provide a Docker Compose setup around the open-source LiteLLM proxy so all model calls go through a controlled endpoint; the proxy runs inside a TEE for added isolation.
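Because the keys the contract issues are OpenAI-compatible, a client only needs to point at the proxy's base URL instead of api.openai.com. A minimal sketch of building such a request, assuming the standard OpenAI chat-completions path and a placeholder proxy URL:

```typescript
// Build an OpenAI-compatible chat request against the LiteLLM proxy.
// The base URL, model name, and key here are illustrative placeholders.
function buildChatRequest(baseUrl: string, apiKey: string, prompt: string) {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Usage (against a locally running proxy):
// const { url, init } = buildChatRequest("http://localhost:4000", key, "hi");
// const res = await fetch(url, init);
```

Since only the base URL changes, existing OpenAI SDK code can be repointed at the proxy without modification.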
On Flow, we built our Cadence smart contracts to run on-chain billing and access control for AI models. Each model gets a resource-backed vault funded in FLOW; the contract issues OpenAI-compatible API keys with optional expirations and scoped permissions. A Cadence/Flare connector streams per-key LiteLLM usage back on-chain via events, enabling real-time metering and threshold enforcement. Flow’s capability system and atomic transactions let us grant/revoke access, fund/defund vaults, and settle usage in a single tx—auto-pausing keys when balances or limits are hit.
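The auto-pause rule above can be sketched as a pure predicate: a key should be paused when it has expired, when its accrued spend reaches its cap, or when the backing vault is empty. This is a hedged TypeScript sketch of the decision logic; the field names are assumptions, and the real enforcement lives in the Cadence contract.

```typescript
// Illustrative per-key state; in the real system this lives on-chain.
type KeyState = {
  spent: number;      // accrued usage, same unit as the vault balance (FLOW)
  limit: number;      // per-key spending cap
  expiresAt?: number; // optional expiration, unix seconds
};

// True when the contract should auto-pause this key.
function shouldPause(key: KeyState, vaultBalance: number, now: number): boolean {
  if (key.expiresAt !== undefined && now >= key.expiresAt) return true; // expired
  if (key.spent >= key.limit) return true;                              // cap hit
  if (vaultBalance <= 0) return true;                                   // vault empty
  return false;
}
```

Keeping the check a pure function of key state, vault balance, and time is what makes it safe to evaluate inside the same atomic transaction that settles usage.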