[Project screenshots]

ZK Section 9

Generating Succinct ZK Proofs for Content-Generating Machine Learning Models; proving that AI-generated content is authentic with ZKML


Created At

ETHSanFrancisco 2022

Winner of

šŸ¤ ENS ā€” Integration Bounty

šŸ”Ÿ Optimism ā€” Top 10 Deployed

šŸ† ETHSanFrancisco Finalist

Project Description

Section 9: Generating Succinct ZK Proofs for Content-Generating Machine Learning Models.

Presentation: https://docs.google.com/presentation/d/1CIxGfM_oySgWkgdNZqh-hoeH8C8zJDxnEpHx_VK6-KQ/

We trained a machine learning model for on-chain AI art generation, then created a novel tool that converts it (and many other models) into a zero-knowledge proof verifiable on the blockchain. Finally, we created a variant of the ERC-721 protocol that harnesses this capability to empower Web 3 art creators.

What we did:

  1. We trained a machine learning model to produce AI-generated content (AIGC).

  2. We encoded the model's art generation process into an arithmetic circuit to be consumed by a zk-SNARK (Zero-Knowledge Succinct Non-Interactive Argument of Knowledge) prover, whose proofs are verified on-chain in Solidity.

  3. We created a novel ERC-721 based NFT protocol incorporating the zk-SNARK verifier that allows AI artists to uniquely mint and certify their creations.

  4. We created & open-sourced a generalized tool & framework to convert almost any machine learning algorithm into a zero-knowledge proof that is easily integrable with the blockchain.

Why is it significant?

  • AI art is taking over!
  • Recent progress in machine learning is generating astonishing results: DALL-E 2, Midjourney, Stable Diffusion, etc.
  • We want to bring that to the Web 3 community!
  • We want to properly assign credit to art creators, art owners, and art "suppliers".

Why is it hard?

  1. Machine learning models are huge: e.g., FastBERT[1] needs ~1800 MFLOPS (million floating-point operations). Normally you can't run that on a blockchain.

  2. Zero-knowledge proofs are hard to generate and operate: ceremony, contribution, beacon, etc. Not to mention the speed: our core circuit took over 2 hours to prove on a beefy 96-core machine.

  3. Frankly, the three key stakeholders don't talk to each other at the moment:

  3.1. Machine learning engineers don't understand the tooling and requirements for converting an algorithm to a blockchain-friendly format.

  3.2. ZKP developers lack visibility into the capabilities of AIGC models and the performance / ecosystem / adoption gap.

  3.3. Creators hoping to leverage both Web 3 & AI lack guidance and support.

  1. Liu, Weijie, et al. "FastBERT: a self-distilling BERT with adaptive inference time." arXiv preprint arXiv:2004.02178 (2020).

How It's Made

Renowned AI artists teaming up with ZK experts and veterans in machine learning model deployment

Contribution 1: A blockchain-friendly AIGC model

We trained a tiny machine learning model[1] on artworks throughout history[2]. This represents a typical workflow for a creator leveraging an AI model. The model takes inspiration from the Wasserstein GAN with gradient penalty, as well as spectral normalization.

We created a novel method that rewrites all floating-point operations in this model into integer-only operations, making it blockchain- and ZK-viable. We combined several techniques from integer-only quantization (previously only available in TF-Lite), ported them to PyTorch, and resolved issues of degraded generator performance under training-time quantization. We also introduced a new integer-only re-quantization scheme that restricts scales to powers of two, and we fused batch normalization and ReLU units into the 2D convolutions for simplified quantized inference (a minimal sketch follows the footnotes below).

  1. It's called a generative adversarial network (GAN), a type of model widely used for AI-generated images.
  2. https://github.com/liaopeiyuan/artbench
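To make the integer-only pipeline concrete, here is a minimal PyTorch-style sketch of the two tricks described above: folding BatchNorm statistics into the preceding convolution, and re-quantizing int32 accumulators with a plain bit shift, which is possible because activation scales are restricted to powers of two. The function names and the int8 range are illustrative assumptions, not the exact code from our repository.

```python
import torch

def fuse_conv_bn(conv: torch.nn.Conv2d, bn: torch.nn.BatchNorm2d) -> torch.nn.Conv2d:
    """Fold BatchNorm statistics into the preceding conv so quantized inference
    only needs a single integer convolution followed by re-quantization."""
    fused = torch.nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                            stride=conv.stride, padding=conv.padding, bias=True)
    with torch.no_grad():
        std = torch.sqrt(bn.running_var + bn.eps)
        scale = bn.weight / std                      # gamma / sqrt(var + eps), per channel
        fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
        conv_bias = conv.bias if conv.bias is not None else torch.zeros(conv.out_channels)
        fused.bias.copy_(bn.bias + (conv_bias - bn.running_mean) * scale)
    return fused

def requantize_pow2(acc: torch.Tensor, shift: int) -> torch.Tensor:
    """Re-quantize an int32 accumulator back to int8. Because activation scales are
    powers of two, the rescale is an arithmetic right-shift instead of a
    fixed-point multiply -- exactly the kind of operation a circuit can afford."""
    return torch.clamp(acc >> shift, -128, 127).to(torch.int8)
```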

Contribution 2: An ML -> ZK model translator

We then generalized our tailored approach into a translator that converts machine learning models written in PyTorch (a very popular framework) into arithmetic circuits in Circom (a popular zk-SNARK circuit language). We first used TorchScript's JIT compiler to generate a computational trace of the generator in fully quantized form, then imported it into an intermediate representation in TVM, which includes passes that automatically translate certain divisions into fixed-point multiplications. We then implemented a graph runtime in Python that builds up a neural network in Circom, supporting a wide range of operators from convolution to matrix multiplication. To mimic the typed and untyped integer arithmetic expressed in TVM, we crafted specialized Circom building blocks with explicit bound checks and represented negative numbers as field elements greater than p/2 (a minimal sketch of that encoding follows).
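As a concrete illustration of the signed-arithmetic trick, here is a small plain-Python sketch (not our actual translator code) of how a signed quantized integer maps into the BN254 scalar field used by Circom/snarkjs, with negative values landing above p/2:

```python
# BN254 scalar field modulus used by Circom / snarkjs circuits.
P = 21888242871839275222246405745257275088548364400416034343698204186575808495617

def to_field(x: int) -> int:
    """Encode a signed quantized integer as a field element; negatives wrap to P - |x|."""
    return x % P

def from_field(e: int) -> int:
    """Decode a field element back to a signed integer: values above P/2 are negative."""
    return e - P if e > P // 2 else e

# Example: an int8 activation of -7 round-trips through the field representation.
assert to_field(-7) == P - 7
assert from_field(to_field(-7)) == -7
```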

Our method is capable of generating models with hundreds of components and millions of parameters - the real limit is the inherent performance of ZK systems.

The code is open-sourced at: https://github.com/zk-ml/uchikoma

Almost infinite room for imagination:

  1. (shown) Vision models -> AIGC

  2. Language models -> chatbot, writing assistant

  3. Linear models and decision trees -> fraud detection, Sybil attack prevention

  4. Multi-modal models -> recommender systems

Contribution 3: A creator-empowering NFT protocol

Finally, we bootstrapped the classic ERC-721 protocol to incorporate ML & ZK capabilities to give artists and creators more control over their generated content.

We used EBMP[1] (created by one of our team members) to render images on-chain directly from data stored in smart contracts. Inspired by dom's technique, it manually concatenates bytes as specified by the BMP format and places the result into an SVG as a base64 string (a minimal off-chain sketch follows).
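To make the rendering trick concrete, here is a hedged off-chain Python sketch of the same idea (EBMP does the byte concatenation in Solidity): hand-assemble a minimal 24-bit BMP and wrap it in an SVG as a base64 data URI. The helper names are ours; the header layout follows the standard BMP format.

```python
import base64
import struct

def bmp_bytes(pixels):
    """pixels: rows (top to bottom) of (r, g, b) tuples; returns a minimal 24-bit BMP."""
    height, width = len(pixels), len(pixels[0])
    row_pad = (4 - (width * 3) % 4) % 4            # each row is padded to 4 bytes
    body = b""
    for row in reversed(pixels):                   # BMP stores rows bottom-up
        for r, g, b in row:
            body += bytes((b, g, r))               # pixel bytes are in BGR order
        body += b"\x00" * row_pad
    # 40-byte BITMAPINFOHEADER: size, width, height, planes, bpp, compression, ...
    info = struct.pack("<IiiHHIIiiII", 40, width, height, 1, 24, 0,
                       len(body), 2835, 2835, 0, 0)
    # 14-byte file header: "BM", file size, reserved, offset to pixel data (14 + 40)
    header = struct.pack("<2sIHHI", b"BM", 14 + 40 + len(body), 0, 0, 54)
    return header + info + body

def svg_data_uri(pixels, scale=16):
    """Embed the raw BMP in an <image> inside an SVG, returned as a base64 data URI."""
    b64 = base64.b64encode(bmp_bytes(pixels)).decode()
    w, h = len(pixels[0]) * scale, len(pixels) * scale
    svg = (f'<svg xmlns="http://www.w3.org/2000/svg" width="{w}" height="{h}">'
           f'<image width="{w}" height="{h}" style="image-rendering:pixelated" '
           f'href="data:image/bmp;base64,{b64}"/></svg>')
    return "data:image/svg+xml;base64," + base64.b64encode(svg.encode()).decode()
```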

The ZK prover attests that the image was indeed generated by the machine learning model. Only the holder of the machine learning algorithm (presumed to be the AI artist) is able to mint from the model. Given a public AIGC model (e.g. Stable Diffusion), each creator can prove their contribution either to the original dataset that trained the model or through carefully designed prompts. Our ZK prover can hold both assertions true without revealing the artists' identities or prompts. We directly export a Solidity-based Groth16 verifier from our R1CS circuits using snarkjs (see the sketch after the footnote below).

  1. https://github.com/0xmostima/EBMP
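For context on how such a verifier is consumed, below is a hedged web3.py sketch of calling the verifyProof function of a snarkjs-exported Groth16 verifier. The ABI fragment, the public-signal count, the contract address, and the RPC endpoint are illustrative assumptions; exact parameter names and array sizes depend on the circuit and the snarkjs version.

```python
from web3 import Web3

# Minimal ABI fragment for the Groth16 verifier that snarkjs exports; the
# public-signal count (here 3) is an assumption about the circuit.
VERIFIER_ABI = [{
    "name": "verifyProof",
    "type": "function",
    "stateMutability": "view",
    "inputs": [
        {"name": "a", "type": "uint256[2]"},
        {"name": "b", "type": "uint256[2][2]"},
        {"name": "c", "type": "uint256[2]"},
        {"name": "input", "type": "uint256[3]"},
    ],
    "outputs": [{"name": "r", "type": "bool"}],
}]

VERIFIER_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder deployment
w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))            # placeholder RPC endpoint
verifier = w3.eth.contract(address=VERIFIER_ADDRESS, abi=VERIFIER_ABI)

def proof_is_valid(a, b, c, public_signals) -> bool:
    """a, b, c are the proof points produced by snarkjs; public_signals are the
    circuit's public outputs (e.g. a commitment to the generated image)."""
    return verifier.functions.verifyProof(a, b, c, public_signals).call()
```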

What this means for our sponsors:

Polygon - converting machine learning models to zero-knowledge proofs has profound implications for expanding the capabilities of blockchains, enabling people in underdeveloped countries to generate income with on-chain AIGC or ZK-ML-enhanced trustless freelancing.

Optimism - governance tech in consensus is tricky. Our tools can enhance existing voting and Sybil resistance techniques by allowing an ML-based approach: imagine a self-evolving DAO smart contract powered by a neural network.

Worldcoin - generating a World ID is an intricate interplay between the Orb hardware and many algorithmic innovations, many of which are machine learning algorithms. Our generalized ML -> ZK model translator could enhance this process by verifying the validity of a World ID without revealing the key IP powering the algorithm.

IPFS & Filecoin - zero-knowledge proofs create orders-of-magnitude overhead in proving-key storage compared to a normal machine learning model deployment. Storing these artifacts on IPFS has many benefits, such as cost and decentralization: a proving key that allows users to easily transfer stablecoins across borders for remittance purposes may otherwise be subject to constant malicious alteration and censorship.

0x - Our generalized ML -> ZK pipeline could help the 0x on-chain infrastructure attain machine-learning-related capabilities, such as ML-algorithm-based auto-traders, wash-trading detectors, smart and resilient governance contracts, etc.

ENS - ENS has a long way to go to be a truly soul-bound identity. Our generalized ML -> ZK pipeline could help the ENS on-chain infrastructure attain machine-learning-related capabilities, improving Sybil attack resistance or enabling trustless biometrics processing.
