
InferAI

Single-click deployment of your ML models on decentralized compute and storage, powered by the Filecoin Virtual Machine, Bacalhau & Libp2p


Created At

HackFS 2023

Winner of


🥇 Bacalhau — Best Use

Project Description

The goal of this platform is to let users deploy their AI models as easily as possible, without having to worry about scaling, management, or cost.

We took a lot of inspiration from services like Vercel & Hugging Face for the kind of easy developer experience we wanted to provide, while adding features we felt were hugely important and missing from typical deployment infrastructure, such as decentralized computation.

The other part of the project is that anyone can choose to become a compute provider and earn tokens. While deploying your model, you choose which compute provider you would like to run it, and you pay as you go as other people use your model.

Users can also explore all the deployed models & compute providers, and try out a model by buying tokens.

How it's Made

The project uses Bacalhau, Docker & Python to make model deployment possible. The model files are stored on IPFS using NFT.Storage. The compute-provider feature is built with FVM & Libp2p.
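
As a rough sketch of how that pipeline might be wired together (the file names, Docker image, and CLI shell-out below are illustrative assumptions, not the project's actual code, and Bacalhau CLI flags vary by version), a deployment could pin the model files to IPFS via NFT.Storage and then submit a Bacalhau job that mounts the resulting CID:

```ts
// deploy.ts — hypothetical sketch of the deployment flow described above.
// Assumes an NFT.Storage API token and a locally installed Bacalhau CLI.
import { NFTStorage, File } from 'nft.storage'
import { execFile } from 'node:child_process'
import { promisify } from 'node:util'
import { readFile } from 'node:fs/promises'

const run = promisify(execFile)

export async function deployModel(modelPath: string, handlerPath: string) {
  // 1. Pin the model weights + inference handler to IPFS via NFT.Storage.
  const client = new NFTStorage({ token: process.env.NFT_STORAGE_TOKEN! })
  const cid = await client.storeDirectory([
    new File([await readFile(modelPath)], 'model.pkl'),
    new File([await readFile(handlerPath)], 'handler.py'),
  ])

  // 2. Submit a Bacalhau job that mounts the CID and runs the Python handler
  //    inside a Docker image (image and mount path are illustrative).
  const { stdout } = await run('bacalhau', [
    'docker', 'run',
    '--input', `ipfs://${cid}:/model`,
    'python:3.10-slim',
    '--', 'python', '/model/handler.py',
  ])

  return { cid, jobOutput: stdout }
}
```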

The lists of deployed models & providers are stored using Polybase.
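
A minimal sketch of what that registry could look like with the Polybase client is shown below; the `Model` collection name, its fields, and the `inferai` namespace are assumptions for illustration, not the project's actual schema.

```ts
// registry.ts — hypothetical sketch of the model registry on Polybase.
import { Polybase } from '@polybase/client'

// Namespace is an assumption; it must match the namespace the schema was deployed to.
const db = new Polybase({ defaultNamespace: 'inferai' })

// Register a newly deployed model (constructor args must match the collection schema).
export async function registerModel(
  id: string,
  name: string,
  cid: string,
  provider: string,
) {
  await db.collection('Model').create([id, name, cid, provider])
}

// List every deployed model, e.g. for the explore page.
export async function listModels() {
  const { data } = await db.collection('Model').get()
  return data.map((record) => record.data)
}
```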

Finally, buying tokens & transactions are handled using Wagmi, RainbowKit & Viem.
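
For illustration, a token purchase could be sent with Viem roughly as below; the contract address, the `buyTokens` function name, the chain, and the price are all assumptions, and in the actual UI the call would typically go through Wagmi hooks with RainbowKit handling wallet connection.

```ts
// buyTokens.ts — hypothetical sketch of the token purchase transaction via Viem.
import { createWalletClient, custom, parseAbi, parseEther } from 'viem'
import { filecoinCalibration } from 'viem/chains'

export async function buyTokens(amountFil: string) {
  const walletClient = createWalletClient({
    chain: filecoinCalibration, // FVM testnet; the actual chain is an assumption
    transport: custom((window as any).ethereum),
  })

  const [account] = await walletClient.getAddresses()

  // Hypothetical payable function on the token contract.
  return walletClient.writeContract({
    account,
    address: '0x0000000000000000000000000000000000000000', // placeholder address
    abi: parseAbi(['function buyTokens() payable']),
    functionName: 'buyTokens',
    value: parseEther(amountFil),
  })
}
```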
