About

Fluence is a DePIN-based cloudless compute platform that provides a resilient, open, and low-cost alternative to traditional cloud computing. The platform aggregates CPU resources from enterprise-grade data centers around the world into a global, decentralized, resilient, and cost-effective network, allowing users to go cloudless and avoid vendor lock-in and arbitrary deplatforming.

Prizes

🧠 Best Use of Fluence Virtual Servers – $5,000
🥇 1st place – $2,500
🥈 2nd place – $1,500
🥉 3rd place – $1,000
A total of $5,000 will be awarded to the most impactful use of Fluence Virtual Servers.

🎯 Goal

Demonstrate a working project using CPU-only VMs from Fluence.

Expectations

> Efficient AI application deployed on Fluence VMs.
> Examples: small/quantized models, inference APIs, agentic LLM backends.
> Ideally, build toward CPU-only compatibility: no GPUs required.
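The "small/quantized models" expectation above usually comes down to storing weights in 8-bit integers instead of 32-bit floats so they fit in cache and run fast on CPUs. A minimal sketch of symmetric int8 quantization in plain Python (illustrative only, not part of any Fluence API or a specific model format):

```python
# Symmetric per-tensor int8 quantization: the core idea behind most
# CPU-friendly quantized model formats. Real toolchains add per-channel
# scales, zero points, and block-wise schemes on top of this.

def quantize_int8(weights):
    """Map a list of floats to int8 values in [-127, 127] with one scale."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate floats from quantized values."""
    return [v * scale for v in q]

weights = [0.31, -1.24, 0.07, 0.98, -0.55]
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, scale, max_err)
```

The round-trip error is bounded by half the scale, which is why 8-bit weights lose little accuracy for most layers while cutting memory traffic by 4x versus float32.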

Qualification Requirements

1. Private GitHub Repo
> Keep your repository private until submission.
> Add @justprosh as a collaborator before the deadline.

2. Comprehensive Documentation
Include a `README.md` with:
> What it does: Project purpose and overview.
> How to set it up: Full setup instructions.
> How to run it: Runtime guidance and usage steps.
> Examples: Demo inputs/outputs, screenshots, or API calls.

3. Deployment & Access
You must:
> Deploy your project on Fluence Virtual Servers.
> Provide one of the following:
- A public endpoint + usage instructions + VM ID, or
- A Terraform or Ansible script to launch your environment on a Fluence VM.

4. Licensing
Include an open-source license (e.g., MIT or Apache 2.0).
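A skeletal `README.md` covering the four documentation points might look like the following (the project name and section contents are placeholders, not requirements beyond those listed above):

```markdown
# My Project

## What it does
One-paragraph overview: what problem it solves and how it uses Fluence VMs.

## How to set it up
Prerequisites, dependencies, and full install steps.

## How to run it
The command(s) to start the service and any configuration needed.

## Examples
Sample inputs/outputs, screenshots, or example API calls,
plus the public endpoint URL and Fluence VM ID if applicable.

## License
MIT (see LICENSE).
```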