
🪼 To qualify, your project must integrate Fluence Cloudless Compute:
• Host your backend or core logic on Fluence CPU Cloud.
• Optionally leverage the Fluence GPU Containers API to power your compute-intensive workloads (e.g., ML inference, rendering).

🪼 Your submission must also include a README.md file covering:
• What it does: a clear project overview and purpose.
• How to set it up: step-by-step setup and installation instructions.
• How you use Fluence: how Fluence APIs, instances, and hardware (CPU/GPU) are used in your architecture.
• How to run it: execution instructions, environment variables, or command examples.
• Examples & visuals: sample inputs/outputs, screenshots, or example API calls (see the sketch after this list).

🪼 Your project must be live and accessible:
• Deployed on the Fluence Platform.
• Include a public endpoint or access link.
• Provide clear usage instructions (how to interact with the deployed version).

🪼 To ensure openness and reusability:
• Include a recognized open-source license in your repository (e.g., MIT, Apache 2.0, or similar).
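To illustrate the "Examples & visuals" and "public endpoint" requirements, here is a minimal sketch of the kind of sample API call a README could include. The URL, port, and `/api/infer` route are placeholders, not part of any Fluence-provided API; substitute whatever your backend deployed on Fluence actually exposes. The point is the pattern: a documented, reproducible call against your live instance.

```python
# Hypothetical sample API call for a README's "Examples & visuals" section.
# The endpoint address and route are placeholders for your own deployment.
import requests

# Replace with the public IP or domain of your Fluence-hosted backend.
FLUENCE_APP_URL = "http://<your-fluence-vm-public-ip>:8080"

def run_inference(prompt: str) -> dict:
    """Send a sample input to the backend hosted on Fluence CPU Cloud."""
    response = requests.post(
        f"{FLUENCE_APP_URL}/api/infer",  # hypothetical route exposed by your app
        json={"prompt": prompt},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Example invocation a judge could copy-paste to verify the live deployment.
    print(run_inference("hello from the judges"))
```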