Blob Merger

Introducing a solution to optimize blob data usage on Ethereum, in line with EIP-4844. Because blobs have a fixed size, not every user fills the entire capacity. Our solution processes submitted blob data and packs it into efficient blobs that maximize the available space.

Created At

ETHGlobal Istanbul

Winner of

Arbitrum - Pool Prize
Flashbots - Best Use of SUAVE 1st Place

Project Description

After EIP-4844, rollups will send data to Layer 1 in blobs. A single blob can carry up to 128 kB of data, but the actual volume is often lower, particularly for rollups with less traffic. Rollups must nevertheless pay fees for the entire blob, even if they use only a portion of it.

Therefore, it is more efficient and economical to merge blobs from different rollups before sending them to L1. We have developed an app that receives blobs from various rollups and merges them into a single new blob containing data from all of them. For example, if Optimism wants to post 60 kB of data, Arbitrum One 24 kB, and Boba Network 33 kB, these could be combined into one blob with a data size of 117 kB. Instead of paying three separate blob fees and occupying three separate blobs, the data is consolidated into a single blob, with the fee distributed among the participants.
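
As a rough illustration of the savings, here is a small Python sketch of the example above. The pro-rata split by data size is our assumption for illustration; the description only says the fee is distributed among the participants.

    # Worked example: three rollups share one 128 kB blob.
    BLOB_CAPACITY_KB = 128
    sizes_kb = {"Optimism": 60, "Arbitrum One": 24, "Boba Network": 33}

    total_kb = sum(sizes_kb.values())   # 117 kB, fits in a single blob
    assert total_kb <= BLOB_CAPACITY_KB

    blob_fee = 1.0                      # normalized cost of one full blob
    # Assumed pro-rata split: each rollup pays in proportion to its data.
    shares = {name: blob_fee * kb / total_kb for name, kb in sizes_kb.items()}
    # Optimism ~0.51, Arbitrum One ~0.21, Boba Network ~0.28 of one blob fee,
    # instead of each paying a full blob fee on its own.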

This solution will make the entire Ethereum L2 ecosystem more cost-efficient and faster, since rollups no longer have to wait until they have accumulated enough data to fill a blob on their own.

How it's Made

We are building on the SUAVE blockchain, and our SUAPP comprises two components: a precompiled contract responsible for merging the supplied data, and a regular app contract that manages the connection between users (typically rollups) and block builders. Users submit their blob data to the app contract on SUAVE, and builders use the contract to receive optimized bundles.
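
A minimal Python sketch of the interaction the app contract manages. The names here (BlobPool, submit_blob, get_bundles) are hypothetical and only model the flow; the real SUAPP is a contract on SUAVE, and the merging precompile is represented as an injected function (see the merge_blobs sketch below).

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class BlobSubmission:
        rollup: str    # submitting rollup, e.g. "Optimism"
        data: bytes    # raw blob payload

    @dataclass
    class BlobPool:
        """Stand-in for the app contract: rollups submit data, builders
        fetch merged blobs. Merging is delegated to the precompile,
        modeled here as a pluggable `merge` function."""
        merge: Callable[[List[bytes]], List[bytes]]
        pending: List[BlobSubmission] = field(default_factory=list)

        def submit_blob(self, submission: BlobSubmission) -> None:
            # Users (typically rollups) submit blob data to the contract.
            self.pending.append(submission)

        def get_bundles(self) -> List[bytes]:
            # Builders call this to receive optimized, merged blobs.
            merged = self.merge([s.data for s in self.pending])
            self.pending.clear()
            return merged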

Since determining the optimal blob composition is an NP-hard problem (specifically, the bin packing problem), we have opted for a simple greedy algorithm to showcase the concept: when blobs are sent to the precompile, they are sorted in descending order of data size. The algorithm selects the largest blob and then repeatedly looks for the next largest blob that still fits, until it has traversed the entire stack. Once the stack has been searched, the merged blob is created, and the algorithm iterates through the remaining stack again to create the next blob. This continues until the stack is empty.
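
A sketch of this greedy pass in Python, under the assumption that each submitted chunk fits in a single blob; the function name and details are illustrative, as the actual implementation lives in the SUAVE precompile.

    def merge_blobs(chunks, capacity=128 * 1024):
        """Greedy packing: sort descending, start each blob from the largest
        remaining chunk, then keep adding the next largest chunk that still
        fits; repeat until the stack is empty."""
        assert all(len(c) <= capacity for c in chunks), "chunk exceeds blob size"
        remaining = sorted(chunks, key=len, reverse=True)
        blobs = []
        while remaining:
            space, packed, leftover = capacity, [], []
            for chunk in remaining:
                if len(chunk) <= space:
                    packed.append(chunk)      # fits: add to the current blob
                    space -= len(chunk)
                else:
                    leftover.append(chunk)    # too big for now: next round
            blobs.append(b"".join(packed))
            remaining = leftover              # iterate again for the next blob
        return blobs

    # 60 kB + 24 kB + 33 kB -> one merged 117 kB blob
    blobs = merge_blobs([b"\x00" * 60_000, b"\x00" * 24_000, b"\x00" * 33_000])
    assert len(blobs) == 1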
