Visualize the beauty of Ethereum transactions with a force-directed D3 graph
Our project offers a captivating replay of Ethereum transactions sourced from Etherscan, focusing on the blockchain's top accounts ranked by ETH value. Using a dynamic force-directed graph, we breathe life into these transactions, representing them as particles that flow from one node to another. The visualization unfolds in fast-forward, condensing time to provide a comprehensive yet expedited view.
In our selection of transactions, we've curated a subset that serves as a microcosm of Ethereum's bustling ecosystem. This includes transactions involving major players like Binance, Robinhood, Crypto.com, Kraken, and other prominent exchanges. By showcasing these transactions, we aim to offer viewers insights into the broader activity within the Ethereum network.
We use Etherscan to obtain the ETH transaction data. We seed our search with the initial big players (taken from the top accounts by value), since we expect them to have a generally large volume of transactions, and then run a hybrid search (combining a BFS/DFS approach) to find relevant data for our graph. Since we want an illustrative and beautiful visualization, we aim to show activity between many different accounts in a web-like fashion, rather than a hub-and-spokes layout around a central node. A purely BFS or DFS approach doesn't lend itself well to this (BFS yields "star-like" hub shapes, DFS yields long chains), so we added a bit of pseudo-randomness to our BFS to make it non-deterministically jump into deeper levels, which worked well. A sketch of this traversal is shown below.
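The following is a minimal TypeScript sketch of that randomized traversal, not the exact implementation. The `Tx` shape, the `fetchTransactions` helper (assumed to wrap Etherscan's account transaction-list endpoint), and the `diveChance` parameter are illustrative names introduced here, not taken from the project.

```ts
// Illustrative shape of a transaction returned by an Etherscan wrapper (assumed).
interface Tx {
  from: string;
  to: string;
  value: string; // amount in wei, as a decimal string
}

// Assumed helper that fetches recent transactions for an address (not shown).
declare function fetchTransactions(address: string): Promise<Tx[]>;

// BFS over accounts, with a pseudo-random chance of "diving" into a newly
// discovered neighbor right away (DFS-like) instead of finishing the frontier.
async function crawl(
  seeds: string[],
  maxNodes: number,
  diveChance = 0.3
): Promise<Tx[]> {
  const queue: string[] = [...seeds];
  const visited = new Set<string>(seeds);
  const edges: Tx[] = [];

  while (queue.length > 0 && visited.size < maxNodes) {
    const current = queue.shift()!;
    const txs = await fetchTransactions(current);

    for (const tx of txs) {
      edges.push(tx);
      const neighbor = tx.from === current ? tx.to : tx.from;
      if (!visited.has(neighbor)) {
        visited.add(neighbor);
        if (Math.random() < diveChance) {
          // Jump into this neighbor next: produces longer, chain-like paths.
          queue.unshift(neighbor);
        } else {
          // Standard BFS behavior: explore the current frontier first.
          queue.push(neighbor);
        }
      }
    }
  }
  return edges;
}
```

Mixing the two expansion orders is what breaks up the pure hub shapes: most nodes still fan out breadth-first, but the occasional depth jump threads longer paths between hubs, giving the web-like structure we wanted.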
On the frontend, we use ThreeJS to render the force-directed graph with the HTML Canvas and WebGL, which helps us handle larger amounts of data (from a rendering POV). d3-force-3d is the underlying engine backing the physics logic within the app. It's a React app built with TypeScript, styled with Tailwind CSS, and deployed on Vercel. We worked on optimizing our data to keep lag minimal: we de-duplicate transactions between two nodes, keeping the transaction data but collapsing it into a single edge, along which we can send multiple particles to visualize the individual transactions.
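As a rough illustration of that de-duplication step, here is a small TypeScript sketch under the same assumed `Tx` shape as above; the `Edge` type and `collapseEdges` function are hypothetical names, and the sketch keeps edges directional (one edge per sender/receiver pair).

```ts
interface Tx {
  from: string;
  to: string;
  value: string; // amount in wei, as a decimal string
}

// One rendered edge per (source, target) pair; the individual transactions are
// retained so the renderer can emit one particle per transaction along the edge.
interface Edge {
  source: string;
  target: string;
  transactions: Tx[];
}

function collapseEdges(txs: Tx[]): Edge[] {
  const edges = new Map<string, Edge>();
  for (const tx of txs) {
    const key = `${tx.from}->${tx.to}`;
    const edge =
      edges.get(key) ?? { source: tx.from, target: tx.to, transactions: [] };
    edge.transactions.push(tx);
    edges.set(key, edge);
  }
  return [...edges.values()];
}
```

The collapsed edge list is what gets handed to the force simulation, so the physics engine only ever sees one link per account pair, while the number of particles animated along each link can still be derived from `transactions.length`.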