Securely guarantee omnipresent data integrity and authenticity, using MPC. We build on TLS Notary, addressing notary centralisation and collusion risk by scaling the computation to a network of notaries within the same session.
TLS (Transport Layer Security) is the protocol that lets people interacting with servers on the internet be confident of both the privacy of their session and the identity of the server they are talking to. During a session, HTTPS keeps the browsing traffic private, and users can verify that the server is who it claims to be via the server's public key. Without this core primitive, the internet as we know it could not exist.
However, TLS does not give the client a signature from the server that can be presented elsewhere as proof that the server's public key transmitted certain information at a certain time to the client in question. This massively limits the interoperability of the internet, creating huge fragmentation between different products, service providers, and networks. For example, if I have 10,000 followers on Twitter, it can be very difficult to indisputably prove that on Reddit. This lack of data portability effectively creates 'vendor lock-in' and undermines any sovereignty claims a user has over their data.
That is, until a group of researchers from PSE at the Ethereum Foundation came together to build TLS Notary, a groundbreaking new technology. It allows, without requiring the consent (or even awareness) of the server or data custodian in question, the key received on the client side to be split into two shards: one for the user and one for a trusted party known as the 'notary'. The notary and the client then combine their shards to produce a shared signature, effectively an attestation that data X was received from server Y at time Z. The sharing is asymmetric in what it reveals: the notary never needs to learn the specific information that the server is sending. This has recently received some hype on Twitter and led to the proliferation of the phrase 'zkTLS', a catch-all term for companies building on and around TLS Notary and its thematic equivalents.
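As rough intuition for the shard mechanics, a two-shard split and recombine can be sketched in Rust. This is a deliberate simplification: the real TLS Notary construction runs MPC over the TLS session itself rather than a plain XOR split, and the function names here are ours for illustration only.

```rust
// Illustrative sketch only: neither party alone holds the full key,
// but together they can recover it. TLS Notary's actual protocol is
// an MPC over the TLS handshake and record layer, not an XOR split.

fn split_key(key: &[u8; 16], mask: &[u8; 16]) -> ([u8; 16], [u8; 16]) {
    // The client keeps `client_shard`; the notary holds the mask as its shard.
    let mut client_shard = [0u8; 16];
    for i in 0..16 {
        client_shard[i] = key[i] ^ mask[i];
    }
    (client_shard, *mask)
}

fn combine(a: &[u8; 16], b: &[u8; 16]) -> [u8; 16] {
    let mut out = [0u8; 16];
    for i in 0..16 {
        out[i] = a[i] ^ b[i];
    }
    out
}

fn main() {
    let key = [0x42u8; 16]; // hypothetical session key
    let mask = [0x17u8; 16]; // random mask; hard-coded here for brevity
    let (client_shard, notary_shard) = split_key(&key, &mask);
    assert_ne!(client_shard, key); // a single shard reveals nothing on its own
    assert_eq!(combine(&client_shard, &notary_shard), key);
    println!("shards recombine to the original key");
}
```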
Despite the TLS Notary team's explicit insistence, both in their docs and in their public talks, that they are not trying to solve notary centralisation and collusion, only to contribute in a novel way to TLS architecture, many teams have swept that concern under the rug and suggested it is not really an issue, essentially saying 'just trust me bro'. We were not sure how big an issue it was, but after hearing the TLS Notary team speak at DevCon and realising they were keen to hear from any teams working on solving it, we decided to explore the space. The implications are profound: unless we can achieve true verifiability of data from TLS Notary, it will always be limited either by trust (not scalable for a global system without uniformly enforced laws) or by use case (i.e. only very low-stakes, and therefore typically inconsequential, data). So, under the currently suggested architecture, we can use a centralised notary to prove that we did a 15-minute run today on Strava with minimal collusion risk, but we cannot prove with the same confidence that Alice sent a $100,000 bank transfer to Bob.
We discovered that the methods currently employed for MPC-TLS in a two-party setting are in fact suboptimal for scaling the number of notaries involved in a session, and this is the point at which most teams seem to have given up. The problem is that the communication overhead of adding notaries is computationally expensive and scales very poorly, making attempts to additively increase the security of the protocol seem futile. However, this is an artefact of the TLS Notary implementation relying on DEAP (Dual Execution with Asymmetric Privacy), a cryptographic protocol that was designed for two-party interactions only, and thus never meant to scale.
Our intention in securing the data sources is to create the most robust possible implementation of the notary interactions, with the option to use as many notaries as necessary: effectively 'N-party MPC', with N theoretically unbounded. So, instead of using DEAP, we built on new research from Aarhus University in Denmark, called MAESTRO (Multi-Party Advanced Encryption Standard), whose authors applied and improved the analysis of a batch-verification technique for checking inner products with logarithmic communication. This means that we can obtain malicious security with almost no communication overhead, something that is not possible using DEAP.
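The N-party share structure can be illustrated with a toy additive (XOR) secret-sharing scheme. This is a deliberate simplification: MAESTRO operates on AES evaluations with maliciously secure, batch-verified inner products, none of which appears here, and the toy PRNG is ours for illustration only.

```rust
// Toy N-party additive secret sharing over bytes (XOR). Simplified:
// the secret is only recoverable when all N parties contribute, which
// is the share structure underlying N-notary setups in spirit.

fn share(secret: u8, n: usize, rng: &mut impl FnMut() -> u8) -> Vec<u8> {
    let mut shares: Vec<u8> = (0..n - 1).map(|_| rng()).collect();
    // The last share fixes the XOR of all shares to equal the secret.
    let last = shares.iter().fold(secret, |acc, s| acc ^ s);
    shares.push(last);
    shares
}

fn reconstruct(shares: &[u8]) -> u8 {
    shares.iter().fold(0u8, |acc, s| acc ^ s)
}

fn main() {
    // Deterministic "rng" for the sketch; a real implementation needs a CSPRNG.
    let mut state = 1u8;
    let mut rng = move || {
        state = state.wrapping_mul(37).wrapping_add(11);
        state
    };
    let shares = share(0x5a, 10, &mut rng); // e.g. 10 notaries
    assert_eq!(shares.len(), 10);
    assert_eq!(reconstruct(&shares), 0x5a);
    // Missing even one share yields a wrong value (with overwhelming
    // probability for truly random shares).
    assert_ne!(reconstruct(&shares[..9]), 0x5a);
    println!("all 10 shares reconstruct the secret");
}
```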
Based on the assumptions made in our modelling, the increase in security guarantee that this leads to is dramatic: assuming that current implementations have a collusion risk of 1 in 1,000, and that our network consists of 10 notaries, our collusion risk would decrease to 3.5 × 10^-11, or roughly one in 30 billion.
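For intuition, the kind of threshold model behind such an estimate can be sketched as follows: assume each notary is corrupt independently with probability p, and compute the chance that at least k of n collude. The parameters and the exact model that produce the figure above belong to our internal modelling and are not reproduced here; this code only shows the general calculation.

```rust
// Illustrative only: probability that at least `k` of `n` notaries are
// simultaneously corrupt, each corrupt independently with probability `p`.
// The model and parameters behind the figure quoted in the text are
// assumptions of our own modelling and are not reproduced here.

fn binomial(n: u64, k: u64) -> f64 {
    // C(n, k) computed multiplicatively to avoid factorial overflow.
    (0..k).fold(1.0, |acc, i| acc * (n - i) as f64 / (i + 1) as f64)
}

/// P[at least k of n notaries collude].
fn collusion_risk(n: u64, k: u64, p: f64) -> f64 {
    (k..=n)
        .map(|j| binomial(n, j) * p.powi(j as i32) * (1.0 - p).powi((n - j) as i32))
        .sum()
}

fn main() {
    let single = collusion_risk(1, 1, 0.001); // one notary: the 1-in-1000 baseline
    let all_ten = collusion_risk(10, 10, 0.001); // all 10 must collude
    println!("single notary: {single:e}");
    println!("all ten collude: {all_ten:e}");
}
```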
We are extremely excited about the financial use cases for creating 'TLS proofs', and believe that this improvement in the security guarantees will enable not only crypto-natives, who are comfortable with game-theoretic and optimistic guarantees, but also other, risk-averse stakeholders to participate in and leverage the increased data availability, without compromising systems or privacy. We categorise this new wave of products as 'TLSfi' and can't wait to see new teams building in this area.
Full code is written in Rust.
Main packages used as a foundation are:
Note that the maestro package is not meant for production, so I hacked it together. A simple example was created for a three-party computation, but this can be expanded to N parties in the future.