GarbleCloud:

  • Redefines the scale-out function of encryption
  • Client-Side Encryption, User-Defined Encryption


GarbleCloud technology differentiation:


Secure computation techniques (background): Techniques for computation over encrypted data have come a long way over the past decade -- the first fully homomorphic encryption scheme (invented by Craig Gentry), adaptations of secure multiparty computation techniques (DARPA's PROCEED program), and tamper-resistant trusted processors/software platforms (Intel SGX) have all been developed. Based on one or more of these techniques, experimental systems such as CryptDB from MIT and commercial products from companies like PreVeil, Enveil, and Baffle have been built. A couple of the companies in the trusted platform/runtime space are Fortanix and Anjuna.


Most of these efforts started with theoretical "secure computation" schemes and tried to implement them in the context of real applications. However, scalability (and therefore broader applicability) remains a significant challenge for all these systems.


GarbleCloud approach to privacy-preserving search on encrypted data: Unlike the above-mentioned systems, we took the opposite approach -- we started with database/information-retrieval techniques and asked how they could be used to compute over data in a "privacy-preserving" manner. The goal was to come up with a data representation (metadata) that exposes only a "minimal" amount of information about the underlying dataset, yet enough to support efficient evaluation of queries over large repositories.


Clearly, indexability of the metadata is a must here, since scalability was the goal. The "security level" of our scheme (measured in information-theoretic metrics) is parametric, i.e., it can be dialed up or down as required (analogous to encryption-key length). The result is a privacy-preserving scheme based on indexable metadata that, in the worst case, is at least as secure as "deterministic encryption" (which is vulnerable to dictionary attacks and frequency-based attacks) and, at the other extreme, achieves the semantic security of randomized encryption schemes (revealing no information whatsoever about the actual data).
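To make the parametric idea concrete, here is a minimal sketch of one way such a dial could work. This is an illustrative assumption, not GarbleCloud's actual scheme: keywords are mapped to indexable tokens via a keyed hash, and a bucket-count parameter controls how many distinct keywords collide onto the same token. Many buckets behaves like deterministic encryption (one token per keyword, fast lookups, but keyword frequencies leak); few buckets means the server sees less structure but returns candidate supersets that the client filters after decryption.

```python
import hmac
import hashlib

# Hypothetical sketch of a parametric, indexable search-metadata scheme.
# All names here (search_token, build_index, num_buckets) are illustrative.

def search_token(key: bytes, word: str, num_buckets: int) -> int:
    """Map a keyword to an indexable token (bucket id) via a keyed hash.

    num_buckets is the security dial: large -> roughly one token per
    keyword (deterministic-encryption-like leakage); small -> many
    keywords share a token, hiding frequencies at the cost of precision.
    """
    digest = hmac.new(key, word.lower().encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % num_buckets

def build_index(key: bytes, docs: dict, num_buckets: int) -> dict:
    """Server-side inverted index: token -> set of document ids.

    The server stores only tokens and (encrypted) doc ids; it never
    sees plaintext keywords.
    """
    index = {}
    for doc_id, words in docs.items():
        for w in words:
            index.setdefault(search_token(key, w, num_buckets), set()).add(doc_id)
    return index

docs = {1: ["alpha", "beta"], 2: ["beta", "gamma"], 3: ["delta"]}
key = b"client-side secret key"

# High setting: tokens are effectively unique per keyword -> precise results.
precise = build_index(key, docs, num_buckets=2**32)
# Low setting: only 2 buckets -> the candidate set is a superset that the
# client post-filters, but the server learns far less about the data.
coarse = build_index(key, docs, num_buckets=2)

print(precise[search_token(key, "beta", 2**32)])  # docs containing "beta"
print(coarse[search_token(key, "beta", 2)])       # superset; client filters
```

Note that in both settings the index is an ordinary inverted index, so standard indexing infrastructure applies unchanged; only the token-to-keyword mapping, and hence the leakage, varies with the parameter.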


There is, of course, a tradeoff between performance and level of security in our privacy-preserving scheme. But even in the worst case, the security it offers is guaranteed to be at least as good as that of deterministic encryption schemes. As processor speeds and main-memory sizes keep increasing, it becomes easier to deliver the same performance at ever higher security levels, approaching that of standard randomized encryption schemes.