One of the world’s biggest technological hurdles is the inability to share data safely and ethically. The problem hampers progress in fields ranging from cybercrime detection to governing nations and responding to natural disasters, NewScientist said.
Now, a novel encryption method can extract value from data without anyone ever seeing it in the clear, allowing Big Data to be used to its full potential while resolving its privacy problems, with positive implications for the future of personal and sensitive information.
“Data is the New Oil”
Fifteen years ago, British mathematician and marketing expert Clive Humby coined the phrase “data is the new oil,” NewScientist wrote. The phrase rings truer than ever today: Big Tech conglomerates like Meta and Alphabet “grew into multibillion-dollar behemoths by collecting information about us and using it to sell targeted advertising.”
Data is not just a goldmine for big business: it also makes us healthier, helps us respond to natural disasters, and can give students extra support by flagging those likely to drop out, NewScientist wrote. To benefit fully from such data, however, a lot of it must be available for analysis.
Data is also essential for fighting crime, particularly cybercrime. Perhaps even more important is medical data: the ability to analyze it seamlessly, particularly with new AI technologies, and share it ethically could yield health benefits for everyone, Dr. Jacques Fellay, head of the Fellay Laboratory at EPFL, told NewScientist.
However, raw data cannot be shared carelessly, because it contains sensitive personal details. Organizations are therefore obliged to keep this data private, not least because regulations like the GDPR make that a legal requirement.
Preserving people’s privacy is more important than ever, as “seemingly insignificant nuggets [of personal data]” can be cross-referenced to identify individuals, or worse, sold to data brokers and used to expose people needlessly, NewScientist said.
Advances in Homomorphic Encryption
A sophisticated privacy-preservation method known as differential privacy obfuscates personal data by statistically injecting errors into query answers, but it has its limits, NewScientist wrote. A more satisfactory route to true privacy is a new development of a 40-year-old encryption method, first described in 1978 at MIT, known as “homomorphic encryption.” The method is akin to letting people put their arms into a sealed glovebox holding a gemstone (the data): they can work on it without having free access to it or the ability to steal it.
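Differential privacy’s error injection can be made concrete. The minimal Python sketch below is an illustration, not anything from the article: the `private_count` query, the example ages, and the epsilon value are all assumed for the example. It adds Laplace noise to a counting query, so no single person’s record noticeably changes the published answer:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(records, predicate, epsilon=1.0):
    # A counting query has sensitivity 1 (adding or removing one person
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [34, 51, 29, 62, 45, 38, 70, 23]   # toy dataset
print(private_count(ages, lambda a: a > 40, epsilon=0.5))
```

Each query returns the true count (here 4) plus random noise; averaging many noisy answers recovers accuracy, which is exactly the privacy-budget limitation the article alludes to.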
Since the 1970s, the underlying mathematics of this kind of encryption have been studied and optimized. FHE, or Fully Homomorphic Encryption, took some 30 years to develop, and “the promise is massive” and the opportunities endless, US Defense Advanced Research Projects Agency (DARPA) program manager Tom Rondeau said. In 2009, Craig Gentry, then a Ph.D. student at Stanford University in California, achieved the breakthrough that made FHE possible, which “was like putting one glovebox inside another, so that the first one could be opened while still encased in a layer of security,” NewScientist said.
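The core idea of computing on ciphertexts predates Gentry’s fully homomorphic breakthrough: textbook RSA, from the same 1970s era, is already partially homomorphic under multiplication. The toy sketch below (deliberately tiny, insecure parameters, for intuition only; it is not an FHE scheme) shows two plaintexts being multiplied without ever being decrypted:

```python
# Toy textbook RSA (no padding; insecure, illustration only).
# RSA is *partially* homomorphic: multiplying two ciphertexts
# yields a ciphertext of the product of the plaintexts.

p, q = 61, 53              # tiny primes for illustration
n = p * q                  # modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

c1, c2 = encrypt(7), encrypt(6)
c_product = (c1 * c2) % n      # computed entirely on ciphertexts
print(decrypt(c_product))      # prints 42, i.e. 7 * 6
```

Gentry’s contribution was a scheme supporting both addition and multiplication an unlimited number of times, which is what makes arbitrary computation on encrypted data possible.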
Thanks to these contributions, IBM now offers FHE tools that can run AI neural networks to detect fraud while the data they operate on stays encrypted throughout. Data thus remains secure even in untrusted third-party environments, and the underlying encryption is also considered future-proof against quantum attacks. The approach thereby removes the usual tradeoff between data privacy and data usability.
Combining FHE With Other Approaches Is the Answer
Combining FHE with another technology known as Secure Multiparty Computation (SMC) has smoothed the sharing of medical data by splitting it among organizations in such a way that no single organization can retrieve the private details. The method is already being used in Switzerland’s university hospitals to share patient data, NewScientist added.
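A common SMC building block is additive secret sharing. The sketch below is a hypothetical illustration of the principle, not the Swiss hospitals’ actual protocol (the hospital counts and party setup are invented for the example): each value is split into random shares that individually reveal nothing, yet an aggregate can still be computed and reconstructed.

```python
import random

PRIME = 2_147_483_647  # all arithmetic is done modulo a public prime

def share(secret: int, n_parties: int):
    # Split a value into n random shares that sum to it mod PRIME.
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Three hypothetical hospitals each hold one patient count. No party
# ever sees another party's raw number: each hospital distributes
# shares of its count, party i locally adds up the i-th share of every
# count, and only the aggregate total is reconstructed.
counts = [120, 85, 97]
all_shares = [share(c, 3) for c in counts]
partial_sums = [sum(s[i] for s in all_shares) % PRIME for i in range(3)]
print(reconstruct(partial_sums))  # prints 302, the total patient count
```

Any single share is a uniformly random number, so a hospital’s count stays private unless all parties collude, which is the property that makes cross-institution medical statistics possible.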
For the moment, such methods are limited by computational speed: the complexity of “lattice” encryption makes processing roughly 100,000 times slower than on unencrypted data, Rondeau added.
FHE is not a standalone winner and will have to be combined with other approaches, but it is a “great addition to that toolbox,” said privacy expert Yves-Alexandre de Montjoye of Imperial College London.
“FHE could be akin to a new mining technology, one that will open up some of the most valuable but currently inaccessible deposits,” NewScientist wrote.