This talk was part of the Opening Conference.
Private AI: Machine Learning on Encrypted Data
Kristin Lauter, Microsoft Research
Wednesday, October 7, 2020
Abstract: As the world adopts Artificial Intelligence, the privacy risks are many. AI can improve our lives, but it may also leak or misuse our private data. Private AI is based on Homomorphic Encryption (HE), a new encryption paradigm that allows the cloud to operate on private data in encrypted form, without ever decrypting it, enabling private training and private prediction. Our 2016 ICML CryptoNets paper showed for the first time that it was possible to evaluate neural nets on homomorphically encrypted data, and it opened new research directions combining machine learning and cryptography. The security of Homomorphic Encryption is based on hard mathematical problems involving lattices, a leading candidate for post-quantum cryptography. Cyclotomic number rings are a good source of the lattices used in practice, which leads to interesting new problems in number theory. This talk will explain Homomorphic Encryption and Private AI, and show demos of HE in action.
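The core idea of HE mentioned in the abstract, computing on ciphertexts without ever decrypting them, can be illustrated with a deliberately tiny sketch. The following is not the scheme used in the talk or in CryptoNets; it is a toy symmetric-key encryption based on the Learning With Errors (LWE) problem over lattices, with parameters chosen for readability rather than security, showing that adding two ciphertexts yields an encryption of the XOR of the underlying bits.

```python
import random

# Toy LWE-style additively homomorphic encryption of single bits.
# Illustrative only: the dimension and noise are far too small to be secure.
n, q = 16, 1 << 15  # lattice dimension and ciphertext modulus

def keygen():
    # Secret vector s, uniform over Z_q^n.
    return [random.randrange(q) for _ in range(n)]

def encrypt(s, bit):
    # Ciphertext (a, b) with b = <a, s> + e + bit * q/2 (mod q),
    # where e is a small noise term that hides the message.
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def decrypt(s, ct):
    # Recover e + bit * q/2; the bit is 1 iff the result is near q/2.
    a, b = ct
    d = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

def add(ct1, ct2):
    # Homomorphic addition: component-wise sum of ciphertexts.
    # Decrypting the sum yields the XOR of the two plaintext bits.
    a1, b1 = ct1
    a2, b2 = ct2
    return [(x + y) % q for x, y in zip(a1, a2)], (b1 + b2) % q

s = keygen()
c0, c1 = encrypt(s, 0), encrypt(s, 1)
print(decrypt(s, add(c0, c1)))  # 0 XOR 1 = 1
print(decrypt(s, add(c1, c1)))  # 1 XOR 1 = 0
```

Note that the party doing the addition never sees the secret key or the plaintext bits; the noise terms simply accumulate, which is why practical HE schemes must manage noise growth across many operations.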