...

We targeted neural network classification and presented the first technique that can transform any common neural network model into a privacy-preserving one without any modification to the training phase. We designed oblivious protocols for the operations routinely used by neural network designers: linear transformations, popular activation functions, and pooling operations. In particular, we used polynomial splines to approximate nonlinear functions (e.g., sigmoid and tanh) with negligible loss in prediction accuracy. None of these protocols requires any change to the training phase of the model being transformed. The online prediction phase uses only lightweight cryptographic primitives such as secret sharing and garbled circuits; request-independent operations are moved to an offline precomputation phase that uses additively homomorphic encryption together with the SIMD batch processing technique. We showed that our approach outperforms existing work in terms of response latency and message size, and we demonstrated its wide applicability by transforming several typical neural network models trained on standard datasets. This work has been published at CCS '17, one of the top-tier conferences in security [5].
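To illustrate the activation-approximation step, the following is a minimal sketch in Python/NumPy (not the MiniONN implementation itself) of replacing sigmoid with a piecewise-linear (degree-1) spline. The approximation interval, knot placement, and segment count are illustrative assumptions, not parameters taken from the paper.

# Sketch: approximate sigmoid with a piecewise-linear spline and measure
# the worst-case error. Interval [-8, 8] and 64 segments are assumptions
# chosen only for illustration.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def build_spline(f, lo=-8.0, hi=8.0, segments=64):
    """Return knots and per-segment (slope, intercept) pairs for a
    piecewise-linear approximation of f on [lo, hi]."""
    knots = np.linspace(lo, hi, segments + 1)
    ys = f(knots)
    slopes = (ys[1:] - ys[:-1]) / (knots[1:] - knots[:-1])
    intercepts = ys[:-1] - slopes * knots[:-1]
    return knots, slopes, intercepts

def eval_spline(x, knots, slopes, intercepts):
    """Evaluate the spline; inputs outside [lo, hi] are clamped to the boundary."""
    x = np.clip(x, knots[0], knots[-1])
    idx = np.clip(np.searchsorted(knots, x, side="right") - 1, 0, len(slopes) - 1)
    return slopes[idx] * x + intercepts[idx]

if __name__ == "__main__":
    knots, a, b = build_spline(sigmoid)
    xs = np.linspace(-10.0, 10.0, 10001)
    err = np.max(np.abs(sigmoid(xs) - eval_spline(xs, knots, a, b)))
    print(f"max absolute error over [-10, 10]: {err:.5f}")

Printing the maximum absolute error gives a sense of why such approximations cost little prediction accuracy: with this (assumed) configuration the deviation from the true sigmoid stays on the order of 10^-3, and each linear segment can then be evaluated with the lightweight primitives described above.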


[1] Ágnes Kiss, Jian Liu, Thomas Schneider, N. Asokan, Benny Pinkas. Private Set Intersection for Unequal Set Sizes with Mobile Applications. In Proceedings of the 17th Privacy Enhancing Technologies Symposium (PETS), Minneapolis, USA, Pages 97-117, July 2017.

...

[5] Jian Liu, Mika Juuti, Yao Lu, N. Asokan. Oblivious Neural Network Predictions via MiniONN Transformations. In Proceedings of the 24th ACM SIGSAC Conference on Computer and Communications Security (CCS), Dallas, Texas, USA, Pages xx-xx, October 2017.

[6] Sara Ramezanian, Tommi Meskanen, Valtteri Niemi. Privacy Preserving Queries on Directed Graph. Submitted to NTMS 2018, Security Track.