AsymML: An Asymmetric Decomposition Framework for Privacy-Preserving DNN
Training and Inference
Leveraging parallel hardware (e.g., GPUs) for deep neural network (DNN) training and inference significantly speeds up computation but raises several data privacy concerns. Trusted execution environments (TEEs) have emerged as a promising solution to enable privacy-preserving inference and training. TEEs, however, have limited memory and computation resources, which renders them …