THESIS
2020
Abstract
With the increasing awareness of privacy protection and the problem of data fragmentation, federated learning has emerged as a new paradigm of machine learning. Federated learning typically employs privacy-preserving mechanisms to protect the intermediate data transferred between parties, among which homomorphic encryption strikes a balance between security and ease of use. However, its complicated operations and large operands impose significant overhead on federated learning, so maintaining accuracy and security more efficiently has become a key problem. In this work, we investigate a hardware solution and design an FPGA-based homomorphic encryption framework that aims to improve the throughput of the training phase in federated learning. The framework implements the representative Paillier homomorphic cryptosystem with high-level synthesis for flexibility and portability, and carefully optimizes the modular multiplication operation, delivering tight scheduling, good resource efficiency, and low host communication overhead. Our accelerator achieves a near-optimal execution clock cycle count, with better DSP efficiency than existing designs, and reduces the encryption time by up to 71% during the training of various federated learning models.
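
For reference, the following is a minimal software sketch of the standard Paillier cryptosystem that the framework accelerates. It is written in Python with toy parameters; the prime sizes and function names are illustrative assumptions, not the thesis's FPGA design. It makes visible why modular exponentiation over n^2, i.e. long chains of wide modular multiplications, dominates the encryption cost that the accelerator targets.

    import math
    import secrets

    def keygen(p, q):
        # Public key: n = p*q with generator g = n + 1 (a common choice).
        n = p * q
        lam = math.lcm(p - 1, q - 1)
        g = n + 1
        # mu = (L(g^lam mod n^2))^{-1} mod n, where L(x) = (x - 1) // n.
        x = pow(g, lam, n * n)
        mu = pow((x - 1) // n, -1, n)
        return (n, g), (lam, mu)

    def encrypt(pub, m):
        n, g = pub
        n2 = n * n
        # Fresh random r coprime to n for each ciphertext.
        while True:
            r = secrets.randbelow(n - 1) + 1
            if math.gcd(r, n) == 1:
                break
        # c = g^m * r^n mod n^2: two modular exponentiations over n^2,
        # the hot spot an FPGA accelerator offloads.
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(pub, priv, c):
        n, _ = pub
        lam, mu = priv
        x = pow(c, lam, n * n)
        return ((x - 1) // n) * mu % n

    # Additive homomorphism: multiplying ciphertexts adds plaintexts,
    # which is what lets federated learning aggregate encrypted updates.
    pub, priv = keygen(293, 433)   # toy primes; real keys use ~1024-bit primes
    c = (encrypt(pub, 7) * encrypt(pub, 5)) % (pub[0] ** 2)
    assert decrypt(pub, priv, c) == 12

With production key sizes (n of 2048 bits or more), every pow call above operates on multi-thousand-bit operands, which is why the modular multiplication kernel is the natural optimization target.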