
Electronic data

  • Privacy-Preserving_Federated_Deep_Learning_for_Cooperative_Hierarchical_Caching_in_Fog_Computing

    Rights statement: ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 1.95 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI:


Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print
Journal publication date: 18/05/2021
Journal: IEEE Internet of Things Journal
Number of pages: 10
Publication status: E-pub ahead of print
Early online date: 18/05/2021
Original language: English

Abstract

Over the past few years, Fog Radio Access Networks (F-RANs) have become a promising paradigm for supporting the tremendously increasing demand for multimedia services by pushing computation and storage functionalities towards the edge of the network, closer to users. In F-RANs, distributed edge caching among Fog Access Points (F-APs) can effectively reduce network traffic and service latency, as it places popular contents in the local caches of F-APs rather than in the remote cloud. Owing to the limited caching resources of F-APs and the spatio-temporally fluctuating content demands of users, many cooperative caching schemes have been designed to decide which contents are popular and how to cache them. However, these approaches often collect and analyse data from Internet-of-Things (IoT) devices at a central server to predict content popularity for caching, which raises serious privacy issues. To tackle this challenge, we propose a Federated Learning based Cooperative Hierarchical Caching scheme (FLCH), which keeps data locally and employs IoT devices to train a shared learning model for content popularity prediction. FLCH exploits horizontal cooperation between neighbouring F-APs and vertical cooperation between the BaseBand Unit (BBU) pool and F-APs to cache contents with different degrees of popularity. Moreover, FLCH integrates a differential privacy mechanism to achieve a strict privacy guarantee. Experimental results demonstrate that FLCH outperforms five important baseline schemes in terms of cache hit ratio while preserving data privacy. The results also show the effectiveness of the proposed cooperative hierarchical caching mechanism.
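For readers unfamiliar with the general approach sketched in the abstract, the snippet below is a minimal, illustrative sketch of federated popularity prediction with differential privacy, not the authors' implementation. It assumes a simple FedAvg-style aggregation, Gaussian noise added to clipped local updates, synthetic per-device request logs, and an illustrative two-tier cache split; all names (e.g. `local_update`, `CLIP_NORM`, `NOISE_STD`) and constants are hypothetical.

```python
# Illustrative sketch (NOT the paper's FLCH implementation): IoT devices train a
# shared content-popularity model on local data, perturb their updates for
# differential privacy, and a server averages the results (FedAvg-style).
import numpy as np

rng = np.random.default_rng(0)

NUM_DEVICES = 10     # IoT devices acting as federated clients
NUM_CONTENTS = 50    # catalogue size; model holds one popularity score per content
ROUNDS = 20
LR = 0.1
CLIP_NORM = 1.0      # clipping bound on local gradients (bounds sensitivity)
NOISE_STD = 0.05     # Gaussian noise scale (privacy/utility trade-off)

# Synthetic request counts standing in for private, on-device IoT request logs.
local_requests = rng.poisson(lam=rng.uniform(1, 5, size=(NUM_DEVICES, NUM_CONTENTS)))
local_freq = local_requests / local_requests.sum(axis=1, keepdims=True)

# Shared model: a popularity-score vector trained to match local request frequencies.
global_model = np.zeros(NUM_CONTENTS)

def local_update(model, freq):
    """One local training step: squared-error gradient, clipped and noised for DP."""
    grad = model - freq                           # gradient of 0.5 * ||model - freq||^2
    norm = np.linalg.norm(grad)
    if norm > CLIP_NORM:
        grad = grad * (CLIP_NORM / norm)          # clip to bound sensitivity
    noisy_grad = grad + rng.normal(0.0, NOISE_STD, size=grad.shape)  # DP noise
    return model - LR * noisy_grad

for _ in range(ROUNDS):
    # Each device trains locally; raw request logs never leave the device.
    client_models = [local_update(global_model, local_freq[d]) for d in range(NUM_DEVICES)]
    # Server-side aggregation: average the client models (FedAvg).
    global_model = np.mean(client_models, axis=0)

# Illustrative hierarchical placement: the most popular contents are cached at F-APs
# (horizontal cooperation), the next tier at the BBU pool (vertical cooperation).
ranked = np.argsort(global_model)[::-1]
fap_cache, bbu_cache = ranked[:10], ranked[10:25]
print("F-AP tier:", fap_cache)
print("BBU tier :", bbu_cache)
```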

Bibliographic note

©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.