
Electronic data

  • Privacy-Preserving_Federated_Deep_Learning_for_Cooperative_Hierarchical_Caching_in_Fog_Computing

    Rights statement: ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 1.95 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print

Standard

Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing. / Yu, Zhengxin; Hu, Jia; Min, Geyong et al.
In: IEEE Internet of Things Journal, 18.05.2021.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Yu, Z, Hu, J, Min, G, Wang, Z, Miao, W & Li, S 2021, 'Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing', IEEE Internet of Things Journal. https://doi.org/10.1109/JIOT.2021.3081480

APA

Yu, Z., Hu, J., Min, G., Wang, Z., Miao, W., & Li, S. (2021). Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing. IEEE Internet of Things Journal. Advance online publication. https://doi.org/10.1109/JIOT.2021.3081480

Vancouver

Yu Z, Hu J, Min G, Wang Z, Miao W, Li S. Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing. IEEE Internet of Things Journal. 2021 May 18. Epub 2021 May 18. doi: 10.1109/JIOT.2021.3081480

Author

Yu, Zhengxin ; Hu, Jia ; Min, Geyong et al. / Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing. In: IEEE Internet of Things Journal. 2021.

Bibtex

@article{6a09da2ddcb947b3b82132b3f7396f69,
title = "Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing",
abstract = "Over the past few years, Fog Radio Access Networks (F-RANs) have become a promising paradigm to support the tremendously increasing demands of multimedia services, by pushing computation and storage functionalities towards the edge of networks, closer to users. In F-RANs, distributed edge caching among Fog Access Points (F-APs) can effectively reduce network traffic and service latency as it places popular contents at local caches of F-APs rather than the remote cloud. Due to the limited caching resources of F-APs and spatio-temporally fluctuant content demands from users, many cooperative caching schemes were designed to decide which contents are popular and how to cache them. However, these approaches often collect and analyse the data from Internet-of-Things (IoT) devices at a central server to predict the content popularity for caching, which raises serious privacy issues. To tackle this challenge, we propose a Federated Learning based Cooperative Hierarchical Caching scheme (FLCH), which keeps data locally and employs IoT devices to train a shared learning model for content popularity prediction. FLCH exploits horizontal cooperation between neighbour F-APs and vertical cooperation between the BaseBand Unit (BBU) pool and F-APs to cache contents with different degrees of popularity. Moreover, FLCH integrates a differential privacy mechanism to achieve a strict privacy guarantee. Experimental results demonstrate that FLCH outperforms five important baseline schemes in terms of the cache hit ratio, while preserving data privacy. Moreover, the results show the effectiveness of the proposed cooperative hierarchical caching mechanism for FLCH.",
author = "Zhengxin Yu and Jia Hu and Geyong Min and Zi Wang and Wang Miao and Shancang Li",
note = "{\textcopyright}2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. ",
year = "2021",
month = may,
day = "18",
doi = "10.1109/JIOT.2021.3081480",
language = "English",
journal = "IEEE Internet of Things Journal",
issn = "2327-4662",
publisher = "IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC",

}
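
The abstract above outlines a federated training loop in which IoT devices keep their request data locally, jointly train a shared content-popularity model, and protect the exchanged updates with a differential privacy mechanism. The Python sketch below is an illustrative reconstruction of that idea only, assuming a simple linear popularity predictor, a few gradient-descent steps per device, and Gaussian-mechanism noise; every function name, hyper-parameter, and data shape is an assumption for illustration, not the authors' FLCH implementation.

# Minimal sketch of the training loop the abstract describes: each IoT device
# fits a local content-popularity model on its own request history, clips and
# noises the resulting update (differential privacy), and only the noisy
# updates are averaged into the shared model. All names and hyper-parameters
# are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def local_update(global_w, X, y, lr=0.1, epochs=5):
    """One device: a few gradient steps of a linear popularity predictor."""
    w = global_w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)    # squared-error gradient
        w -= lr * grad
    return w - global_w                       # share the model update only

def privatize(update, clip=1.0, sigma=0.5):
    """Clip the update norm and add Gaussian noise (Gaussian mechanism)."""
    norm = np.linalg.norm(update)
    update = update * min(1.0, clip / (norm + 1e-12))
    return update + rng.normal(0.0, sigma * clip, size=update.shape)

# Toy federation: 4 devices, 8 popularity features per content item.
n_devices, n_features = 4, 8
devices = [(rng.normal(size=(50, n_features)), rng.random(50))
           for _ in range(n_devices)]
w_global = np.zeros(n_features)

for _ in range(10):                           # federated rounds
    updates = [privatize(local_update(w_global, X, y)) for X, y in devices]
    w_global += np.mean(updates, axis=0)      # server averages noisy updates
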

RIS

TY - JOUR

T1 - Privacy-preserving federated deep learning for cooperative hierarchical caching in fog computing

AU - Yu, Zhengxin

AU - Hu, Jia

AU - Min, Geyong

AU - Wang, Zi

AU - Miao, Wang

AU - Li, Shancang

N1 - ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PY - 2021/5/18

Y1 - 2021/5/18

N2 - Over the past few years, Fog Radio Access Networks (F-RANs) have become a promising paradigm to support the tremendously increasing demands of multimedia services, by pushing computation and storage functionalities towards the edge of networks, closer to users. In F-RANs, distributed edge caching among Fog Access Points (F-APs) can effectively reduce network traffic and service latency as it places popular contents at local caches of F-APs rather than the remote cloud. Due to the limited caching resources of F-APs and spatio-temporally fluctuant content demands from users, many cooperative caching schemes were designed to decide which contents are popular and how to cache them. However, these approaches often collect and analyse the data from Internet-of-Things (IoT) devices at a central server to predict the content popularity for caching, which raises serious privacy issues. To tackle this challenge, we propose a Federated Learning based Cooperative Hierarchical Caching scheme (FLCH), which keeps data locally and employs IoT devices to train a shared learning model for content popularity prediction. FLCH exploits horizontal cooperation between neighbour F-APs and vertical cooperation between the BaseBand Unit (BBU) pool and F-APs to cache contents with different degrees of popularity. Moreover, FLCH integrates a differential privacy mechanism to achieve a strict privacy guarantee. Experimental results demonstrate that FLCH outperforms five important baseline schemes in terms of the cache hit ratio, while preserving data privacy. Moreover, the results show the effectiveness of the proposed cooperative hierarchical caching mechanism for FLCH.

AB - Over the past few years, Fog Radio Access Networks (F-RANs) have become a promising paradigm to support the tremendously increasing demands of multimedia services, by pushing computation and storage functionalities towards the edge of networks, closer to users. In F-RANs, distributed edge caching among Fog Access Points (F-APs) can effectively reduce network traffic and service latency as it places popular contents at local caches of F-APs rather than the remote cloud. Due to the limited caching resources of F-APs and spatio-temporally fluctuant content demands from users, many cooperative caching schemes were designed to decide which contents are popular and how to cache them. However, these approaches often collect and analyse the data from Internet-of-Things (IoT) devices at a central server to predict the content popularity for caching, which raises serious privacy issues. To tackle this challenge, we propose a Federated Learning based Cooperative Hierarchical Caching scheme (FLCH), which keeps data locally and employs IoT devices to train a shared learning model for content popularity prediction. FLCH exploits horizontal cooperation between neighbour F-APs and vertical cooperation between the BaseBand Unit (BBU) pool and F-APs to cache contents with different degrees of popularity. Moreover, FLCH integrates a differential privacy mechanism to achieve a strict privacy guarantee. Experimental results demonstrate that FLCH outperforms five important baseline schemes in terms of the cache hit ratio, while preserving data privacy. Moreover, the results show the effectiveness of the proposed cooperative hierarchical caching mechanism for FLCH.

U2 - 10.1109/JIOT.2021.3081480

DO - 10.1109/JIOT.2021.3081480

M3 - Journal article

JO - IEEE Internet of Things Journal

JF - IEEE Internet of Things Journal

SN - 2327-4662

ER -
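
The abstract also describes a hierarchical placement policy: the most popular contents are cached at the local F-AP, a second tier is spread across neighbouring F-APs through horizontal cooperation, and a further tier is held at the BBU pool through vertical cooperation. The sketch below shows one plausible reading of that placement rule under assumed cache capacities and toy Zipf-like popularity scores; the function and parameter names are hypothetical and not taken from the paper.

# Illustrative-only sketch of hierarchical cache placement: rank contents by
# predicted popularity, keep the top tier at the local F-AP, the next tier at
# neighbouring F-APs (horizontal cooperation), and a further tier at the BBU
# pool (vertical cooperation). Capacities and scores are made-up examples.

def place_contents(popularity, fap_capacity, n_neighbours, bbu_capacity):
    """popularity: dict mapping content_id -> predicted popularity score."""
    ranked = sorted(popularity, key=popularity.get, reverse=True)
    local = ranked[:fap_capacity]
    horiz_span = fap_capacity * n_neighbours
    neighbours = ranked[fap_capacity:fap_capacity + horiz_span]
    bbu = ranked[fap_capacity + horiz_span:
                 fap_capacity + horiz_span + bbu_capacity]
    return {"local_fap": local, "neighbour_faps": neighbours, "bbu_pool": bbu}

# Example: 20 contents with Zipf-like toy scores.
scores = {f"video_{i}": 1.0 / (i + 1) for i in range(20)}
print(place_contents(scores, fap_capacity=3, n_neighbours=2, bbu_capacity=5))
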