Current resource allocation techniques in cellular networks are largely based on the single-slope path loss model, which falls short of accurately capturing the effect of the physical environment. Network densification makes cell patterns more irregular; the multi-slope path loss model therefore more realistically approximates the increased variation in link gains and interference. In this paper, we investigate the impact of multi-slope path loss models, in which different link distances are characterized by different path loss exponents. We propose a framework for joint user association, power, and subcarrier allocation on the downlink of a heterogeneous network (HetNet). The proposed scheme is formulated as a weighted sum-rate maximization problem subject to the users' quality-of-service (QoS) requirements, namely the users' minimum rates, and the base stations' (BSs) maximum transmission power. We then compare the performance of the proposed approach under different path loss models to demonstrate the effectiveness of the dual-slope path loss model relative to the single-slope model. Simulation results show that the dual-slope model yields a significant improvement in network performance over the standard single-slope model by accurately approximating the dependence of the path loss exponent on the link distance. Moreover, it improves user offloading from the macrocell BS to small cells by connecting users to nearby BSs with minimal attenuation. The results also show that, under the dual-slope model, the path loss exponents significantly influence the association of users lying near the critical radius.
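As a minimal sketch of the dual-slope idea discussed above: the path loss gain follows one exponent up to a critical radius and a steeper exponent beyond it, with a continuity factor at the breakpoint. The function name and the default values of the critical radius and the two exponents below are illustrative assumptions, not parameters taken from the paper.

```python
def dual_slope_path_loss(d, r_c=100.0, alpha0=2.0, alpha1=4.0):
    """Dual-slope path loss gain (linear scale) at link distance d.

    Exponent alpha0 applies inside the critical radius r_c and a
    steeper alpha1 beyond it; the r_c**(alpha1 - alpha0) factor keeps
    the model continuous at d = r_c. All parameter values here are
    illustrative defaults, not values from the paper.
    """
    if d <= r_c:
        return d ** (-alpha0)
    return (r_c ** (alpha1 - alpha0)) * d ** (-alpha1)
```

A user just inside `r_c` sees the gentler slope `alpha0`, while one just outside sees `alpha1`, which is why association decisions for users near the critical radius are so sensitive to the chosen exponents.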
©2017 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.