The long-term evolution (LTE) cellular standard has become a ubiquitous communications protocol for commercial mobile broadband and communication services. LTE was designed to outperform its 3rd generation (3G) predecessors such as the universal mobile telecommunications system (UMTS), the global system for mobile communications (GSM), and code division multiple access (CDMA). The design goals of LTE, also known as a 4th generation (4G) system, are to increase downlink and uplink data rates, provide scalable bandwidth, improve spectral efficiency, employ an all internet protocol (IP) network, and support many user types.

To satisfy the increasing demand for spectrum to expand LTE services, the federal government has undertaken several initiatives to open federal spectrum for commercial use and to share spectrum between commercial and federal services. One such area is the advanced wireless services 3 (AWS-3) bands (i.e., 1695-1710 MHz, 1755-1780 MHz, and 2155-2180 MHz). Initially, federal incumbents and early commercial entrants will share the 1755-1780 MHz band. The Defense Spectrum Organization (DSO) sponsored a spectrum sharing project executed by the National Advanced Spectrum and Communications Test Network (NASCTN). The NASCTN project seeks to perform rigorous tests to determine the factors that most strongly affect LTE user equipment (UE) behavior, with the goal of providing statistically significant data to improve the DSO's decision-making process.

To aid the NASCTN project, real-world LTE network path loss must be understood. Quantifying LTE communications network performance requires an understanding of network sizes, antenna heights, and the ground cover in which the network is expected to operate, along with how that ground cover affects the propagation of LTE radio signals. This paper provides a methodology, and associated results, for estimating evolved Node B (eNB) cell tower radii and the corresponding path loss for various ground cover morphologies.
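The text above does not name a specific propagation model, but the dependence of path loss on carrier frequency, antenna heights, cell radius, and morphology can be illustrated with a common empirical model. The sketch below uses the COST-231 Hata model, whose stated validity range (roughly 1500-2000 MHz, base heights 30-200 m, distances 1-20 km) covers the AWS-3 uplink band; the frequency, antenna heights, and distance chosen in the example are illustrative assumptions, not values from this paper.

```python
import math

def cost231_hata_path_loss(f_mhz, d_km, h_bs_m=30.0, h_ue_m=1.5, metropolitan=False):
    """Median path loss (dB) from the COST-231 Hata empirical model.

    Roughly valid for 1500-2000 MHz carriers, 30-200 m base station
    heights, 1-10 m mobile heights, and 1-20 km link distances.
    """
    # Mobile-antenna correction factor for a small/medium city.
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_ue_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    # Clutter correction: 3 dB for dense metropolitan, 0 dB otherwise.
    c = 3.0 if metropolitan else 0.0
    return (46.3 + 33.9 * math.log10(f_mhz)
            - 13.82 * math.log10(h_bs_m)
            - a_hm
            + (44.9 - 6.55 * math.log10(h_bs_m)) * math.log10(d_km)
            + c)

# Illustrative comparison at 1765 MHz (within the 1755-1780 MHz
# shared band) for a hypothetical 5 km cell radius.
for env, metro in (("suburban", False), ("dense urban", True)):
    print(f"{env}: {cost231_hata_path_loss(1765.0, 5.0, metropolitan=metro):.1f} dB")
```

The 3 dB metropolitan offset is one simple way a model encodes ground cover morphology; the methodology in this paper refines such estimates per morphology class.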