
1: Wireless Communication Systems

Preface
The scope of this deliverable is to provide an overview of modern wireless communication systems. The report considers the popular cellular access technologies (GSM, UMTS) as well as WLAN. The basic characteristics and features of each technology are outlined, followed by the respective network architecture and the measurements that enable the deployment of positioning techniques. The rest of this report is structured as follows. Section 1 provides introductory remarks, while Sections 2, 3 and 4 discuss the GSM, UMTS and WLAN networks, respectively. Finally, B3G networks are described in Section 5.

1. Introduction
Wireless networks have significantly impacted the world since their initial deployment, and they have continued to develop while their uses have grown. Cellular phones are nowadays part of huge wireless network systems, and people use mobile phones on a daily basis to communicate with each other and exchange information. Recently, wireless networks have also been used for positioning, in order to enable the provision of location-oriented services to the end-user. Different types of measurements available during standard network and terminal operation, mainly for resource management and synchronization purposes, can be employed to derive the user's location.

2. Global System for Mobile communications


Global System for Mobile communications (GSM) is the most popular standard for mobile phones in the world. GSM is considered a second generation (2G) mobile phone system. The standard is maintained by the 3rd Generation Partnership Project (3GPP), and data communication capabilities were added to the system through later extensions such as GPRS.

GSM is a cellular network, which means that mobile phones connect to it by searching for cells in the immediate vicinity. GSM networks operate in four different frequency ranges. Most GSM networks operate in the 900 MHz or 1800 MHz bands. Some countries, including Canada and the United States, use the 850 MHz and 1900 MHz bands because the 900 and 1800 MHz frequency bands were already allocated. In the 900 MHz band the uplink frequency band is 890-915 MHz, and the downlink frequency band is 935-960 MHz. This 25 MHz bandwidth is subdivided into 124 carrier frequency channels, each spaced 200 kHz apart. Time division multiplexing is used to allow eight full-rate or sixteen half-rate speech channels per radio frequency channel. There are eight radio timeslots (giving eight burst periods) grouped into what is called a Time Division Multiple Access (TDMA) frame. Half-rate channels use alternate frames in the same timeslot. The channel data rate is 270.833 kbit/s, and the frame duration is 4.615 ms. The transmission power in the handset is limited to a maximum of 2 watts in GSM850/900 and 1 watt in GSM1800/1900.

There are four different cell sizes in a GSM network: macro, micro, pico and umbrella cells. The coverage area of each cell varies according to the implementation environment. Macro cells can be regarded as cells where the base station antenna is installed on a mast or a building above average roof top level. Micro cells are cells whose antenna height is under average roof top level; they are typically used in urban areas. Pico cells are small cells whose coverage diameter is a few dozen meters; they are mainly used indoors. Umbrella cells are used to cover shadowed regions of smaller cells and fill in gaps in coverage between those cells. The cell's horizontal radius varies from a couple of hundred meters to several tens of kilometers, depending on antenna height, antenna gain and propagation conditions. The longest distance the GSM specification supports in practical use is 35 km.
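As a concrete illustration of the channel arrangement described above, the following Python sketch (not part of the original report) maps a primary GSM900 ARFCN to its uplink/downlink carrier pair and derives the duration of a single timeslot from the quoted frame parameters; the function name is illustrative.

```python
# Illustrative sketch: GSM900 carrier numbering and basic TDMA timing,
# derived from the figures quoted above (200 kHz spacing, 45 MHz duplex
# separation, 8 timeslots per 4.615 ms frame).

def gsm900_carrier_mhz(arfcn: int) -> tuple:
    """Return (uplink_MHz, downlink_MHz) for a primary GSM900 ARFCN (1..124)."""
    if not 1 <= arfcn <= 124:
        raise ValueError("primary GSM900 ARFCNs range from 1 to 124")
    uplink = 890.0 + 0.2 * arfcn      # 200 kHz channel spacing above 890 MHz
    downlink = uplink + 45.0          # fixed 45 MHz duplex separation
    return uplink, downlink

FRAME_DURATION_MS = 4.615             # TDMA frame duration quoted in the text
TIMESLOTS_PER_FRAME = 8
SLOT_DURATION_MS = FRAME_DURATION_MS / TIMESLOTS_PER_FRAME   # ~0.577 ms burst period

if __name__ == "__main__":
    up, down = gsm900_carrier_mhz(62)  # mid-band channel as an example
    print(f"ARFCN 62: uplink {up:.1f} MHz, downlink {down:.1f} MHz")
    print(f"One timeslot lasts about {SLOT_DURATION_MS:.3f} ms")
```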

The high-level structure of a GSM network, including only the key elements, is depicted in Figure 1. The Base Station Subsystem (BSS) consists of the Base Stations (BS) and the Base Station Controllers (BSC). The BSS is the section of a traditional cellular telephone network which is responsible for handling traffic and signalling between a mobile phone and the Network Switching Subsystem. The BSS carries out transcoding of speech channels, allocation of radio channels to mobile phones, paging, quality management of transmission and reception over the air interface and many other tasks related to the radio network. The Network Switching Subsystem (NSS) is the component of a GSM system that carries out switching functions and manages the communications between mobile phones and the Public Switched Telephone Network (PSTN). The GPRS Core Network is an optional part which allows packet-based Internet connections in 2.5G mobile phone systems. More specifically, the Mobile Switching Centre (MSC) is a sophisticated telephone exchange centre which provides circuit-switched calling, mobility management, and GSM services to the mobile phones roaming within the area that it serves, while the Serving GPRS Support Node (SGSN) is responsible for the delivery of data packets from and to the Mobile Stations (MS) within its geographical service area.

Figure 1: General structure of a GSM network [1].

Based on this structure the basic location system architecture for GSM/GPRS is shown in Figure 2. Additional components include the Serving Mobile Location Center (SMLC) and the Gateway Mobile Location Center (GMLC). The SMLC is the GSM/GPRS network node that operates the location server software, while location data are available to Location Based Services (LBS) applications through the GMLC.

Figure 2: Location system architecture for GSM/GPRS networks [2].

Some location dependent parameters, which are always monitored by the terminal, can be used to perform positioning with the standard Cell-ID technique. These parameters include network identification through the Mobile Country Code (MCC) and Mobile Network Code (MNC), the Location Area Code (LAC) and the unique identification of the serving cell (CI).

Timing measurements, available in the mobile networks, can also be employed to determine the position of the terminal. In the GSM standard, the Timing Advance (TA) value corresponds to the length of time a signal from the mobile phone takes to reach the BS. GSM uses TDMA technology in the radio interface to share a single frequency between several users, assigning sequential timeslots to the individual users sharing a frequency. Each user transmits periodically for less than one-eighth of the time within one of the eight timeslots. Since the users are at various distances from the base station and radio waves travel at the finite speed of light, the precise time at which the phone is allowed to transmit a burst of traffic within a timeslot must be adjusted accordingly. TA is the variable controlling this adjustment [3], [4]. The TA value is normally between 0 and 63, with each step representing an advance of one symbol period, which is approximately 3.69 microseconds. With radio waves travelling at about 300,000,000 metres per second, that is 300 metres per microsecond, one TA step represents a change in round-trip distance (twice the propagation range) of about 1,100 metres. This means that the TA value changes for each 550-metre change in the range between a mobile and the BS. The TA value has been used in order to enhance the accuracy provided by standard Cell-ID techniques.
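The TA figures above translate directly into a coarse distance estimate. The sketch below (illustrative, not from the report) converts a reported TA value into the ring of one-way distances it implies; the helper name and the example value are assumptions.

```python
# Illustrative sketch: converts a GSM Timing Advance (TA) value into an
# approximate distance ring, using the figures quoted above (one symbol
# period ~3.69 us, i.e. roughly 550 m of one-way range per TA step).

SYMBOL_PERIOD_US = 3.69          # one TA step, in microseconds
SPEED_OF_LIGHT_M_PER_US = 300.0  # ~300 m per microsecond

def ta_to_range_m(ta: int) -> tuple:
    """Return the (min, max) one-way distance in metres consistent with a TA value (0..63)."""
    if not 0 <= ta <= 63:
        raise ValueError("TA is reported in the range 0..63")
    step_m = SYMBOL_PERIOD_US * SPEED_OF_LIGHT_M_PER_US / 2.0   # /2: TA reflects the round trip
    return ta * step_m, (ta + 1) * step_m

if __name__ == "__main__":
    lo, hi = ta_to_range_m(3)
    print(f"TA = 3 places the mobile roughly {lo:.0f}-{hi:.0f} m from the BS")
```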

In GSM, time difference measurements, which are called Observed Time Differences (OTD), can also be used for positioning purposes. Unlike TA values, OTD measurements from several BSs are made by the terminal without forcing handover, which makes them more attractive for location. An experimental E-OTD network architecture is depicted in Figure 3. A terminal with modified software is able to report accurate OTD estimates by using sophisticated signal processing algorithms, e.g. multipath rejection, for finding the earliest arriving signal component. These OTD measurements are then sent via Short Message Service (SMS) to a Mobile Location Centre (MLC), which performs the location calculations. This is actually the SMLC described before. The synchronisation of the BSs is achieved by installing receivers similar to the terminal at known locations, typically at the BS sites. These receivers, shown in Figure 2, are known as Location Measurement Units (LMU) that measure Real Time Differences (RTD) between the BSs. Measured RTDs are also sent to the MLC via SMS. Disadvantages of this technique are the need for software modifications to the terminals and the need for additional equipment, i.e. the installation of LMUs. LMUs should be placed everywhere in the network where a location service is offered, at an average rate of 1 LMU per 1.5 BSs [5]. This deployment constraint comes from the requirement for each of the BSs in the network to be observed by at least one LMU. In operational use, the information transfer will take place using specific signalling messages instead of SMS.
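A minimal sketch of the E-OTD arithmetic implied above, assuming the usual convention that a geometric time difference is obtained by correcting the terminal's OTD with the LMU-measured RTD; all names and example values are illustrative.

```python
# Illustrative sketch of E-OTD: the terminal's Observed Time Difference (OTD)
# between two BSs, corrected by the Real Time Difference (RTD) measured by an
# LMU, yields a Geometric Time Difference (GTD) that maps to a difference in
# distances (a hyperbola of candidate positions).

SPEED_OF_LIGHT_M_PER_US = 300.0

def geometric_time_difference_us(otd_us: float, rtd_us: float) -> float:
    """GTD = OTD - RTD (all values in microseconds)."""
    return otd_us - rtd_us

def range_difference_m(gtd_us: float) -> float:
    """Distance difference d(BS1) - d(BS2) implied by a GTD value."""
    return gtd_us * SPEED_OF_LIGHT_M_PER_US

if __name__ == "__main__":
    gtd = geometric_time_difference_us(otd_us=5.2, rtd_us=1.7)
    print(f"GTD = {gtd:.1f} us -> range difference ~ {range_difference_m(gtd):.0f} m")
```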

Figure 3: Network architecture to support E-OTD.

Received Signal Strength (RSS) measurements, collected by monitoring the Broadcast Control Channel (BCCH), where transmission is performed with constant power, can also be utilized to support positioning in GSM networks. These measurements are collected by the terminal as part of its standard functionality.

While the terminal is in In-session mode (i.e. engaged in a wireless session through a dedicated communication channel), all the location dependent parameters described before and the RSS measurements are known at the terminal side, since the RSS values from the serving cell and up to six neighboring cells are reported back to the network through Network Measurement Reports (NMRs) every 480 ms for handover and power control purposes. These reports also contain the Base Station Identity Code (BSIC), which uniquely identifies a neighboring cell when combined with the BCCH value. In the Idle-attached mode (i.e. attached to the network but with no active communication session), the terminal continuously makes RSS measurements and also knows the CI of the serving cell and the CI or BCCH + BSIC of the neighboring cells. This is a standard functionality according to the GSM specifications, to help in the cell selection and re-selection operations. The RSS values are averaged over a period of at least five seconds. According to the GSM specifications [6], the RSS level at the receiver input is measured by the terminal over the full range of -110 dBm to -48 dBm with an absolute accuracy of 4 dB from -110 dBm to -70 dBm under normal conditions and 6 dB over the full range under both normal and extreme conditions. A resolution of 1 dB is used, which results in 64 possible values, and therefore the RSS values are provided in the [0, 63] range.
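The [0, 63] reporting range described above corresponds to a simple 1 dB quantisation of the measured level. The following sketch (illustrative; the exact rounding behaviour at the range edges is an assumption) shows the mapping in both directions.

```python
# Illustrative sketch of the GSM RSS quantisation described above: the received
# level is reported as an integer in 0..63 with 1 dB resolution between
# -110 dBm and -48 dBm.

def rss_dbm_to_reported(rss_dbm: float) -> int:
    """Quantise a measured level in dBm to the 0..63 reporting range."""
    level = int(round(rss_dbm + 110))    # -110 dBm maps near 0, -48 dBm near the top
    return max(0, min(63, level))        # clip to the reportable range

def reported_to_rss_dbm(level: int) -> float:
    """Approximate level in dBm corresponding to a reported value 0..63."""
    return -110.0 + level

if __name__ == "__main__":
    print(rss_dbm_to_reported(-87.4))    # -> 23
    print(reported_to_rss_dbm(23))       # -> -87.0 dBm
```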

3. Universal Mobile Telecommunications System


Universal Mobile Telecommunications System (UMTS) is one of the third-generation (3G) mobile phone technologies. Currently, the most common form uses Wideband Code Division Multiple Access (W-CDMA) as the underlying air interface and is standardized by the 3GPP. UMTS, using W-CDMA, supports up to 14.0 Mbit/s data transfer rates in theory with High-Speed Downlink Packet Access (HSDPA), although at the moment users in deployed networks can expect a performance of up to 384 kbit/s for R99 terminals and 3.6 Mbit/s for HSDPA terminals in the downlink connection. This is still much greater than the 9.6 kbit/s of a single GSM error-corrected circuit switched data channel. The UMTS spectrum allocated in Europe is already used in North America.

The 1900 MHz range is used for 2G services, and the 2100 MHz range is used for satellite communications. Regulators have, however, freed up some of the 2100 MHz range for 3G services, together with the 1700 MHz range for the uplink. UMTS operators in North America who want to implement a European-style 2100/1900 MHz system will have to share spectrum with existing 2G services in the 1900 MHz band. 2G GSM services elsewhere use 900 MHz and 1800 MHz and therefore do not share any spectrum with planned UMTS services.

In the Universal Terrestrial Radio Access Network (UTRAN), a handset (terminal) is called User Equipment (UE) and a base station is called node B. There are two operational modes for UTRAN: Frequency-Division Duplex (FDD) and Time-Division Duplex (TDD). The original standards specifications were developed based on the FDD mode. Figure 4 illustrates the system architecture of UE positioning (UP). The UTRAN interfaces (Uu, Iub, Iur, and Iupc) are used to communicate among all relevant entities. In this figure SRNC stands for Serving Radio Network Controller, LMU for Location Measurement Unit, SAS for Stand-Alone Serving Mobile Location Center (SMLC), and CN for Core Network. LMU type A is a standalone LMU, while type B is integrated with a node B. The Radio Network Controllers (RNCs) are in charge of the network resources, managing the node Bs and specific LMUs in the location process. The SRNC works as the SMLC and receives the location request from an external LBS application or the LBS Client Function in the CN. The SRNC both co-ordinates and controls the overall UE positioning.

Figure 4: System architecture of UE positioning [7].

The SRNC is shown in more detail in Figure 5. The LBS System Operation Function (LSOF) works as a database of the needed information in UE position calculations, e.g. the geographic locations of the node Bs, or in other network operations during the location process. The LBS Server Control Function (LSCF) requests the needed measurements from the UE, LMU or one or more node Bs and sends the results to the appropriate Position Calculation Function (PCF) in the network. The PCF makes the needed coordinate transformations for the UE location estimate. For every location estimate result the PCF estimates the QoS level regarding the achieved accuracy and sends it together with the time-of-day information about the measurement as a part of the reported result. The SRNC can also use the UE location information internally e.g. for location-aided handover [8]. The logical positioning architecture in UMTS does not depend on a single measurement and location technique, but it is able to perform with the standard measurements and techniques available.

Figure 5: Components comprising the SRNC.

The position of the UE can be estimated by using the coverage information of its serving node B, in a Cell-ID based method. This knowledge can be obtained by paging, location area update, cell update, UTRAN registration area (URA) update, or routing area update. Depending on the operational status of the UE, additional operations may be needed in order for the SRNC to determine the cell ID. When the LBS request is received from the CN, the SRNC checks the state of the target UE. If the UE is in a state where the cell ID is not available, the UE is paged so that the SRNC can establish the cell with which the target UE is associated. In states where the cell ID is available, the target cell ID is chosen as the basis for the UE positioning. In soft handover, the UE may have several signal branches connected to different cells while reporting different cell IDs. The SRNC needs to combine the information about all cells associated with the UE to determine a proper cell ID.

The SRNC should also map the cell ID to geographical coordinates or a corresponding Service Area Identity (SAI) before sending it from the UTRAN to the CN. This can easily match the service coverage information available in the CN [7]. In order to improve the accuracy of the LBS response, the SRNC may also request additional measurements from node B or the LMU. These measurements are originally specified for soft handover. For FDD mode, the Round-Trip Time (RTT) can be used as a radius of a cell to further confine the cell coverage. RTT is the time difference between the transmission of the beginning of a downlink Dedicated Physical Channel (DPCH) frame to a UE and the reception of the beginning of the corresponding uplink frame from the UE. For TDD mode, the received (Rx) timing deviation can be used. Rx timing deviation is the time difference between the reception in node B of the first detected uplink path and the beginning of the respective slot according to the internal timing of node B. The measurements are reported to higher layers, where timing advance values are calculated and signalled to the UE. The UMTS bandwidth is 5 MHz and it operates at a high chip rate of 3.84 Mcps, which contributes to the better resolution of timing measurements compared to GSM. The timing resolution in UMTS with one sample per chip is 0.26 μs, which corresponds to a propagation distance of 78 m.
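To make the timing figures above concrete, the sketch below (illustrative names; the one-sample-per-chip assumption is taken from the text) converts an RTT value expressed in chip periods into an approximate one-way node B to UE distance.

```python
# Illustrative sketch: with a chip rate of 3.84 Mcps, one chip period is ~0.26 us,
# i.e. ~78 m of propagation. Since RTT covers the round trip, this corresponds to
# roughly 39 m of one-way range resolution per chip.

CHIP_RATE_MCPS = 3.84
CHIP_PERIOD_US = 1.0 / CHIP_RATE_MCPS          # ~0.26 microseconds
SPEED_OF_LIGHT_M_PER_US = 300.0

def rtt_chips_to_distance_m(rtt_chips: float) -> float:
    """One-way node B to UE distance implied by an RTT measured in chip periods."""
    rtt_us = rtt_chips * CHIP_PERIOD_US
    return rtt_us * SPEED_OF_LIGHT_M_PER_US / 2.0   # /2: RTT covers the round trip

if __name__ == "__main__":
    one_chip_m = CHIP_PERIOD_US * SPEED_OF_LIGHT_M_PER_US
    print(f"One chip corresponds to ~{one_chip_m:.0f} m of propagation "
          f"(~{one_chip_m / 2:.0f} m one-way range resolution)")
    print(f"RTT of 40 chips -> ~{rtt_chips_to_distance_m(40):.0f} m")
```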

RSS measurements are also available in UMTS networks. The most suitable channel for the location measurements in the UE is the primary downlink Common Pilot Channel (CPICH), whose Received Signal Code Power (RSCP) reception level at the UE is used for handover evaluation, downlink and uplink open loop power control and for the pathloss calculations [9]. Every node B sends the Primary CPICH with a fixed data rate of 30 kbps and it is unique for each cell or sector. It is always present on air, under the primary scrambling code with a fixed channel code allocation; thus, the CPICH can be measured at any time by the UEs. The transmission power of the CPICH is set to 10% of the maximum, i.e. 33 dBm, which makes it possible to use the readily available measurements made on the CPICH at the UE also for location purposes. In UMTS, RSS measurements may be slightly more reliable due to the wider bandwidth, which allows better smoothing of fast fading. On the other hand, the hearability problem prevents measurements of as many neighbouring BSs as is possible in GSM.

The standardised measurements suitable for UE positioning in the UTRAN FDD mode [10] are presented in Table 1. Similar measurements are also available in the UTRA-TDD mode [11]. The possible UE modes for making the measurements are also presented. In idle mode the UE is not actively processing a call. Connected UE mode involves two cases: in intra-frequency mode the measurements are made within one carrier frequency, while in inter-frequency mode the carrier frequencies of the measured system and the UE are different, e.g. measurements made from GSM cells.

UE measurements | UE mode of operation | Applicable Location Technique
RSCP on CPICH (for TDD cells a separate measurement) | Idle & Connected, Intra/Inter | Signal Strength
RSS Indicator (RSSI) on a DL carrier | Idle & Connected, Intra/Inter | Signal Strength
UE Transmitted power | Connected, Intra | As a reference level
SFN-SFN observed time difference on CPICH, Type 2 (SFN = System Frame Number), relative time difference between cells i and j | Idle & Connected, Intra/Inter | OTDoA
UE Rx-Tx time difference, for each cell in the active set | Connected, Intra | OTDoA

UTRAN measurements | Applicable Location Technique
Transmitted carrier power | Signal Strength
Transmitted code power, power on the pilot bits of the DPCCH field | As a reference
RTT (TRX - TTX) at node B | OTDoA and hybrids
SFN-SFN observed time difference on CPICH, measured by an LMU | OTDoA

Table 1: Standardized measurements by the UE and UTRAN in FDD mode.

In connected mode the UE continuously measures the detected, usually eight, intra-frequency cells and searches for new intra-frequency cells in the monitoring set, which can involve up to 32 intra- or inter-frequency cells [9]. These intra-frequency measurements and the reporting of the results are typically made with a 200 ms period if no other measurements have been requested. If needed, a specific measurement, e.g. the Rx-Tx time difference, may be requested by the UTRAN. The request involves e.g. the measurement ID, type, reporting quantities and criteria [10], e.g. requirements for periodic or event-triggered reporting.

With event-triggered reporting the UE shall not return the measurement result report until the reporting criteria, i.e. the required accuracy level set in [9] for the measurements, are fulfilled. For intra-frequency and UE internal measurements, specific events can be defined to trigger the terminal to report to the UTRAN.
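The following sketch is a loose illustration (assumed logic, not taken from the 3GPP specifications) of the difference between periodic and event-triggered reporting described above; field names, units and thresholds are made up.

```python
# Illustrative sketch: a UE-side decision on whether to send a measurement report,
# either periodically or only once an event criterion (here, a required accuracy
# level) is fulfilled. All values are hypothetical.

from dataclasses import dataclass

@dataclass
class MeasurementRequest:
    measurement_id: int
    periodic: bool                  # True: periodic reporting, False: event-triggered
    period_ms: int = 200            # typical intra-frequency reporting period
    required_accuracy: float = 0.5  # event criterion (illustrative units)

def should_report(req: MeasurementRequest, elapsed_ms: int, achieved_accuracy: float) -> bool:
    if req.periodic:
        return elapsed_ms % req.period_ms == 0
    return achieved_accuracy <= req.required_accuracy   # report only when the criterion is met

if __name__ == "__main__":
    req = MeasurementRequest(measurement_id=7, periodic=False, required_accuracy=0.5)
    print(should_report(req, elapsed_ms=600, achieved_accuracy=0.8))  # False: criterion not met
    print(should_report(req, elapsed_ms=800, achieved_accuracy=0.4))  # True
```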

4. Wireless Local Area Network


Wireless Local Area Network (WLAN), also known as IEEE 802.11, is a set of standards that enable over-the-air communication over medium range distances (approximately 30-150 m). The 802.11 family currently includes multiple modulation techniques that all use the same basic protocol. The most popular techniques are those defined by the b and g amendments to the original standard. The 802.11b and 802.11g standards use the 2.4 GHz band. Because of this choice of frequency band, 802.11b and 802.11g equipment can occasionally suffer interference from microwave ovens, cordless telephones, or Bluetooth devices. The 802.11a standard uses a different 5 GHz band, which is reasonably free from interference, so 802.11a devices are not affected by products operating in the 2.4 GHz band. The 802.11b radio standard raises the link rate to 11 Mbit/s. The modulation is Complementary Code Keying (CCK), which is based on the original Direct Sequence Spread Spectrum (DSSS) modulation of the 802.11 physical layer. The high rate radio is backwards compatible with the DSSS radio. IEEE 802.11b operates in the 2.4000-2.4835 GHz Industrial, Scientific and Medical (ISM) band, which is license-free and available almost globally.
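For reference, the sketch below (not from the report) computes the centre frequencies of the 2.4 GHz channels used by 802.11b/g; the function name is illustrative.

```python
# Illustrative sketch: centre frequencies of the 2.4 GHz channels used by
# 802.11b/g. Channels are 5 MHz apart while each signal is roughly 20 MHz wide,
# which is why only a few channels (e.g. 1, 6 and 11) can operate without overlap.

def channel_center_mhz(channel: int) -> float:
    """Centre frequency of a 2.4 GHz 802.11 channel (1..13; channel 14 is a special case)."""
    if channel == 14:
        return 2484.0
    if not 1 <= channel <= 13:
        raise ValueError("expected a 2.4 GHz channel number in 1..14")
    return 2412.0 + 5.0 * (channel - 1)

if __name__ == "__main__":
    for ch in (1, 6, 11):
        print(f"channel {ch}: {channel_center_mhz(ch):.0f} MHz")
```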

The 802.11 standard defines both ad-hoc and infrastructure topologies. In the ad-hoc topology, mobile devices communicate on a peer-to-peer basis, whereas in the infrastructure topology the Access Point (AP) is the central control point, which forwards traffic between terminals of the same cell and bridges traffic to the wired LAN. In the latter case an 802.11 wireless network is based on a cellular architecture where the system is subdivided into cells. Each cell is called a Basic Service Set (BSS) and is controlled by an AP. Even though a WLAN may be formed by a single cell, most installations are formed by several cells, where the APs are connected through a backbone structure called the Distribution System (DS), typically Ethernet and in some cases wireless itself. The whole interconnected WLAN, including the different cells, their respective APs and the DS, is seen by the upper layers as a single network, which is called an Extended Service Set (ESS). The typical WLAN architecture, including all the components described previously, is depicted in Figure 6 [12].

Figure 6: WLAN architecture.

IEEE 802.11 defines a maximum transmit power of 100 mW in Europe. This results in a cell size of tens of meters indoors and over a hundred meters outdoors. Consequently, positioning based on Cell-ID is inaccurate, but a number of overlapping WLAN cells should improve the accuracy. However, it is unlikely that the density of overlapping APs is very high in real life conditions. This is because of the high network throughput that can serve a number of client terminals, the relatively high price of APs and the narrow frequency band, which allows only three separate networks or APs to coexist without interference.
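One simple way to exploit several overlapping APs, as suggested above, is an RSSI-weighted centroid of the known AP positions. The sketch below is an assumed illustration rather than a technique taken from the report; the coordinates and weights are made up.

```python
# Illustrative sketch: when several APs with known coordinates are heard, an
# RSSI-weighted centroid can refine a plain Cell-ID estimate.

def weighted_centroid(aps: list) -> tuple:
    """aps: list of (x_m, y_m, rssi_weight); returns the weighted centroid (x, y)."""
    total = sum(w for _, _, w in aps)
    x = sum(x * w for x, _, w in aps) / total
    y = sum(y * w for _, y, w in aps) / total
    return x, y

if __name__ == "__main__":
    heard = [(0.0, 0.0, 60.0), (50.0, 0.0, 30.0), (0.0, 40.0, 10.0)]
    print(weighted_centroid(heard))   # biased towards the strongest AP
```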

Beacon frames are transmitted in IEEE 802.11 WLAN for network identification, broadcasting network capabilities, synchronization and other control and management purposes. In the infrastructure topology, APs transmit beacons, which are repeated periodically according to the beacon interval parameter. IEEE 802.11 defines a synchronization function that keeps the timers of all terminals synchronized. In the infrastructure topology, all terminals synchronize to the AP clock by using the timestamp information of beacon frames. The timer resolution is 1 μs, which is too inaccurate for TOA positioning; a 1 μs error in timing equals a 300 m error in the distance estimate. In addition, the synchronization algorithm of 802.11 maintains the synchronization at an accuracy of about 4 μs [13].

The IEEE 802.11 Medium Access Control (MAC) protocol utilizes carrier sensing based contention. The carrier sensing is based on energy detection or signal quality. The standard specifies the Received Signal Strength Indicator (RSSI), which measures the Radio Frequency (RF) energy received by the radio. In an IEEE 802.11 system, RSSI can be used internally in a wireless networking card to determine when the amount of radio energy in the channel is below a certain threshold, at which point the network card is Clear-To-Send (CTS). Once the card is clear to send, a packet of information can be sent. The end-user will likely observe an RSSI value when measuring the signal strength of a wireless network through the use of a wireless network monitoring tool like Network Stumbler.

RSSI is a one-byte integer value, so RSSI measurements can range from 0 to 255, but the actual range depends on the vendor. A value of 1 indicates the minimum signal strength detectable by the wireless card, while 0 indicates no signal. The value has a maximum of RSSI_Max. For example, Cisco Systems cards return an RSSI of 0 to 100; in this case RSSI_Max is 100 and the Cisco card can report 101 distinct power levels. Another popular WLAN chipset is made by Atheros, and an Atheros based card returns an RSSI value of 0 to 60. The subtlety of 802.11 RSSI comes from how it is sampled: RSSI is acquired during the preamble stage of receiving an 802.11 frame. To this extent 802.11 RSSI has (for the most part) been replaced with the Received Channel Power Indicator (RCPI), a functional measurement covering the entire received frame with defined absolute levels of accuracy and resolution. RCPI is an 802.11 measure of the received RF power in a selected channel over the preamble and the entire received frame. It is defined in IEEE 802.11k, which is a proposed standard for Radio Resource Management (RRM) in WLAN.
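As an illustration of the vendor-specific RSSI ranges mentioned above, the sketch below normalises a raw reading by the card's RSSI_Max so that values from different chipsets become comparable; the helper name is illustrative.

```python
# Illustrative sketch: raw 802.11 RSSI is only comparable across cards after
# normalising by the card's RSSI_Max (e.g. 100 for the Cisco cards and 60 for
# the Atheros cards cited above).

def rssi_to_percent(raw_rssi: int, rssi_max: int) -> float:
    """Normalise a raw RSSI reading to a 0-100% scale for a given RSSI_Max."""
    if raw_rssi < 0 or raw_rssi > rssi_max:
        raise ValueError("raw RSSI must lie in [0, RSSI_Max]")
    return 100.0 * raw_rssi / rssi_max

if __name__ == "__main__":
    print(rssi_to_percent(45, rssi_max=100))  # Cisco-style card -> 45.0 %
    print(rssi_to_percent(45, rssi_max=60))   # Atheros-style card -> 75.0 %
```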

5. Beyond Third Generation (B3G)


A typical Beyond 3G (B3G) system is composed of different access network technologies, supporting, for example, cellular, WLAN, and broadcast access. It relies on enhanced IP networking technologies as the unified networking layer to provide equal network services to all IP-based applications across heterogeneous access networks. This effectively extends the boundary of mobility in an application-transparent manner, thus enlarging the scope of service availability and accessibility to the users. The case of a mobile network in a B3G system is illustrated in Figure 7.

Figure 7: A mobile network in a B3G system.

An indicative positioning methodology, applicable in B3G networks, is depicted in Figure 8 for a terminal moving through an area covered by multiple radio access networks [14]. While in idle mode, the terminal periodically stores all available network measurements, thus forming a list of location related information. Each entry in this list contains the actual measurements and a special field which indicates the corresponding type of access technology, e.g. GSM, UMTS and WLAN RSS values along with the Cell-IDs and AP identities. This can also be extended in order to include all available measurements required by other positioning techniques. Each entry is also time-stamped, and in this way a record of historical information reflecting the terminal's motion is actually created.

When the list is full, updating is performed in a sliding window fashion by discarding the oldest entry and incorporating the current measurement. In the future, multi-homed terminals will have the ability to be simultaneously attached to several wireless access networks. Since these monitoring procedures are part of the terminal's standard functionality, adding a software component to handle the list management is the only modification required at the terminal side. When an LBS session is established, the measurement list, possibly augmented with additional information available during active mode such as TA for GSM, is uploaded to the Positioning Server. Based on the type of collected measurements, the most appropriate positioning algorithm can be selected to provide a rough position estimate for each entry in the list. Subsequently, post-processing techniques can be used to smooth the positioning error in the sequence of position estimates and increase the accuracy of the terminal's current position.
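A minimal sketch of the measurement-list management described above, assuming illustrative data structures rather than the actual STAMP implementation [14]:

```python
# Illustrative sketch: a fixed-size, time-stamped list of heterogeneous
# measurements maintained in a sliding-window fashion, ready to be uploaded
# to a Positioning Server when an LBS session starts.

import time
from collections import deque
from dataclasses import dataclass, field

@dataclass
class MeasurementEntry:
    access_technology: str   # e.g. "GSM", "UMTS", "WLAN"
    identifiers: dict        # e.g. Cell-ID / LAC, or AP identities
    rss_values: dict         # identifier -> RSS reading
    timestamp: float = field(default_factory=time.time)

class MeasurementList:
    def __init__(self, max_entries: int = 50):
        # deque with maxlen discards the oldest entry automatically (sliding window)
        self._entries = deque(maxlen=max_entries)

    def add(self, entry: MeasurementEntry) -> None:
        self._entries.append(entry)

    def upload(self) -> list:
        """Return the history to be sent to the Positioning Server at session setup."""
        return list(self._entries)

if __name__ == "__main__":
    history = MeasurementList(max_entries=3)
    history.add(MeasurementEntry("GSM", {"CI": 4021, "LAC": 101}, {"4021": 23, "4022": 17}))
    history.add(MeasurementEntry("WLAN", {"AP": "00:11:22:33:44:55"}, {"00:11:22:33:44:55": 48}))
    print(len(history.upload()))
```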

Figure 8: Using measurements available in B3G networks for positioning.

References
[1] www.wikipedia.org
[2] Snaptrack, Location Techniques for GSM, GPRS and UMTS Networks, White Paper, 2003.
[3] 3GPP TS 05.10, Radio Subsystem Synchronization.
[4] 3GPP TS 45.010, Radio Subsystem Synchronization.
[5] Omnipoint Technologies, Inc., GSM Mobile Location Systems, Document #0710009-00B, 1999.
[6] 3GPP TS 05.08, Radio Subsystem Link Control (version 8.17.0, Release 1999).
[7] Zhao Y., Standardization of mobile phone positioning for 3G systems, IEEE Communications Magazine, 2002, 40, pp. 108-116.
[8] 3GPP, Technical Specification Group Radio Access Network, Stage 2 Functional Specification of UE Positioning in UTRAN (3G TS 25.305 version 5.0.0), 2001.
[9] 3GPP, Technical Specification Group Radio Access Networks, Requirements for Support of Radio Resource Management (FDD) (3G TS 25.133 version 4.0.0), 2001.
[10] 3GPP, Technical Specification Group Radio Access Network, Physical Layer Measurements (FDD) (3G TS 25.215 version 4.0.0), 2001.
[11] 3GPP, Technical Specification Group Radio Access Network, Physical Layer Measurements (TDD) (3G TS 25.225 version 4.0.0), 2001.
[12] Pablo Brenner, A Technical Tutorial on the IEEE 802.11 Protocol, BreezeCom Wireless Communications, 1997.
[13] Antti Kotanen et al., Positioning with IEEE 802.11b Wireless LAN, PIMRC, 2003.
[14] C. Laoudias, C. Panayiotou, C. Desiniotis, J. G. Markoulidakis, J. Pajunen, S. Nousiainen, STAMP: A Positioning Methodology and Architecture, submitted to Computer Communications Special Issue on Advanced LBS.
