Draft QoS QoE-Guidelines (Rev1407)


INFORMATION COMMUNICATION TECHNOLOGIES
QUALITY OF SERVICE AND QUALITY OF EXPERIENCE GUIDELINES

Date: September 2022

Table of Contents
PART I ................................................................................................................................... 3
PRELIMINARY PROVISIONS ....................................................................................................5
OBJECTIVES .....................................................................................................................................................5

PART II .................................................................................................................................. 6
DEFINITIONS ..............................................................................................................................7
A. TECHNICAL PARAMETERS ...............................................................................................................10
B. NON-TECHNICAL PARAMETERS .....................................................................................................14

PART III ............................................................................................................................... 15


COMMUNICATION SERVICE LICENSEE OBLIGATIONS..................................................... 15
PART IV ............................................................................................................................... 15
COMPLIANCE ........................................................................................................................... 15
Monitoring .........................................................................................................................................................16
Inspection .........................................................................................................................................................16
Enforcement .....................................................................................................................................................17

PART V ................................................................................................................................ 17
REPORTING .............................................................................................................................. 17
Reporting ..........................................................................................................................................................17
Record Keeping ...............................................................................................................................................17
Auditing .............................................................................................................................................................18
Publication ........................................................................................................................................................18

PART VI ............................................................................................................................... 18
SERVICE INTERRUPTIONS .................................................................................................... 19
Planned Service Interruptions .......................................................................................................................19
Unplanned Service Interruptions ...................................................................................................................19

PART VII .............................................................................................................................. 20


REVIEW ..................................................................................................................................... 20
PART VIII ............................................................................................................................. 20
IMPLEMENTATION................................................................................................................... 20
SCHEDULES .......................................................................................................................... 20
SCHEDULE 1 ......................................................................................................................... 21
QUALITY OF SERVICE PUBLIC SWITCHED TELEPHONE SERVICES ............................................. 21
SCHEDULE 2 ......................................................................................................................... 22
QUALITY OF SERVICE PARAMETERS FOR MOBILE SERVICES ...................................... 22
SCHEDULE 3 ........................................................................................................................ 24

QUALITY OF SERVICE PARAMETERS FOR FIXED INTERNET SERVICES ...................................... 24
SCHEDULE 4 ......................................................................................................................... 27
QUALITY OF SERVICE PARAMETERS FOR MOBILE INTERNET SERVICES .................................. 27
SCHEDULE 5 ......................................................................................................................... 29
QUALITY OF SERVICE FOR VoLTE SERVICES ........................................................................... 29
SCHEDULE 6 ......................................................................................................................... 30
QUALITY OF SERVICE FOR INTERCONNECTION ...................................................................... 30
SCHEDULE 7 ......................................................................................................................... 31
QUALITY OF EXPERIENCE (NON-TECHNICAL PARAMETERS) ................................................... 31
ANNEXURE A ....................................................................................................................... 32
SITE CLASSIFICATIONS .................................................................................................................. 32

PART I - INTRODUCTION

The Botswana Communications Regulatory Authority (“BOCRA” or “the
Authority”) is a statutory body established under the Communications
Regulatory Authority Act of 2012 (the Act). The Authority is mandated to
apply the provisions of the Act in a manner which promotes the efficient
provision of communications services throughout the country. The Act is
available from Government Printers in Gaborone, Botswana, or may be
obtained from the following website: http://www.bocra.org.bw.

In accordance with Section 6 of the CRA Act, BOCRA is mandated, amongst
other things, to undertake the following:

a. Protect and promote the interests of consumers, particularly in
respect of the prices charged for, and the availability, quality and
variety of, services and products and, where appropriate, the
variety of services and products offered throughout Botswana,
such as will satisfy all reasonable demands for those services
and products; and

b. Monitor the performance of regulated sectors in relation to the
levels of investment, availability, quantity, quality and standards
of services, competition, pricing, cost of services, efficiency of
production and distribution of services, and any other matters
decided upon by the Authority.

EXPLANATORY NOTE

1. The primary object of these Guidelines is to establish a framework
within which Telecommunication Service Providers can report on
network quality of service performance.
2. All Telecommunication Service Providers shall comply with the terms
of these Guidelines.
3. The Guidelines are divided into nine parts. A summary of each Part is
provided below.
4. Part I sets out the background to the Guidelines and their objectives,
and deals with the responsibility of the Authority under the
Communications Regulatory Authority Act. The intention is to show the
legal basis for the work of the Authority and its power to enforce
compliance with the set Quality of Service standards.
5. Part II is dedicated to the abbreviations and definitions of terms used
in this document.
6. Part III gives the communication service licensee obligations.
7. Part IV details compliance with the Guidelines.
8. Part V discusses how the service providers should report, how they
should keep performance records, how auditing will be carried out,
and the requirements for publication of performance.
9. Part VI gives information on reporting planned and unplanned
interruptions of service.
10. Part VII gives information on how frequently these Guidelines will
be reviewed.
11. Part VIII describes the implementation, that is, the coming-into-effect
date.
12. Part IX discusses the schedules of Key Performance Indicators for
fixed services (voice and data), mobile services (voice and data),
interconnection, Voice over LTE, and the non-technical key
performance indicators that capture quality of experience from
consumers.

The review of these Guidelines is intended to incorporate technological
changes, how service providers should report network interruptions, and how
performance should be reported across different areas and technologies.

These Guidelines will be available for public comment until 30 June 2022.
All comments, questions and requests for clarification should be forwarded to:

Mr Tebogo Ketshabile at ketshabile@bocra.org.bw; or

Ms Cynthia Jansen at jansen@bocra.org.bw

PRELIMINARY PROVISIONS

The Authority means the Botswana Communications Regulatory Authority
(BOCRA).

In these Guidelines, unless the context requires otherwise, the Act means
the Communications Regulatory Authority Act (CRA Act) of 2012.

These Guidelines may be referred to as the Information and
Communications Technologies Quality of Service (ICT QoS) and Quality of
Experience (QoE) Guidelines of 2022.

These Guidelines shall apply to all Licensees offering Internet and
Telecommunications services in Botswana.

The Authority, in consultation with industry stakeholders, may from time to
time review the Guidelines to cater for the ever-changing landscape of the
telecommunications industry and to align with the National Broadband
Strategy.

OBJECTIVES

The objectives of these Guidelines are to:

(a) Protect and promote the interests of consumers of Internet and
Telecommunications services;
(b) Provide measurement options for quality of service;
(c) Monitor quality of experience; and
(d) Promote competition amongst the service providers.

PART II

ABBREVIATIONS

5G - Fifth Generation Network
CSR - Call Success Rate
CSSR - Call Set Up Success Rate
CST - Call Setup Time
DCR - Drop Call Ratio
DNS - Domain Name System
DSR - Delivery Success Rate
FTP - File Transfer Protocol
GSM - Global System for Mobile Communications
HSR - Handover Success Rate
HTTP - Hypertext Transfer Protocol
ISSR - Internet Setup Success Rate
LTE - Long Term Evolution
MCS - Mobile Coverage Strength
MOS - Mean Opinion Score
MTTR - Mean Time To Repair
NA - Network Availability
NER - Network Efficiency Ratio
PDD - Post Dialing Delay
POI - Point Of Interconnection
QoE - Quality of Experience
QoS - Quality of Service
RSR - Registration Success Rate
RSR - Resolution Success Rate
SMS - Short Message Service
UMTS - Universal Mobile Telecommunications Service
WWW - World Wide Web

DEFINITIONS

Access Network Utilization - is the total traffic between the access node and
the aggregation node.

Billing Accuracy (BA) - is the requirement that the same duration in seconds
used for a call is used for charging.

Billing Complaint Rate (BCR) - is the percentage of customer billing-related
complaints per reporting period. Both the level of quality of service offered to
the consumer and the perceptions and/or experience of the quality of service
offered are critical factors in monitoring quality of service.

Call Centre Operator Response (CCOR) - is the duration from sending a
request to speak to the Operator to the time that the Operator's response is
heard.

Call Connection Failure (CCF) – is the percentage of unsuccessful calls.

Call Set-up Success Rate (CSSR) - is the ratio of total number of successful
calls to the total number of all call attempts made on the network during a
specified period.

Call Setup Time (CST) - is the duration from when a call is made to the time
of receiving a ring-back tone.

Complaint Resolution Time (CRT) - is the time taken for a service provider
to resolve a complaint.

Data Access Success Rate - is the probability of success in connecting to
the public server.

Data Transmission Rate - is the speed of data travelling from the user to the
network and back.

DNS Resolution Success Rate - is the likelihood that a domain name is
successfully converted into an IP address by a DNS resolver.

DNS Resolution Time - is the time taken for a DNS resolver to translate a
domain name into an IP address.

DNS Resolver - also known as a resolver, is a server on the internet that is
responsible for converting domain names to IP addresses.
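As an illustration of the two DNS measures defined above (this sketch is not part of the Guidelines; the function names are invented here, and a formal measurement campaign would follow procedures such as those in ETSI TS 102 250-2), resolution timing and the success rate can be computed as:

```python
import socket
import time

def resolve_with_timing(hostname):
    """Attempt one DNS resolution; return (succeeded, elapsed_seconds)."""
    start = time.perf_counter()
    try:
        # Resolves via the system's configured DNS resolver.
        socket.getaddrinfo(hostname, None)
        return True, time.perf_counter() - start
    except socket.gaierror:
        return False, time.perf_counter() - start

def dns_resolution_success_rate(results):
    """DNS Resolution Success Rate = successful resolutions / total attempts * 100%."""
    successes = sum(1 for ok, _ in results if ok)
    return 100.0 * successes / len(results)
```

For example, three successful attempts out of four give a success rate of 75%.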

Drop Call Ratio (DCR) - is the percentage of calls connected to intended
recipients that ended without the intervention of any of the users.

FTP - is the standard network protocol used for computer file transfer
between a client and a server.

FTP Drop Rate - is the percentage of incomplete data transfers that were
started successfully.

FTP Mean Data Rate [Mbit/s] - is the average data transfer rate measured
through the entire connect time to the service.

FTP Set-up Time - is the duration to access the service successfully, from
starting the dial-up connection to the point of time when the content is sent
or received.

Handover Success Rate (HSR) - is the ratio of successful handover calls to
the total number of handover call attempts made. Handover is the process
by which a mobile telephone call is transferred from one base station to
another as the subscriber passes the boundary of a cell. [Recommendation
ITU-T Q.1005].

HTTP - is the underlying protocol used by the World Wide Web that defines
how messages are formatted and transmitted, and what actions the web
server and browser should take in response to various commands.

HTTP Drop Rate – is the percentage of incomplete data transfers that were
started successfully.

HTTP Mean Data Rate - is the average data transfer rate measured through
the entire connection time to the service.

HTTP Set-up Time – is the duration between the instant when the request
of the web page is sent to the instant when the beginning of the web page is
received.

Interconnection Route Utilization (IRU) – is the percentage of provisioned
interconnection route(s) that carry traffic.

Latency - is the round-trip time taken by a standard packet to travel across
the network from the end user to the test equipment and back to the user.

Mean Opinion Score (MOS) - is a numerical value that measures user
experience and the factors that influence voice quality. [Recommendation
ITU-T P.863].

Mean Time To Repair (MTTR) - is the duration from a reported fault to
service restoration.

Mobile Coverage Strength - is the transmitter power output as received by
a reference antenna at a distance from the transmitting antenna.

Network Availability (NA) - is the degree to which the network is operable
and not in a state of failure or outage at any point in time.

Network Efficiency Ratio (NER) - is the ability of the network to deliver calls
to the furthest terminal. It expresses the relationship between the number of
seizures and the sum of seizures resulting in either an answer message, a
user busy or a no-answer ring. [Recommendation ITU-T E.425].

Packet Loss - is the percentage of data packets transmitted from the source
that fail to arrive at their destinations.

Point of Interconnection Congestion - is the percentage of congestion at the
point of interconnection. [Recommendation ITU-T E.847].

Post Dialing Delay (PDD) - is the time interval in seconds between the end
of dialing by the caller and the reception of the network response. Equivalent
to Call Setup Time, as defined in [Recommendation ITU-T E.800].

Provision of Service - is the time taken to provide service to a location
where it is required.

Quality of Experience (QoE) - is the consumer perception, or experience,
of the quality of the service offered.

Quality of Service (QoS) - is the statement of the level of quality of the
service as offered to the consumer by a service provider. [ITU-T
Recommendation G.100].

Quality of Service Guidelines - is a set of standards and measures that
define applicable quality measures.

Registration Success Rate (RSR) - is the ratio of the number of successfully
established terminating sessions to the number of attempted terminating
sessions. [ETSI TR 103 219].

Service Availability (SA) - is the percentage of time the network shall be
available to the subscribers.

SMS Delivery Success Rate - is the percentage of sent messages that are
received by the intended recipient(s).

SMS End to End Delivery Time - is the duration from when an SMS is sent
to the time of receiving the SMS by the intended recipient(s).

SMS Service Accessibility - is the probability that a user can access the
SMS centre for sending SMS.

Throughput - is the speed of uploading and downloading data in Megabits
per second between the end user and the test equipment.

Web Radio Reproduction Cut-off Ratio - is the percentage that a subscriber
cannot successfully complete stream reproduction from a given web radio
station for a given period.
Web Radio Tune-in Success Rate - is the percentage that a subscriber can
obtain the tune-in information for a web radio streaming server successfully.

Web Radio Tune-in Success Time – is the duration needed to obtain the
tune-in information for a web radio streaming server successfully.
A. TECHNICAL PARAMETERS

FIXED VOICE SERVICES

The following parameters, as defined under Definitions, are applicable to
fixed voice services:

Call Setup Success Rate (CSSR)

Call Setup Time (CST)

Drop Call Ratio (DCR)

Network Availability (NA)

MOBILE VOICE SERVICES

The following parameters, as defined under Definitions, are applicable to
mobile services:

Mean Opinion Score (MOS)

Call Setup Time (CST)

Call Set-up Success Rate (CSSR)

Drop Call Ratio (DCR)

Handover Success Rate (HSR)

Mobile Coverage Strength (MCS)

Network Availability (NA)

SMS Delivery Success Rate

SMS End to End Delivery Time

SMS Service Accessibility


Network Efficiency Ratio (NER)

Post Dialing Delay (PDD)

Registration Success Rate (RSR)

Service Availability (SA)

FIXED INTERNET SERVICES (WIRED AND WIRELESS)

The following parameters, as defined under Definitions, are applicable to
fixed internet services, both wired and wireless. Copper and fiber
connections are considered fixed connections:

Call Setup Success Rate (CSSR)

Call Session Drop Rate

Call Setup Time

DNS Resolution Success Rate

DNS Resolution Time

Data Transmission Rate

Access Network Utilization

Throughput

Latency

Packet Loss

MOBILE INTERNET SERVICES

The following parameters, as defined under Definitions, are applicable to
mobile internet services. Where applicable, different KPI targets will be set
for different mobile technologies, as different technologies have different
capabilities:

Call Setup Success Rate (CSSR)

Call Session Drop Rate

Call Setup Time

FTP Drop Rate

FTP Mean Data Rate [Mbit/s]

FTP Set-up Time

HTTP Drop Rate

HTTP Mean Data Rate

HTTP Set-up Time

Average user Throughput (Download and Upload)

Access Network Utilization

Latency

WEB RADIO STREAMING SERVICE

The following parameters, as defined under Definitions, are applicable to
web radio streaming services:

Web Radio Tune-in Success Rate

Web Radio Tune-in Success Time

Web Radio Reproduction Cut-off Ratio

INTERCONNECTION

The following parameters, as defined under Definitions, are applicable to
interconnection:

Interconnection Route Utilization

Point of Interconnection Congestion

B. NON-TECHNICAL PARAMETERS

The following parameters, as defined under Definitions, are applicable to
non-technical services:

Service Availability

Provision of Service

Call Centre Operator Response

Mean Time To Repair (MTTR)

Billing Complaint Rate

Billing Accuracy

Complaint Resolution Time (Technical complaints)

PART III

COMMUNICATION SERVICE LICENSEE OBLIGATIONS

The ICT service providers shall:

(a) provide communication services that meet the quality of service
parameters set forth in these Guidelines;

(b) support the intervention of the regulator by allowing access to the
network for purposes of collecting network performance data
when requested;

(c) continuously measure network performance and keep records of
the results of the measurements as per Part V; and

(d) report the same as per Part V of these Guidelines.

PART IV

COMPLIANCE

Section 6(2)(a) and (c) of the CRA Act mandates the Authority to:

“(a) protect and promote the interests of consumers, purchasers and other
users of the services in the regulated sectors, particularly in respect of the
prices charged for, and the availability, quality and variety of services and
products, and where appropriate, the variety of services and products offered
throughout Botswana, such as will satisfy all reasonable demands for
those services and products;

and

(c) monitor the performance of the regulated sectors in relation to levels of
investment, availability, quantity, quality and standards of services,
competition, pricing, the costs of services, the efficiency of production and
distribution of services and any other matters decided upon by the
Authority;”.

Service Level Agreements

The Service Providers shall establish well-defined Service Level Agreements
(SLAs) with consumers to ensure end-to-end QoS.

In general, the SLA shall state, among other things, the following:

 Level of performance: the minimum level of service performance
offered to the customer, not the average level to be achieved for all
customers.
 The compensation: if the minimum service level is not achieved, the
compensation should be at least commensurate with the degree of
failure; and
 The mechanism for claiming compensation: this should be done
automatically, without requiring the customer to file a claim.
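The compensation principle in the bullets above can be illustrated with a minimal sketch. The pro-rata formula below is a hypothetical example of a "commensurate" rebate, not a formula prescribed by these Guidelines, and all names are invented here:

```python
def sla_compensation(monthly_fee, promised_availability_pct, actual_availability_pct):
    """Hypothetical pro-rata rebate: the payout scales with the availability
    shortfall, so a larger failure yields a larger (commensurate) compensation."""
    shortfall = max(0.0, promised_availability_pct - actual_availability_pct)
    return monthly_fee * shortfall / promised_availability_pct
```

For instance, a customer paying a P100 monthly fee on a 99% availability promise who actually received 95.04% availability would be rebated P4 under this sketch; a customer whose service met the promise receives nothing.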

Monitoring

The Authority shall:

(a) carry out network monitoring and validate the data against network
performance data from the operators;
(b) monitor adherence to Quality-of-Service measurement
procedures; and
(c) direct its officers or agents (third party) to carry out investigations
on Quality-of-Service measurements.

Inspection

The Authority or any person authorized in writing by the Authority may, upon
furnishing reasonable notice, enter upon the premises of the licensee and
request access to the network management servers for purposes of
ascertaining compliance with these Guidelines.

Enforcement

The Authority shall take appropriate measures to enforce these Guidelines
in conjunction with penalties as stipulated in the BOCRA Penalty Framework.

PART V

REPORTING

Reporting

The Licensee shall submit Quality of Service reports as may be required by
the Authority from time to time, and shall submit to BOCRA network
performance raw data for the purposes of analysis and report generation.

Record Keeping

All Licensed Service Providers shall:

(a) maintain documented processes of data collection as per the KPI
formulas for each parameter contained in these Guidelines and
submit the same to the Authority as required from time to time.

(b) complete and maintain accurate records of its compliance for each
QoS parameter specified in such a manner and in such a format,
as may be prescribed by the Authority from time to time.

(c) The Authority may, from time to time, either by order or by
direction, specify uniform record keeping procedures and formats,
including guidelines on measurement methodology for various
QoS parameters; and

(d) The Authority may, if it considers it expedient to do so, at any time,
direct any of its officers or employees or an agency appointed by
the Authority to inspect the records or to get such records audited.

Auditing

The Authority shall:

(a) audit some or all of the Quality-of-Service data;
(b) opt to use a third party to perform audits on behalf of the Authority;
(c) audit Quality of Experience based on Customer Satisfaction
Surveys undertaken by the Authority; and
(d) vary the frequency of the audits, and the reporting areas and
reporting periods that require auditing.

Publication

The Authority shall publish, in print media, on the Authority's website or on
any applicable digital platform, the Quality of Service / Quality of Experience
Report as may be decided by the Authority, as follows:

(a) the compliance reports for each Quality of Service and Quality of
Experience parameter reported/submitted to it by the service
providers under these Guidelines;
(b) the results of the audit and assessment of the Quality of Service
and Quality of Experience undertaken by the Authority or its
authorized agent;
(c) the Authority may request licensees to publish Quality of Service
and Quality of Experience parameter information on their
websites or any digital platforms; and
(d) if so requested in terms of (c) above, Operators shall publish on
their websites, or any applicable digital platform, a Coverage Map
showing their network coverage and network availability.

PART VI

SERVICE INTERRUPTIONS

Notwithstanding that no subscriber should experience a service interruption
without being accorded a prompt response, interruptions affecting at least
5% of the operator's subscriber base, or affecting site(s) and disconnecting
communities from the network, MUST be considered critical and reported as
prescribed below.
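The 5% criticality threshold above can be expressed as a simple check. This sketch is illustrative only; the function and argument names are invented here:

```python
def is_critical_interruption(affected_subscribers, subscriber_base,
                             community_disconnected=False):
    """An interruption is critical if it affects at least 5% of the operator's
    subscriber base, or if it disconnects a community from the network."""
    return community_disconnected or affected_subscribers >= 0.05 * subscriber_base
```

For example, an outage affecting 6,000 of 100,000 subscribers (6%) is critical, while one affecting 1,000 (1%) is not, unless it cuts a community off from the network.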
Planned Service Interruptions

Licensees shall:

(a) issue public notices in advance of any planned interruption of
services by publishing such notice in widely read electronic or
print media at least 48 hours before the planned interruption of
service, and send the notice through the Short Messaging System
(SMS) or any applicable digital platform;
(b) issue public notices stating the number and type of subscribers that
will be affected by the planned interruptions; and
(c) provide the information for such service interruptions to the
Authority at least 48 hours before the planned interruption of
service.
Unplanned Service Interruptions

In the event of any unplanned service interruption a licensee shall:

(a) within an hour of the event, notify the Authority via email of the
occurrence of the event, including details of the areas affected and
the numbers of end users affected, where possible;
(b) continue to provide updates to the Authority via email every hour,
detailing progress in resolving the issue;
(c) within 24 hours of resolution of the issue, provide the Authority with
a formal report detailing the circumstances that led to the
interruption of the service and the action taken to remedy the
situation; and
(d) send the public notice through the Short Messaging System (SMS)
or through any available digital platform.

PART VII

REVIEW

The Authority may review the Quality of Service and Quality of Experience
targets and parameters under these guidelines as and when required.

PART VIII

IMPLEMENTATION

These Guidelines shall come into effect on the 1st of September 2022.

SCHEDULES

The licensees providing the services above shall be required to meet the
targets for the Quality-of-Service parameters specified in Schedules 1 to 7
of these Guidelines.

SCHEDULE 1

QUALITY OF SERVICE PUBLIC SWITCHED TELEPHONE SERVICES

TABLE 1: QUALITY OF SERVICE PUBLIC SWITCHED TELEPHONE SERVICES

Parameter: Network Availability
Formula: Network Availability = [(Total operational minutes - Total minutes of service downtime) / Total operational minutes] * 100%
Measurement Mechanism (standards): Test traffic. ETSI EG 202 057-3
Measurement Tool: Performance Monitoring System; Test Stations
Target: >99%

Parameter: Call Set-up Time
Formula: Call Set-up Time = Time of Call Alerting - Time of receiving Dial Tone
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: <3 sec (local call); <5 sec (toll)

Parameter: Call Set-up Success Rate
Formula: Call Set-up Success Rate = (Total number of successfully connected calls / Total number of attempts) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: <2%

Parameter: Drop Call Ratio
Formula: Drop Call Ratio = (Number of calls disconnected without intervention by any user / Number of calls connected to intended recipient) * 100%
Measurement Mechanism: Real traffic from OSS and/or test traffic. ETSI ES 202 765-2, clause 7.4
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≤ 2%

Parameter: Voice Quality (Mean Opinion Score)
Formula: Mean Opinion Score is expressed as one number from 1 to 5, 1 being the worst and 5 being the best. Recommendations ITU-T P.800, ITU-T P.862 and ITU-T P.863
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥ 3.5
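The counter-based formulas in Table 1 reduce to simple ratios. The following sketch is illustrative only; the function and variable names are invented here and are not prescribed by the Guidelines:

```python
def network_availability(total_operational_minutes, downtime_minutes):
    """Network Availability = (operational - downtime) / operational * 100%."""
    return 100.0 * (total_operational_minutes - downtime_minutes) / total_operational_minutes

def call_setup_success_rate(connected_calls, call_attempts):
    """CSSR = successfully connected calls / total call attempts * 100%."""
    return 100.0 * connected_calls / call_attempts

def drop_call_ratio(dropped_calls, connected_calls):
    """DCR = calls dropped without user intervention / calls connected * 100%."""
    return 100.0 * dropped_calls / connected_calls
```

For example, a 30-day month (43,200 operational minutes) with 300 minutes of downtime yields an availability of about 99.31%, which meets the >99% target; 15 dropped calls out of 1,000 connected give a DCR of 1.5%, within the ≤2% target.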

SCHEDULE 2

QUALITY OF SERVICE PARAMETERS FOR MOBILE SERVICES

TABLE 2: QUALITY OF SERVICE PARAMETERS FOR MOBILE SERVICES

Parameter: Network Availability
Formula: Network Availability = [(Total operational minutes - Total minutes of service downtime) / Total operational minutes] * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations; Crowdsourcing systems
Target: >99% for class 1 locations; >98% for class 2 locations; >97% for class 3 locations

Parameter: Call Set-up Time
Formula: Call Set-up Time = Time of Call Alerting - Time of receiving Dial Tone
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: <5 sec for GSM; <4 sec for UMTS (intra-network normal traffic); <8 sec (mobile to fixed, normal traffic)

Parameter: Drop Call Rate
Formula: Drop Call Ratio = (Number of calls disconnected without intervention by any user / Number of calls connected to intended recipient) * 100%
Measurement Mechanism: Real traffic from OSS and/or test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≤ 2%

Parameter: Call Set-up Success Rate
Formula: Call Set-up Success Rate = (Total number of successfully connected calls / Total number of attempts) * 100%
Measurement Mechanism: Real traffic from OSS and/or test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥98% for all calls

Parameter: Handover Success Rate
Formula: Handover Success Rate = (Total number of successful handovers / Total number of handover requests) * 100%
Measurement Mechanism: Real traffic from OSS and/or test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥96%

Parameter: Mobile Service Coverage Signal Strength
Formula: Mobile service coverage signal strength = field strength measurements
Measurement Mechanism: Field strength measurements
Measurement Tool: Test Stations
Target: GSM: ≥ -85 dBm (in-vehicle), ≥ -95 dBm (outdoors); UMTS: ≥ -90 dBm (in-vehicle), ≥ -100 dBm (outdoors)

Parameter: SMS Delivery Success Rate
Formula: SMS Delivery Success Rate = (Number of SMS received by intended recipients / Number of SMS sent) * 100%
Measurement Mechanism: Real traffic from OSS and/or test traffic
Measurement Tool: System / Test Stations
Target: 97% (excluding bulk SMS services)

Parameter: SMS End to End Delivery Time
Formula: SMS End to End Delivery Time = Time SMS received - Time SMS sent
Measurement Mechanism: Test traffic
Measurement Tool: Test Stations
Target: SMS should be delivered in less than 5 seconds (excluding bulk SMS services)

Parameter: SMS Service Accessibility
Formula: SMS Service Accessibility = (Successful accesses to SMS centre / Total number of SMS attempts) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥ 97%

Parameter: Mean Opinion Score (MOS)
Formula: Mean Opinion Score is expressed as one number from 1 to 5, 1 being the worst and 5 being the best.
Measurement Mechanism: Test traffic
Measurement Tool: Test Stations
Target: ≥3.0
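The coverage signal-strength targets in Table 2 can be read as minimum received levels in dBm, where a less negative value is stronger (-85 dBm is stronger than -95 dBm). A sketch of the compliance check follows; the dictionary keys and function name are invented labels, not terms from the Guidelines:

```python
# Schedule 2 minimum signal-strength targets in dBm (less negative = stronger).
COVERAGE_TARGETS_DBM = {
    ("GSM", "in-vehicle"): -85,
    ("GSM", "outdoor"): -95,
    ("UMTS", "in-vehicle"): -90,
    ("UMTS", "outdoor"): -100,
}

def meets_coverage_target(technology, environment, measured_dbm):
    """A measurement complies when it is at least the minimum target level."""
    return measured_dbm >= COVERAGE_TARGETS_DBM[(technology, environment)]
```

For example, a GSM outdoor reading of -90 dBm complies (the target is -95 dBm), while a UMTS in-vehicle reading of -95 dBm does not (the target is -90 dBm).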

SCHEDULE 3

QUALITY OF SERVICE PARAMETERS FOR FIXED INTERNET SERVICES

TABLE 3: QUALITY OF SERVICE PARAMETERS FOR FIXED INTERNET SERVICES


Parameter: DNS Host Name Resolution Time
Formula: DNS Host Name Resolution Time = Time standard query response received - Time standard query sent [ETSI TS 102 250-2 & ITU-T Y.1540]
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: < 10 ms

Parameter: DNS Host Name Resolution Success Rate
Formula: DNS Host Name Resolution Success Rate = (Successful DNS host name resolution requests / Total DNS host name resolution requests) * 100% [ETSI TS 102 250-2]
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥ 99%

Parameter: Data Transmission Rate
Formula: Data Transmission Rate = Size of test file / Transmission time required for a complete and error-free transmission
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: At least 75% of the advertised speed during peak time

Parameter: Access Network Utilization
Formula: Access Network Utilization = Total traffic at the access node / Aggregation of traffic at the node
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: Uplink utilization must not exceed 75% of the uplink bandwidth provided

Parameter: Throughput
Formula: Throughput = (Number of test samples greater than or equal to the QoS throughput / Total number of test samples) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: Throughput must not be less than:
a) 75% of the subscribed level of bandwidth for 90% of the time, for contended fixed connections (xDSL);
b) 90% of the subscribed level of bandwidth for 90% of the time, for contended fixed connections (fibre);
c) 95% of the subscribed bandwidth for 100% of the time, for dedicated services (all).

Parameter: Latency
Formula: Latency = (Number of test samples less than or equal to 85 ms / Total number of test samples) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≤ 85 ms for 95% of the time

Parameter: Packet Loss
Formula: Packet Loss = (Total number of packets lost / Total number of packets sent) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≤ 1%
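The Throughput target combines a bandwidth threshold with a time percentage. As an illustrative sketch (not part of the Guidelines), the check for a contended fibre connection (at least 90% of subscribed bandwidth for 90% of the time) can be evaluated over a set of speed-test samples; the sample values below are invented.

```python
def throughput_compliance(samples_mbps, subscribed_mbps, fraction=0.90):
    """Share of test samples at or above the required fraction of the
    subscribed bandwidth, per the Throughput formula above."""
    required = fraction * subscribed_mbps
    passing = sum(1 for s in samples_mbps if s >= required)
    return 100.0 * passing / len(samples_mbps)


# Hypothetical speed-test samples (Mbps) on a 100 Mbps fibre plan.
samples = [92, 95, 88, 97, 91, 94, 96, 90, 93, 89]
pct = throughput_compliance(samples, subscribed_mbps=100)
print(f"{pct:.0f}% of samples met 90 Mbps (target: 90% of the time)")
```

The connection complies when the returned percentage is at least 90.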

SCHEDULE 4

QUALITY OF SERVICE PARAMETERS FOR MOBILE INTERNET SERVICES


TABLE 4: QUALITY OF SERVICE PARAMETERS FOR MOBILE INTERNET SERVICES
Parameter: Call Session Set-up Time
Formula: Call Session Set-up Time = Time content received - Time content requested
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: 100% within 5 seconds

Parameter: Call Session Success Rate (CSSR)
Formula: As defined in ETSI TS 102 250-2
Measurement Tool: Performance Monitoring System; Test Stations
Target: > 98%

Parameter: Call Session Drop Rate
Formula: Call Session Drop Rate = (Number of incomplete data transfers / Number of transfers started successfully) * 100%
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: < 2%; < 1% for 5G

Parameter: Call Session Average User Throughput
Formula: Mean data rate = User data transferred [kbit] / (Time data transfer completed - Time data transfer started)
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: UMTS: > 5 Mbps. LTE: > 15 Mbps (2022-2023), > 25 Mbps (2023-2024), > 35 Mbps (2024+). 5G: > 100 Mbps (2022/23), > 150 Mbps (2023/24), > 200 Mbps (2024+)

Parameter: FTP {download | upload} Set-up Time
Formula: FTP {download | upload} Set-up Time = Time service access successful - Time service access start
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: < 2 seconds

Parameter: FTP Drop Rate
Formula: FTP Drop Rate = (Number of incomplete data transfers / Number of transfers started successfully) * 100%
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: < 1%

Parameter: FTP {download | upload} Mean Data Rate [Mbit/s]
Formula: FTP {download | upload} Mean Data Rate [Mbit/s] = User data transferred [Mbit] / (Time data transfer completed - Time data transfer started)
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: UMTS: 2 Mbps; LTE: 10 Mbps; 5G: 75 Mbps

Parameter: FTP {download | upload} Data Transfer Success Ratio [%]
Formula: FTP {download | upload} Data Transfer Success Ratio = (Completed data transfers / Successfully started data transfers) * 100%
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: 95%

Parameter: Web Radio Tune-in Success Rate
Formula: Web Radio Tune-in Success Rate = (Number of successful tune-ins / Total attempts) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Test Stations
Target: > 98%

Parameter: Web Radio Tune-in Success Time
Formula: Web Radio Tune-in Success Time = Time of successful tune-in - Time of tune-in attempt
Measurement Mechanism: Test traffic
Measurement Tool: Test Stations
Target: < 2 seconds

Parameter: Web Radio Reproduction Cut-off Ratio
Formula: Web Radio Reproduction Cut-off Ratio = (Number of unsuccessful listening attempts / Total attempts) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Test Stations
Target: < 2%

Parameter: Data Packet Latency
Measurement Mechanism: Test traffic; Crowdsourcing
Measurement Tool: Performance Monitoring System; Test Stations
Target: < 100 ms for local IXP; < 300 ms for international IXP. 5G: < 50 ms for local IXP; < 100 ms for international IXP
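The mean user throughput formulas above divide the volume transferred by the transfer duration. As an illustrative sketch (not part of the Guidelines), the calculation below converts a hypothetical transfer into Mbit/s and compares it with the LTE 2024+ target; the figures are invented.

```python
def mean_user_throughput_mbps(bytes_transferred: int,
                              start_s: float, end_s: float) -> float:
    """Mean data rate = user data transferred / (time transfer completed -
    time transfer started), expressed in Mbit/s."""
    bits = bytes_transferred * 8
    return bits / (end_s - start_s) / 1_000_000


# Hypothetical session: 50 MB transferred in 10 seconds.
rate = mean_user_throughput_mbps(bytes_transferred=50_000_000,
                                 start_s=0.0, end_s=10.0)
print(f"{rate:.1f} Mbps (LTE 2024+ target: > 35 Mbps)")
```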

SCHEDULE 5

QUALITY OF SERVICE FOR VoLTE SERVICES

TABLE 5: QUALITY OF SERVICE FOR VoLTE SERVICES


Parameter: Registration Success Rate
Formula: Registration Success Rate = (Successful registration attempts / Total number of registration attempts) * 100% (ETSI TR 103 219)
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥ 98%

Parameter: Post Dialing Delay (PDD)
Formula: Post Dialing Delay (PDD) = Time of ringing tone - Time of dialing
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≤ 4 s

Parameter: Drop Call Rate
Formula: Drop Call Rate = (Total number of calls terminated unwillingly / Total number of successfully established calls) * 100% [ITU-T Recommendation E.804, Section 7.3.6.5]
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≤ 2%

Parameter: Network Efficiency Ratio
Formula: Network Efficiency Ratio = (Number of seizures resulting in an answer message, user busy, or no answer / Total number of seizure attempts) * 100%
Measurement Mechanism: Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥ 95%
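The Network Efficiency Ratio counts user-busy and no-answer outcomes as effective seizures alongside answered calls, since the network completed its part. The following illustrative sketch (not part of the Guidelines) applies the formula to invented counter values.

```python
def network_efficiency_ratio(answered: int, user_busy: int, no_answer: int,
                             total_seizures: int) -> float:
    """NER = (seizures resulting in answer message, user busy, or no answer /
    total number of seizure attempts) * 100%."""
    return 100.0 * (answered + user_busy + no_answer) / total_seizures


# Hypothetical VoLTE seizure counters.
ner = network_efficiency_ratio(answered=900, user_busy=40, no_answer=30,
                               total_seizures=1_000)
print(f"NER = {ner:.1f}% (target >= 95%)")
```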

SCHEDULE 6

QUALITY OF SERVICE FOR INTERCONNECTION


TABLE 6: QUALITY OF SERVICE FOR INTERCONNECTION
Parameter: Interconnection Route Utilization
Formula: Interconnection Route Utilization = (Capacity in use / Capacity provisioned) * 100%
Measurement Mechanism: Real traffic
Measurement Tool: Performance Monitoring System
Target: < 80%

Parameter: Point of Interconnection Congestion
Formula: Point of Interconnection Congestion = (Number of blocked call attempts / Total number of call attempts) * 100% (ITU-T Recommendation E.847-201703)
Measurement Mechanism: Real traffic from OSS and/or Test traffic
Measurement Tool: Performance Monitoring System; Test Stations
Target: < 0.5%
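As an illustrative sketch (not part of the Guidelines), the route utilization formula can be applied to invented capacity figures to flag routes approaching the 80% ceiling.

```python
def route_utilization(capacity_in_use_mbps: float,
                      capacity_provisioned_mbps: float) -> float:
    """Interconnection Route Utilization = (capacity in use /
    capacity provisioned) * 100%."""
    return 100.0 * capacity_in_use_mbps / capacity_provisioned_mbps


# Hypothetical interconnection route: 620 Mbps busy-hour load on a 1 Gbps link.
util = route_utilization(capacity_in_use_mbps=620,
                         capacity_provisioned_mbps=1_000)
print(f"Utilization = {util:.0f}% (target < 80%)")
```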

SCHEDULE 7

QUALITY OF EXPERIENCE (NON-TECHNICAL PARAMETERS)

TABLE 7: QUALITY OF EXPERIENCE (NON-TECHNICAL PARAMETERS)


Parameter: Service Availability
Formula: Service Availability = [(Total operational minutes - Total minutes of service downtime) / Total operational minutes] * 100%
Measurement Mechanism: Test traffic; Consumer satisfaction survey
Measurement Tool: Performance Monitoring System; Test Stations
Target: ≥ 98% (measured); > 90% (consumer satisfaction survey)

Parameter: Provision of Service
Formula: Provision of Service = Time the customer is provided with service - Time the customer pays for service
Measurement Mechanism: Complaints
Measurement Tool: Trouble ticket system
Target: 5 business days

Parameter: Call Centre Operator Response
Formula: Call Centre Operator Response = Time of operator assistance pick-up - Time of making operator request
Measurement Mechanism: Test traffic
Measurement Tool: Test Stations
Target: < 30 seconds

Parameter: Mean Time To Repair (MTTR)
Formula: Mean Time To Repair (MTTR) = Time service restored - Time fault reported
Measurement Mechanism: Complaints
Measurement Tool: Trouble ticket system
Target: Mobile: Class 1 locations: 2 hrs; Class 2 locations: 8 hrs; Class 3 locations: 24 hrs. Fixed: < 5 hours. Interconnection: < 1 hr
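The Service Availability formula above is a simple uptime ratio. As an illustrative sketch (not part of the Guidelines), the calculation below evaluates one 30-day month (43,200 operational minutes) with an invented downtime figure.

```python
def service_availability(total_minutes: int, downtime_minutes: int) -> float:
    """Service Availability = (total operational minutes - downtime minutes) /
    total operational minutes * 100%."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes


# Hypothetical month: 30 days * 24 h * 60 min, with 432 minutes of outage.
avail = service_availability(total_minutes=43_200, downtime_minutes=432)
print(f"Availability = {avail:.1f}% (target >= 98%)")
```

Note that the ≥ 98% target permits at most 864 minutes (about 14.4 hours) of downtime in such a month.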

ANNEXURE A

SITE CLASSIFICATIONS

The following classes describe the population size served, per reporting category:

Class 1: Serving a location/locality with population of more than 5,000

Class 2: Serving a location/locality with population between 750 and 5,000

Class 3: Serving a location/locality with population of less than 750
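These classes drive the mobile MTTR targets in Table 7. As an illustrative sketch (not part of the Guidelines), the mapping from population to site class can be expressed as:

```python
def site_class(population: int) -> int:
    """Annexure A site classification:
    Class 1: population of more than 5,000;
    Class 2: population between 750 and 5,000;
    Class 3: population of less than 750."""
    if population > 5_000:
        return 1
    if population >= 750:
        return 2
    return 3


# Example: a locality of 2,500 people falls in Class 2 (8-hour MTTR target).
print(site_class(2_500))
```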
