Draft QoS QoE-Guidelines (Rev1407)
TECHNOLOGIES
QUALITY OF SERVICE AND QUALITY OF EXPERIENCE GUIDELINES
Table of Contents
PART I ... 3
PRELIMINARY PROVISIONS ... 5
OBJECTIVES ... 5
PART II ... 6
DEFINITIONS ... 7
A. TECHNICAL PARAMETERS ... 10
B. NON-TECHNICAL PARAMETERS ... 14
PART V ... 17
REPORTING ... 17
Reporting ... 17
Record Keeping ... 17
Auditing ... 18
Publication ... 18
PART VI ... 18
SERVICE INTERRUPTIONS ... 19
Planned Service Interruptions ... 19
Unplanned Service Interruptions ... 19
QUALITY OF SERVICE PARAMETERS FOR FIXED INTERNET SERVICES ... 24
SCHEDULE 4 ... 27
QUALITY OF SERVICE PARAMETERS FOR MOBILE INTERNET SERVICES ... 27
SCHEDULE 5 ... 29
QUALITY OF SERVICE FOR VoLTE SERVICES ... 29
SCHEDULE 6 ... 30
QUALITY OF SERVICE FOR INTERCONNECTION ... 30
SCHEDULE 7 ... 31
QUALITY OF EXPERIENCE (NON-TECHNICAL PARAMETERS) ... 31
ANNEXURE A ... 32
SITE CLASSIFICATIONS ... 32
PART I - INTRODUCTION
EXPLANATORY NOTE
show the legal basis for the work of the Authority and its legal right
and/or power to enforce compliance with the set standards of Quality of
Service
5. Part II is dedicated to the abbreviations and definitions of terms used in
this document
6. Part III sets out the communications licensee obligations
7. Part IV details compliance with the Guidelines
8. Part V discusses how service providers should report, how they
should keep performance records, how auditing will be carried out,
and the requirement for publication of performance
9. Part VI gives information on reporting planned and unplanned
interruptions of service
10. Part VII gives information on how frequently these guidelines will
be reviewed.
11. Part VIII describes the implementation (coming-into-effect) date
12. Part IX discusses the different schedules for the different Key
Performance Indicators for fixed (voice and data) services, mobile
(voice and data) services, interconnection, Voice over LTE, and the
non-technical key performance indicators that capture quality of
experience from consumers.
These guidelines will be available for public comment until 30 June 2022.
All comments, questions and requests for clarification should be forwarded to:
PRELIMINARY PROVISIONS
In these Guidelines, unless the context requires otherwise, the Act means
the Communications Regulatory Authority Act (CRA Act) of 2012.
The Authority, in consultation with industry stakeholders, may from time to
time review the Guidelines to cater for the ever-changing landscape of the
telecommunications industry and to align with the National Broadband
Strategy.
OBJECTIVES
PART II
ABBREVIATIONS
DEFINITIONS
Billing Accuracy (BA) - is the requirement that the same call duration, in
seconds, as measured for a call be used for charging.
Call Set-up Success Rate (CSSR) - is the ratio of total number of successful
calls to the total number of all call attempts made on the network during a
specified period.
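Ratio-style definitions such as CSSR reduce to simple arithmetic on network counters. A minimal sketch (the function name and sample figures are illustrative, not drawn from the Guidelines):

```python
def call_setup_success_rate(successful_calls: int, call_attempts: int) -> float:
    """CSSR = (successful calls / all call attempts) * 100, per the definition above."""
    if call_attempts == 0:
        raise ValueError("no call attempts in the measurement period")
    return 100.0 * successful_calls / call_attempts

# Example: 9,812 successful calls out of 10,000 attempts -> 98.12%
print(call_setup_success_rate(9812, 10_000))
```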
Call Setup Time (CST) - is the duration from when a call is made to the time
of receiving a ring-back tone.
Complaint Resolution Time (CRT) - is the time taken for a service provider
to resolve a complaint.
DNS Resolver - also known as a resolver, is a server on the internet that is
responsible for converting domain names to IP addresses.
FTP - is the standard network protocol used for computer file transfer
between a client and a server.
FTP Drop Rate - is the percentage of incomplete data transfers that were
started successfully.
FTP Mean Data Rate [Mbit/s] - is the average data transfer rate measured
through the entire connect time to the service.
FTP Set-up Time - is the duration to access the service successfully, from
starting the dial-up connection to the point of time when the content is sent
or received.
HTTP - is the underlying protocol used by the World Wide Web that defines
how messages are formatted and transmitted, and what actions the web
server and browser should take in response to various commands.
HTTP Drop Rate – is the percentage of incomplete data transfers that were
started successfully.
HTTP Mean Data Rate - is the average data transfer rate measured through
the entire connection time to the service.
HTTP Set-up Time – is the duration between the instant when the request
of the web page is sent to the instant when the beginning of the web page is
received.
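The HTTP and FTP metrics above are all derived from a few per-session timestamps and byte counts. A hedged sketch of the arithmetic (field and function names are illustrative):

```python
def http_setup_time(t_request_sent: float, t_first_byte: float) -> float:
    # HTTP Set-up Time: instant the page request is sent to the instant
    # the beginning of the page is received (seconds).
    return t_first_byte - t_request_sent

def mean_data_rate_mbit_s(bytes_transferred: int, t_start: float, t_complete: float) -> float:
    # Mean data rate over the entire connection time, in Mbit/s.
    return (bytes_transferred * 8 / 1e6) / (t_complete - t_start)

# Example: 25 MB transferred in 10 s -> 20 Mbit/s
print(mean_data_rate_mbit_s(25_000_000, 0.0, 10.0))
```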
Interconnection Route Utilization (IRU) – is the percentage of provisioned
interconnection route(s) that carry traffic.
Network Efficiency Ratio (NER) - is the ability of the network to deliver calls
to the furthest terminal. It expresses the relationship between the number of
seizures and the number of seizures resulting in either an answer
message, a user busy or a no answer ring. [Recommendation ITU-T
E.425]
Packet Loss - is the percentage of data packets transmitted from the source
that fail to arrive at their destination.
Post Dialing Delay (PDD) - is the time interval in seconds between the end
of dialing by the caller and the reception of the network response. It is
equivalent to Call Setup Time, as defined in [Recommendation ITU-T E.800].
Quality of Service (QoS) - is the statement of the level of quality of the
service as offered to the consumer by a service provider. [ITU-T
Recommendation G.100].
SMS Delivery Success Rate - is the percentage of sent messages that are
received by the intended recipient(s).
SMS End to End Delivery Time - is the duration from when an SMS is sent
to the time of receiving the SMS by the intended recipient(s).
SMS Service Accessibility - is the probability that a user can access the
SMS centre for sending SMS.
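The SMS metrics defined above combine a delivery count with send/receive timestamps. A minimal sketch (helper names and sample values are illustrative):

```python
from datetime import datetime, timedelta

def sms_delivery_success_rate(received: int, sent: int) -> float:
    # Percentage of sent messages received by the intended recipient(s).
    return 100.0 * received / sent

def sms_end_to_end_delivery_time(t_sent: datetime, t_received: datetime) -> timedelta:
    # Duration from sending an SMS to its receipt by the intended recipient.
    return t_received - t_sent

sent = datetime(2022, 9, 1, 12, 0, 0)
recv = datetime(2022, 9, 1, 12, 0, 3)
print(sms_delivery_success_rate(970, 1000))      # 97.0
print(sms_end_to_end_delivery_time(sent, recv))  # 0:00:03
```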
Web Radio Tune-in Success Time - is the duration needed to successfully
obtain the tune-in information from a web radio streaming server.
A. TECHNICAL PARAMETERS
The following parameters, as defined under Definitions, are applicable to fixed
services: -
Registration Success Rate (RSR)
Throughput
Latency
Packet Loss
Call Session Drop Rate
Latency
INTERCONNECTION
Interconnection Route Utilization
B. NON-TECHNICAL PARAMETERS
Service Availability
Provision of Service
Billing Accuracy
PART III
PART IV
COMPLIANCE
Section 6(2)(a) and (c) of the CRA Act mandates the Authority to: -
“(a) protect and promote the interests of consumers, purchasers and other
users of the services in the regulated sectors, particularly in respect of the
prices charged for, and the availability, quality and variety of services and
products, and where appropriate, the variety of services and products offered
throughout Botswana, such as will satisfy all reasonable demands for
those services and products;
and
competition, pricing, the costs of services, the efficiency of production and
distribution of services and any other matters decided upon by the
Authority;”.
Monitoring
(a) carry out network monitoring and validate the data against network
performance data from the operators;
(b) monitor the adherence to Quality-of-Service measurements
procedures; and
(c) direct its officers or agents (third party) to carry out investigations
on Quality-of-Service measurements.
Inspection
The Authority or any person authorized in writing by the Authority may, upon
furnishing reasonable notice, enter upon the premises of the licensee and
request access to the network management servers for purposes of
ascertaining compliance with these Guidelines.
Enforcement
PART V
REPORTING
Reporting
Licensees shall submit to BOCRA raw network performance data for the
purposes of analysis and report generation.
Record Keeping
(b) complete and maintain accurate records of its compliance for each
QoS parameter specified in such a manner and in such a format,
as may be prescribed by the Authority from time to time.
(d) The Authority may, if it considers it expedient to do so, at any time,
direct any of its officers or employees or an agency appointed by
the Authority to inspect the records or to get such records audited.
Auditing
Publication
PART VI
SERVICE INTERRUPTIONS
Licensees shall:
(a) within an hour of the event, notify the Authority via email of the
occurrence of the event, including details on areas affected and
the number of end users affected where possible;
(b) continue to provide updates to the Authority via email every hour,
detailing progress in resolving the issue;
(c) within 24 hours of resolution of the issue, provide the Authority with
a formal report detailing the circumstances contributing to the
interruption of the service, and the action taken to remedy the
situation; and
(d) send the public notice through the Short Message Service (SMS)
or through any available digital platform.
PART VII
REVIEW
The Authority may review the Quality of Service and Quality of Experience
targets and parameters under these guidelines as and when required.
PART VIII
IMPLEMENTATION
These Guidelines shall come into effect on the 1st of September 2022.
SCHEDULES
SCHEDULE 1
Call Set-up Time
  Formula: Call Set-up Time = Time of Call Alerting - Time of receiving Dial tone
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <3 sec (local call); <5 sec (toll)

Call Set-up Success Rate
  Formula: (Total number of successfully connected calls / Total number of attempts) * 100%
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <2%

Drop Call Ratio
  Formula: (Number of calls disconnected without intervention by any user / Number of calls connected to intended recipient) * 100% [ETSI ES 202 765-2, clause 7.4]
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <= 2%

Voice Quality
  Formula: Mean Opinion Score, expressed as one number from 1 to 5, 1 being the worst and 5 the best [Recommendation ITU-T P.800, ITU-T P.862 and ITU-T P.863.1]
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 3.5
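The Schedule 1 Drop Call Ratio formula and its target can be checked with a few lines of arithmetic. A minimal sketch (function name and counts are illustrative, not from the Guidelines):

```python
def drop_call_ratio(dropped: int, connected: int) -> float:
    # Drop Call Ratio = (calls disconnected without user intervention
    #                    / calls connected to intended recipient) * 100%
    return 100.0 * dropped / connected

# Schedule 1 target: <= 2%
ratio = drop_call_ratio(150, 10_000)   # 1.5%
print(ratio, ratio <= 2.0)
```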
SCHEDULE 2
Network Availability
  Formula: [(Total operational minutes - Total minutes of service downtime) / Total operational minutes] * 100%
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations; Crowdsourcing systems
  Target: >99% for class 1 locations; >98% for class 2 locations; >97% for class 3 locations

Call Set-up Time
  Formula: Time of Call Alerting - Time of receiving Dial tone
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <5 sec for GSM; <4 sec for UMTS (intra-network, normal traffic); <8 sec (mobile to fixed, normal traffic)

Drop Call Rate
  Formula: (Number of calls disconnected without intervention by any user / Number of calls connected to intended recipient) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <= 2%

Call Set-up Success Rate
  Formula: (Total number of successfully connected calls / Total number of attempts) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 98% for all calls

Handover Success Rate
  Formula: (Total number of successful handovers / Total number of handover requests) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 96%

Mobile Service Coverage Signal Strength
  Formula: Field strength measurements
  Measurement tools: Test Stations
  Target: GSM: >= -85 dBm (in-vehicle), >= -95 dBm (outdoors); UMTS: >= -90 dBm (in-vehicle), >= -100 dBm (outdoors)

SMS Delivery Success Rate
  Formula: (Number of SMS received by intended recipients / Number of SMS sent) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: SMS system; Test Stations
  Target: 97% (excluding Bulk SMS services)

SMS End-to-End Delivery Time
  Formula: Time SMS received - Time SMS sent
  Measurement data: Test traffic
  Measurement tools: Test Stations
  Target: SMS should be delivered in less than 5 seconds (excluding Bulk SMS services)

SMS Service Accessibility
  Formula: (Successful accesses to SMS centre / Total number of SMS attempts) * 100%
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 97%

Mean Opinion Score (MOS)
  Formula: Mean Opinion Score, expressed as one number from 1 to 5, 1 being the worst and 5 the best
  Measurement data: Test traffic
  Measurement tools: Test Stations
  Target: >= 3.0
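The Network Availability formula in Schedule 2 maps directly onto monthly operational-minute counters, with targets varying by site class (see Annexure A). A minimal sketch (variable names and the downtime figure are illustrative):

```python
def network_availability(total_minutes: int, downtime_minutes: int) -> float:
    # [(total operational minutes - downtime minutes) / total operational minutes] * 100%
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

# Schedule 2 availability floors by site class
TARGETS = {1: 99.0, 2: 98.0, 3: 97.0}

def meets_target(site_class: int, availability_pct: float) -> bool:
    return availability_pct > TARGETS[site_class]

month = 30 * 24 * 60                       # 43,200 operational minutes
avail = network_availability(month, 300)   # 300 minutes of downtime
print(round(avail, 2), meets_target(1, avail))
```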
SCHEDULE 3
DNS Host Name Resolution Time
  Formula: Time standard query response received - Time standard query sent [ETSI TS 102 250-2 & ITU-T Y.1540]
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: < 10 ms

DNS Host Name Resolution Success Rate
  Formula: (Successful DNS host name resolution requests / Total DNS host name resolution requests) * 100% [ETSI TS 102 250-2]
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: > 99%

Data Transmission Rate
  Formula: Size of test file / Transmission time required for a complete and error-free transmission
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: At least 75% of the advertised speed during peak time

Access Network Utilization
  Formula: Total traffic between access node / Aggregation of traffic at the node
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Targets:
    a) uplink utilization must not be more than 75% of the uplink bandwidth provided;
    b) 90% of the subscribed level of bandwidth for 90% of the time for Contended Fixed Connections (Fiber);
    c) 95% of the subscribed bandwidth for 100% of the time for Dedicated services (ALL).

Latency
  Formula: (Number of test samples less than or equal to 85 ms / Total number of test samples) * 100%
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <= 85 ms for 95% of the time
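The latency target above is percentile-based: it counts the share of test samples at or under 85 ms. A minimal sketch of that computation (the sample list is illustrative):

```python
def latency_compliance(samples_ms: list, threshold_ms: float = 85.0) -> float:
    # (number of samples <= threshold / total samples) * 100%
    within = sum(1 for s in samples_ms if s <= threshold_ms)
    return 100.0 * within / len(samples_ms)

samples = [40, 60, 82, 85, 90, 120, 70, 55, 84, 86]
pct = latency_compliance(samples)   # 7 of 10 samples within 85 ms -> 70.0%
print(pct, pct >= 95.0)             # fails the "85 ms for 95% of the time" target
```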
SCHEDULE 4
Call Session Drop Rate
  Formula: (Number of incomplete data transfers / Number of transfers started successfully) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: < 2%; < 1% for 5G

Call Session Average User Throughput
  Formula: Mean data rate = User data transferred [kbit] / (Time data transfer completed - Time data transfer started)
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Targets:
    UMTS: > 5 Mbps
    LTE: > 15 Mbps (2022/23); > 25 Mbps (2023/24); > 35 Mbps (2024+)
    5G: > 100 Mbps (2022/23); > 150 Mbps (2023/24); > 200 Mbps (2024+)

FTP {download | upload} Set-up Time
  Formula: Time service access successful - Time service access start
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: < 2 seconds

FTP Drop Rate
  Formula: (Number of incomplete data transfers / Number of transfers started successfully) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: < 1%

FTP {download | upload} Mean Data Rate [Mbit/s]
  Formula: User data transferred [Mbit] / (Time data transfer completed - Time data transfer started)
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: UMTS: 2 Mbps; LTE: 10 Mbps; 5G: 75 Mbps

FTP {download | upload} Data Transfer Success Ratio [%]
  Formula: (Completed data transfers / Successfully started data transfers) * 100%
  Measurement data: Real traffic from OSS and/or test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: 95%

Web Radio Tune-in Success Rate
  Formula: (Number of successful tune-ins / Total attempts) * 100%
  Measurement data: Test traffic
  Measurement tools: Test Stations
  Target: > 98%

Web Radio Tune-in Success Time
  Formula: Time of successful tune-in - Time of tune-in attempt
  Measurement data: Test traffic
  Measurement tools: Test Stations
  Target: < 2 seconds

Web Radio Reproduction Cut-off Ratio
  Formula: (Number of unsuccessful listening attempts / Total attempts) * 100%
  Measurement data: Test traffic
  Measurement tools: Test Stations
  Target: < 2%

Data Packets Latency
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations; Crowdsourcing
  Target: < 100 ms for local IXP; < 300 ms for international IXP
  5G: < 50 ms for local IXP; < 100 ms for international IXP
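The Schedule 4 throughput floors step up by technology and compliance year. A small lookup table makes the per-period check mechanical; this is a sketch transcribed from the table above (the Guidelines give UMTS a single > 5 Mbps floor, so repeating it per period is an assumption for uniformity):

```python
# Average-throughput floors (Mbit/s) by technology and period,
# transcribed from the Schedule 4 table (UMTS repeated per period by assumption).
THROUGHPUT_FLOORS = {
    "UMTS": {"2022/23": 5,   "2023/24": 5,   "2024+": 5},
    "LTE":  {"2022/23": 15,  "2023/24": 25,  "2024+": 35},
    "5G":   {"2022/23": 100, "2023/24": 150, "2024+": 200},
}

def throughput_compliant(tech: str, period: str, measured_mbps: float) -> bool:
    # Targets are strict lower bounds (">"), per the table.
    return measured_mbps > THROUGHPUT_FLOORS[tech][period]

print(throughput_compliant("LTE", "2023/24", 30.0))  # True: 30 > 25
print(throughput_compliant("5G", "2022/23", 80.0))   # False: 80 <= 100
```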
SCHEDULE 5
Registration Success Rate
  Formula: (Successful registration attempts / Total number of registration attempts) * 100 [ETSI TR 103 219]
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 98%

Post Dialing Delay (PDD)
  Formula: Time of ringing tone - Time of dialing
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <= 4 s

Drop Call Rate
  Formula: (Total number of calls terminated unwillingly / Total number of successfully established calls) * 100% [ITU-T Recommendation E.804, Section 7.3.6.5]
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: <= 2%

Network Efficiency Ratio
  Formula: (Number of seizures resulting in answer message, user busy or no answer / Total number of seizure attempts) * 100%
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 95%
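The Network Efficiency Ratio formula sums three seizure outcomes (answer, user busy, no answer) over total seizure attempts. A minimal sketch (counter names and figures are illustrative):

```python
def network_efficiency_ratio(answered: int, user_busy: int, no_answer: int,
                             total_seizures: int) -> float:
    # NER = seizures resulting in answer, user-busy or no-answer ring,
    # over total seizure attempts (ITU-T E.425), as a percentage.
    return 100.0 * (answered + user_busy + no_answer) / total_seizures

ner = network_efficiency_ratio(9200, 300, 150, 10_000)  # 96.5%
print(ner, ner >= 95.0)   # meets the Schedule 5 target of >= 95%
```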
SCHEDULE 6
SCHEDULE 7
Service Availability
  Formula: [(Total operational minutes - Total minutes of service downtime) / Total operational minutes] * 100%
  Measurement data: Test traffic
  Measurement tools: Performance Monitoring System; Test Stations
  Target: >= 98%

Consumer Satisfaction
  Measurement: Consumer satisfaction survey
  Target: > 90%

Provision of Service
  Definition: Time from when the customer pays for the service to when the customer is provided with the service
  Measurement data: Complaints
  Measurement tools: Trouble ticket system
  Target: 5 business days

Call Centre Operator Response
  Formula: Time of operator assistance pick-up - Time of making the operator request
  Measurement data: Test traffic
  Measurement tools: Test Stations
  Target: < 30 seconds

Mean Time To Repair (MTTR)
  Formula: Time service restored - Time reported
  Measurement data: Complaints
  Measurement tools: Trouble ticket system
  Targets:
    Mobile: Class 1 locations: 2 hrs; Class 2 locations: 8 hrs; Class 3 locations: 24 hrs
    Fixed: < 5 hours
    Interconnection: < 1 hr
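The MTTR formula averages (time service restored - time reported) over trouble tickets. A minimal sketch using ticket timestamps (the ticket data is illustrative):

```python
from datetime import datetime

def mean_time_to_repair(tickets: list) -> float:
    # MTTR = average of (time service restored - time reported), in hours.
    total = sum((restored - reported).total_seconds()
                for reported, restored in tickets)
    return total / len(tickets) / 3600

tickets = [
    (datetime(2022, 9, 1, 8, 0), datetime(2022, 9, 1, 9, 30)),    # 1.5 h
    (datetime(2022, 9, 2, 10, 0), datetime(2022, 9, 2, 12, 30)),  # 2.5 h
]
mttr = mean_time_to_repair(tickets)
print(mttr, mttr <= 2.0)  # 2.0 hours -> meets the class 1 mobile target
```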
ANNEXURE A
SITE CLASSIFICATIONS
The following describes the population size for each reporting category: