Media Delivery Index

The Media Delivery Index (MDI) is a set of measures used both to monitor the quality of a delivered video stream and to show the system margin of IPTV systems, by providing an accurate measurement of jitter and delay at the network (Internet Protocol, IP) level, which are the main causes of quality loss. Identifying and quantifying such impairments in these networks is key to maintaining high-quality video delivery and to warning system operators early enough to allow corrective action.

The Media Delivery Index is typically displayed as two numbers separated by a colon: the Delay Factor (DF) and the Media Loss Rate (MLR).

Context
The Media Delivery Index (MDI) may be able to identify problems caused by:

Time distortion
If packets are delayed by the network, some arrive in bursts, with inter-packet gaps shorter than those at transmission, while others are delayed such that they arrive with gaps greater than those at transmission from the source (see figure below). The difference between a packet's actual arrival time and its expected arrival time is defined as packet jitter, or time distortion.
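The jitter definition above can be sketched as a small computation: given observed arrival timestamps and the nominal inter-packet spacing at the source, the per-packet jitter is the actual arrival time minus the expected one. This is an illustrative sketch only; the function and parameter names are assumptions, not from any standard API.

```python
def packet_jitter(arrival_times, nominal_interval):
    """Per-packet jitter: actual arrival time minus expected arrival time.

    arrival_times:    observed arrival timestamps in seconds
    nominal_interval: inter-packet spacing at the source in seconds

    Illustrative sketch; real probes derive the expected schedule from
    the stream's nominal media rate rather than a fixed interval.
    """
    t0 = arrival_times[0]  # take the first packet as the timing reference
    return [t - (t0 + i * nominal_interval)
            for i, t in enumerate(arrival_times)]
```

A positive value means the packet arrived late (larger gap than transmitted); a negative value means it arrived early, i.e. as part of a burst.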



A receiver displaying the video at its nominal rate must accommodate the varying arrival times of the input stream by buffering data that arrives early and ensuring that enough data is already stored to ride out delays in the received data (for this reason the buffer is filled before display begins).

Similarly, the network infrastructure (switches, routers,…) uses buffers at each node to avoid packet loss. These buffers must be sized appropriately to handle network congestion.

Packet delays can be caused by multiple factors, including the way traffic is routed through the infrastructure and differences between link speeds within it.

Moreover, some methods for delivering Quality of Service (QoS) use packet-metering algorithms that may intentionally hold back packets to meet the transmission's quality specifications.

The effects of all these factors on the number of packets received at a specific point in the network can be seen in the following graphs:

Packet loss
Packets may be lost due to buffer overflows or environmental electrical noise that creates corrupted packets. Even small packet loss rates result in a poor video display.

Description
Packet delay variation and packet loss have been shown to be the key characteristics in determining whether a network can transport good quality video. These features are represented as the Delay Factor (DF) and the Media Loss Rate (MLR), and they are combined to produce the Media Delivery Index (MDI), which is displayed as: $DF : MLR$

Components
The different components of the Media Delivery Index (MDI) are explained in this section.

Delay Factor (DF)
The Delay Factor is a time value, given in milliseconds, that indicates how long it would take to drain the virtual buffer at a given network node at a specific time. In other words, it indicates how many milliseconds' worth of data the buffers must be able to hold in order to eliminate time distortion (jitter).

It is computed as packets arrive at the node and is displayed/recorded at regular intervals (typically one second).

It is calculated as follows:

1. At every packet arrival, the difference between the bytes received and the bytes drained is calculated. This determines the MDI virtual buffer depth: $\Delta = \text{bytes received} - \text{bytes drained}$

2. Over a time interval, the difference between the maximum and minimum values of $\Delta$ is taken and then divided by the media rate: $DF = \dfrac{\max(\Delta) - \min(\Delta)}{\text{media rate}}$

Maximum acceptable DF: 9–50 ms
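The two steps above can be sketched as follows, assuming timestamps in seconds, packet sizes in bytes, and a nominal media rate in bytes per second. The function name and signature are hypothetical; a real probe would track the virtual buffer continuously rather than over a recorded batch of packets.

```python
def delay_factor(arrival_times, packet_sizes, media_rate):
    """Compute the MDI Delay Factor (DF) for one measurement interval.

    arrival_times: packet arrival timestamps in seconds
    packet_sizes:  packet payload sizes in bytes
    media_rate:    nominal stream rate in bytes per second
    Returns DF in milliseconds.
    """
    t0 = arrival_times[0]
    received = 0
    deltas = []
    for t, size in zip(arrival_times, packet_sizes):
        received += size
        drained = (t - t0) * media_rate      # bytes drained at the nominal rate
        deltas.append(received - drained)    # step 1: virtual buffer depth
    # step 2: spread of the buffer depth, expressed as milliseconds of media
    return (max(deltas) - min(deltas)) / media_rate * 1000.0
```

For a stream arriving exactly on schedule every $\Delta$ is identical and DF is 0 ms; delaying a single packet widens the spread and DF grows accordingly.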

Media Loss Rate (MLR)
The Media Loss Rate is the number of media packets lost over a certain time interval (typically one second).

It is computed by subtracting the number of media packets received during an interval from the number of media packets expected during that interval, and scaling the value to the chosen time period (typically one second).

Maximum acceptable channel-zapping MLR: 0

Maximum acceptable average MLR:


 * SDTV: 0.004
 * VOD: 0.004
 * HDTV: 0.0005

The maximum acceptable MLR depends on the application. During channel zapping a channel is generally viewed only briefly, so any packet loss would be noticeable. For this case the maximum acceptable MLR is 0, as stated above, since any greater value would mean the loss of one or more packets within a small viewing timeframe (after the zap time).
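The MLR calculation described above amounts to a simple rate computation. A minimal sketch, with hypothetical names, assuming packet counts over a known measurement interval:

```python
def media_loss_rate(expected, received, interval_s, period_s=1.0):
    """MDI Media Loss Rate: media packets lost, scaled to a time period.

    expected:   media packets expected during the measurement interval
    received:   media packets actually received in that interval
    interval_s: measurement interval length in seconds
    period_s:   period the result is scaled to (typically one second)
    """
    return (expected - received) * (period_s / interval_s)
```

For example, 2 packets lost over a 2-second interval gives an MLR of 1.0 packets per second; an HDTV stream would need to stay below 0.0005 by the thresholds listed above.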

Use
Generally, the Media Delivery Index (MDI) can be used to install, modify or evaluate a video network by following these steps:


 1. Identify, locate, and address any packet loss issues using the Media Loss Rate.
 2. Identify and measure jitter margins using the Delay Factor.
 3. Establish an infrastructure monitor for both MDI components to analyze any possible scenarios of interest.

Given these results, measures must be taken to address the problems found in the network, such as redefining system specifications or modifying network components to meet the expected quality requirements (or number of users).

Other parameters
Other parameters may also be desired in order to troubleshoot concerns identified with the MDI and to aid in system configuration and monitoring. Some of them are:


 * Network Utilization. Tracking the instantaneous, minimum, and maximum overall network utilization is needed to verify that sufficient raw bandwidth is available for a stream on a network. A high utilization level is also an indicator that localized congestion is likely, due to queue behavior in network components. The DF provides a measure of the results of congestion on a given stream.
 * Video stream statistics, such as:
   * Instantaneous Flow Rate (IFR) and Instantaneous Flow Rate Deviation (IFRD). The measured IFR and IFRD confirm a stream's nominal rate and, if not constant over time, give insight into how a stream is being corrupted.
   * Average Rate in Mbit/s. This measure indicates whether the rate of the stream being analyzed conforms to its specified rate over a measurement time. It is the longer-term counterpart of the IFR.
   * Stream Utilization, as a percentage of network bandwidth. This measure indicates how much of the available network bandwidth is consumed by the stream being analyzed.