
Compression Settings

Martin Pulec edited this page Sep 17, 2018 · 8 revisions


Video compression settings

Uncompressed video

When running UltraGrid with uncompressed video, it is desirable to use jumbo frames if possible:

uv -m 8500 <other_parameters> receiver

This tells UltraGrid to use jumbo frames (jumbo frames have to be enabled along the whole network path; see the Setup page for how to set them up on the endpoints).
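To get a feel for why jumbo frames matter here, the following back-of-the-envelope arithmetic sketches the bandwidth and packet rate of an uncompressed stream. The figures (1080p50 UYVY at 2 bytes per pixel, approximate UDP payload sizes) are illustrative assumptions, not values taken from UltraGrid itself:

```shell
# Uncompressed 1080p50 UYVY: 2 bytes per pixel (illustrative figures)
FRAME_BYTES=$((1920 * 1080 * 2))           # bytes per frame
RATE_BPS=$((FRAME_BYTES * 50 * 8))         # bits per second
echo "bitrate: $((RATE_BPS / 1000000)) Mbps"           # ~1658 Mbps

# Packets per second at a standard vs. jumbo MTU
# (payload sizes approximated after headers)
echo "pkts/s @ MTU 1500: $((FRAME_BYTES * 50 / 1460))" # ~142027
echo "pkts/s @ MTU 8500: $((FRAME_BYTES * 50 / 8460))" # ~24510
```

With jumbo frames the per-packet overhead drops roughly sixfold, which is why they are recommended for uncompressed transmission.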

DXT Compression

To compress the video, use the option -c with one of the following parameters:

  • RTDXT:DXT1 - GPU DXT1 compression - decreases bandwidth to 1/4 for YUV 4:2:2 or to 1/8 for RGBA
  • RTDXT:DXT5 - GPU DXT5 YCoCg compression - a bit more demanding variant of DXT with a lower compression ratio but higher quality
  • cuda_dxt - DXT1/DXT5 using CUDA; better performance than the GLSL-based RTDXT, but requires an NVIDIA GPU
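The compression ratios quoted above follow from the bits-per-pixel of each format. The following sketch assumes the standard figures (UYVY at 16 bpp, RGBA at 32 bpp, DXT1 at 4 bpp, DXT5 at 8 bpp); it is illustrative arithmetic, not part of UltraGrid:

```shell
# Bits per pixel of common input formats vs. DXT output
UYVY_BPP=16   # 4:2:2, 2 bytes per pixel
RGBA_BPP=32   # 4 bytes per pixel
DXT1_BPP=4    # 8 bytes per 4x4 block
DXT5_BPP=8    # 16 bytes per 4x4 block

echo "DXT1 vs UYVY: 1/$((UYVY_BPP / DXT1_BPP))"   # 1/4
echo "DXT1 vs RGBA: 1/$((RGBA_BPP / DXT1_BPP))"   # 1/8
echo "DXT5 vs UYVY: 1/$((UYVY_BPP / DXT5_BPP))"   # 1/2
```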
The resulting command can be:

uv -t decklink:mode=Hi50:codec=UYVY -c RTDXT:DXT1 192.0.43.10

  • sender - Centaurus, 8-bit, compressed via RTDXT using DXT5 YCoCg compression (Linux only)
uv -t dvs -c RTDXT:DXT5 192.0.43.20
  • sender - cuda_dxt with AV Foundation capture (macOS)
uv -t avfoundation -c cuda_dxt 192.0.43.10
  • receiver (any platform/GPU)
uv -d gl

Note: for the prerequisites of RTDXT, please see the preliminaries page.

GPUJPEG compression

Sending

If UltraGrid was compiled with JPEG support, you can use JPEG compression, e.g.:

uv -t decklink -c JPEG <receiver> # if the DeckLink supports format autodetection, the mode specification can be omitted

You can also set the quality and restart interval of the compression:

uv -t decklink -c JPEG:80 # <quality>, 80 is the default

Lastly, you can also set the CUDA device index, e.g.:

uv -t decklink -c JPEG:80 --cuda-device <device>

where the available devices can be listed with --cuda-device help

Receiving

You do not need any special settings on the receiver. Of course, the receiver also needs to be CUDA-capable and has to be compiled with JPEG support.

Note on transmit errors

Please note that JPEG compression is quite sensitive to transmission errors (e.g., lost packets), so if you use a lossy network, you may want to use one of the FEC schemes.

H.264/HEVC and other libavcodec compression

H.264/HEVC compression is provided via libavcodec (both Libav and FFmpeg are supported). Some other lavc codecs are supported as well.

H.264

Usage:

uv -t deltacast -c libavcodec:codec=H.264 <address> # use H.264

uv -t deltacast -c libavcodec:codec=H.264:bitrate=20M <address> # specifies requested bitrate

uv -t deltacast -c libavcodec:codec=H.264:subsampling=420 <address> # use subsampling 420 (default: 422 for interlaced, 420 for progressive)

Using NVENC compression (NVIDIA only):

uv -t deltacast -c libavcodec:encoder=h264_nvenc <address>

Use the CUVID (HW-accelerated) decoder:

uv -d gl --param force-lavd-decoder=h264_cuvid

10-bit H.264 compression (experimental)

In the following setup, the sender must capture the v210 codec and the display must be able to display v210 directly (most SDI cards can).

  • sender (Ubuntu/Debian only; for other Linux distributions use the actual path to the 10-bit x264)

LD_PRELOAD=/usr/lib/x86_64-linux-gnu/x264-10bit/libx264.so.148 uv -t testcard:1920:1080:25:v210 -c libavcodec:encoder=libx264 --param lavc-use-codec=yuv422p10le <receiver_addr>

  • receiver
uv -d decklink --param lavd-use-10bit

HEVC

HEVC is relatively new and still a somewhat demanding compression that provides a great compression ratio, so you may need to tweak things a bit to achieve optimal performance.

There are multiple encoders supporting HEVC encoding, namely libx265 and hevc_nvenc (and also hevc_qsv if available). Since encoding HEVC is still quite demanding, it is advisable to use the NVENC encoder (or QSV) to encode the stream:

uv -t deltacast -c libavcodec:encoder=hevc_nvenc <address>

Currently, the stream encoded by the NVENC encoder cannot be decoded with much parallelism, so you may want to force a hardware decoder (please note that this decoder currently adds some 4 frames of latency!):

uv -d gl --param force-lavd-decoder=hevc_cuvid

Alternatively, you may reduce the bit rate; a 10 or 15 Mbps stream is much easier to decode:

uv -t deltacast -c libavcodec:encoder=hevc_nvenc:bitrate=15M <address>

You can also use the software encoder, which can be a bit slow; its output, however, parallelizes easily on the decoder side:

uv -t deltacast -c libavcodec:encoder=libx265 <address>

Once you have both a hardware encoder and decoder, you can turn on spatial AQ to improve the image quality (it can of course also be used with a SW decoder, but decoding is then a bit more computationally demanding):

uv -t deltacast -c libavcodec:encoder=hevc_nvenc:spatial_aq=1 <address>

10-bit HEVC compression (experimental)

In the following setup, the sender must capture the v210 codec and the display must be able to display v210 directly (most SDI cards can).

  • sender
uv -t testcard:1920:1080:25:v210 -c libavcodec:encoder=libx265 --param lavc-use-codec=yuv422p10le <receiver_addr>
  • receiver
uv -d decklink --param lavd-use-10bit

Other HW acceleration

VAAPI/VDPAU-accelerated decoding (if supported) can be toggled with the following command:

uv -d decklink --param use-hw-accel

HW-accelerated encoding is toggled by selecting an appropriate encoder, e.g., hevc_vaapi or hevc_nvenc.

Other compressions

uv -t deltacast -c libavcodec <address> # use default libavcodec codec (currently MJPEG)

uv -t deltacast -c libavcodec:codec=MJPEG <address> # use MJPEG codec instead

Audio compression settings

The audio compression scheme is selected by the --audio-codec option. Currently, libavcodec-supplied compressions are supported. Depending on your libavcodec build, you can use the following codecs:

  • AAC
  • A-law
  • FLAC
  • G.722
  • MP3
  • OPUS
  • PCM
  • speex
  • u-law
The preferred codec is OPUS, which provides decent quality at low bitrates. You can use speex as well.
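To see why a low-bitrate codec such as OPUS pays off, compare it against uncompressed PCM with some quick arithmetic. The sketch below assumes a 48 kHz stereo 16-bit source and a 64 kbps OPUS target; these are illustrative figures, not UltraGrid defaults:

```shell
# Uncompressed PCM bitrate: 48 kHz, 2 channels, 16-bit samples
PCM_BPS=$((48000 * 2 * 16))
echo "PCM: $((PCM_BPS / 1000)) kbps"        # 1536 kbps

# A typical OPUS target bitrate for comparison
OPUS_BPS=64000
echo "reduction: $((PCM_BPS / OPUS_BPS))x"  # 24x
```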

There are two options that can be passed with the codec:

  • sample_rate - specifies the sampling rate to which the audio is resampled before being passed to the audio codec. This has two effects: first, a lower sampling rate results in a lower bitrate; secondly, some codecs (e.g., speex) support only a limited set of sampling rates (here 32000/16000/8000 Hz), so you need to resample to such a rate if you want to use that codec. The source sampling rate is usually 48 kHz.
  • bitrate - for codecs that support this setting (currently only OPUS), you may set the target bitrate
Some examples:

uv --audio-codec A-law:sample_rate=8000 # standard 8 kHz A-law compression used in European digital telephony (similarly, u-law in North America and Japan); results in a 64 kbps stream suitable for transmitting speech

uv --audio-codec OPUS # high-quality audio codec

uv --audio-codec OPUS:bitrate=64000 # similar to the previous, using 64 kbps

uv --audio-codec speex:sample_rate=32000
