Compression Settings
When running UltraGrid with uncompressed video, it is desirable to use jumbo frames if possible:
uv -m 8500 <other_parameters> receiver
This tells UltraGrid to use jumbo frames (they must be enabled along the whole network path; see the Setup page for how to enable them on the endpoints).
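On Linux endpoints, jumbo frames can typically be enabled by raising the interface MTU. A minimal sketch; the interface name eth0 and the MTU value 9000 are placeholders, and your platform's procedure may differ (see the Setup page):

```shell
# Raise the MTU on the capture/receive interface (requires root);
# eth0 and 9000 are placeholders for your interface and desired MTU.
sudo ip link set dev eth0 mtu 9000

# Verify the new MTU took effect
ip link show dev eth0
```

Remember that every switch and router along the path must also be configured for the larger MTU, otherwise packets will be dropped or fragmented.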
To compress the video, use the -c option with one of the following parameters:
- FastDXT - DXT1 compression on the CPU
- RTDXT:DXT1 - GPU DXT1 compression; decreases bandwidth to 1/4 for YUV 4:2:2 or to 1/8 for RGBA
- RTDXT:DXT5 - GPU DXT5 YCoCg, a slightly more demanding variant of DXT with a lower compression ratio but higher quality
Example commands:
- GPU DXT1 compression, DeckLink capture:
./uv -t decklink:0:6:2vuy -c RTDXT:DXT1 192.0.43.10
- CPU DXT1 compression, QuickTime capture:
./uv -t quicktime:1:12:13 -c FastDXT 192.0.43.10
- sender: Centaurus, 8-bit, compressed via RTDXT using DXT5 YCoCg compression (Linux only):
./uv -t dvs:37:UYVY -c RTDXT:DXT5 192.0.43.20
- receiver (any platform/GPU):
./uv -d gl
Decompression is done automatically, but the receiver must use the OpenGL display.
Note: For RTDXT prerequisites, please see the preliminaries page.
Note 2: With an Intel graphics card on Linux, you need to explicitly enable DXT support, which is disabled in the open-source driver for patent reasons. This can be done, e.g., by installing the libtxc-dxtn-s2tc0 package on Ubuntu.
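On Ubuntu, installing the package mentioned above could look like this (a sketch; the package name is taken from the note above, and its availability may differ between releases):

```shell
# Install the S2TC library providing DXT support for the open-source driver
sudo apt-get install libtxc-dxtn-s2tc0
```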
If you have compiled in JPEG support, you can use JPEG compression, e.g.:
uv -t decklink -c JPEG <receiverIP> # if DeckLink supports format detection, it can be omitted
You can also set quality and reset-interval of compression:
uv -t decklink -c JPEG:80 # <quality>, 80 is default
Lastly, you can also set CUDA device, eg:
uv -t decklink -c JPEG:80 --cuda-device <device>
where available devices can be listed with --cuda-device help
You do not need any special settings on the receiver; of course, the receiver also needs to be CUDA-capable and built with JPEG support.
Please note that JPEG compression is quite sensitive to transmission errors (e.g. lost packets), so if you use a lossy network, you may want to use one of the FEC schemes.
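For example, forward error correction can be requested on the sender with the -f option. A sketch; the mult scheme and its duplication factor are assumptions about your build, so check `uv -f help` for the schemes actually available:

```shell
# Send each video packet twice to mitigate loss on a lossy network
# (the 'mult' scheme and the ':2' factor are illustrative; see 'uv -f help')
uv -t decklink -c JPEG -f mult:2 <receiverIP>
```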
H.264/HEVC compression is provided via libavcodec (both Libav and FFmpeg are supported). Some other lavc codecs are supported as well.
Usage:
uv -t deltacast -c libavcodec:codec=H.264 <address> # use H.264
uv -t deltacast -c libavcodec:codec=H.264:bitrate=20M <address> # specifies requested bitrate
uv -t deltacast -c libavcodec:codec=H.264:subsampling=420 <address> # use subsampling 420 (default: 422 for interlaced, 420 for progressive)
Using NVENC compression (NVIDIA only):
uv -t deltacast -c libavcodec:encoder=h264_nvenc <address>
Use CUVID (HW accelerated) decoder:
uv -d gl --param force-lavd-decoder=h264_cuvid
In the following setup, the sender must capture the v210 codec and the display must be able to display v210 directly (most SDI cards can).
- sender (Ubuntu/Debian only; for other Linux distributions, use the actual path to the 10-bit x264 library)
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/x264-10bit/libx264.so.148 uv -t testcard:1920:1080:25:v210 -c libavcodec:encoder=libx264 --param lavc-use-codec=yuv422p10le <receiver_addr>
- receiver
uv -d decklink --param lavd-use-10bit-h264
HEVC is a relatively new and still somewhat demanding compression that provides a great compression ratio, so you may need to tweak things a bit to achieve optimal performance.
There are multiple encoders supporting HEVC, namely libx265 and hevc_nvenc (and also hevc_qsv if available). Since encoding HEVC in software is demanding, it is advisable to use the NVENC (or QSV) encoder to encode the stream:
uv -t deltacast -c libavcodec:encoder=hevc_nvenc <address>
Currently, a stream encoded by the NVENC encoder isn't easily parallelizable by the decoder, so you may want to force the hardware decoder (please note that the decoder currently adds about 4 frames of latency!):
uv -d gl --param force-lavd-decoder=hevc_cuvid
Alternatively, you may reduce the bitrate; a stream at 10 or 15 Mbps is far easier to decode:
uv -t deltacast -c libavcodec:encoder=hevc_nvenc:bitrate=15M <address>
You can also use the software encoder, which can be a bit slow; however, its output parallelizes easily on the decoder side:
uv -t deltacast -c libavcodec:encoder=libx265 <address>
Once you have both a hardware encoder and decoder, you can turn on spatial AQ to improve image quality (it can also be used with a software decoder, but decoding is then somewhat more computationally demanding):
uv -t deltacast -c libavcodec:encoder=hevc_nvenc:spatial_aq=1 <address>
Other libavcodec examples:
uv -t deltacast -c libavcodec <address> # use the default libavcodec codec (currently MJPEG)
uv -t deltacast -c libavcodec:codec=MJPEG <address> # use MJPEG codec instead
Audio compression is selected with the --audio-codec option. Currently, libavcodec-supplied compressions are supported. Depending on your libavcodec build, you can use the following codecs:
- AAC
- A-law
- FLAC
- G.722
- MP3
- OPUS
- PCM
- speex
- u-law
The preferred codec is OPUS, which provides decent quality at low bitrates. You can use Speex as well.
There are two options that can be passed with the codec:
- sample_rate - specifies the sampling rate the audio is resampled to before being passed to the audio codec. This has two effects: a lower sampling rate results in a lower bitrate, and some codecs (e.g. Speex) support only a limited set of sampling rates (here 32000/16000/8000 Hz), so you need to resample to one of these rates to use such a codec. The source sampling rate is usually 48 kHz.
- bitrate - for codecs that support this setting (currently only OPUS), you may set the requested bitrate.
Some examples:
uv --audio-codec A-law:sample_rate=8000 # standard 8 kHz A-law compression used in European digital communication (similarly u-law in North America and Japan); results in a 64 kbps stream suitable for transmitting speech
uv --audio-codec OPUS # high-quality audio codec
uv --audio-codec OPUS:bitrate=64000 # same as the previous, but at 64 kbps
uv --audio-codec speex:sample_rate=32000
If you have any technical or non-technical questions or suggestions, please feel free to contact us at