
Device Settings


This page contains an incomplete list of example device usage.


Audio

Portaudio/ASIO

Using Portaudio with the ASIO backend (Windows) works, but it requires audio playback to run in a different process than audio capture:

uv -s portaudio --audio-codec OPUS 192.0.2.2
uv -r portaudio 192.0.2.2

works. However, the following:

uv -s portaudio --audio-codec OPUS -r portaudio 192.0.2.2 # playback initialization will fail

doesn't work.

Video

AJA

HDMI (Io 4K Plus)

Using AJA is not problematic, but one thing that needs to be taken into account is HDMI, because HDMI connectors do not map straightforwardly to device channels. In contrast, SDI 1-n map naturally to channels 1-n. For HDMI, the user has to specify the channel manually; the following works for an Io 4K Plus:

uv -d aja:connection=HDMI:channel=1 -t aja:connection=HDMI1:channel=2 -c <compression> <receiver>

Also, if in the above example you change the input cable from HDMI to SDI1, you need to keep the channel specified explicitly so that capture and playback do not contend for the same frame store buffer:

uv -d aja:connection=HDMI:channel=1 -t aja:connection=SDI1:channel=2 -c <compression> <receiver>

Notes

  • audio output over HDMI doesn't work
  • video output for HDMI works only on channel 1
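For example, a display-only HDMI output therefore has to use channel 1 (a sketch based on the Io 4K Plus example above; <sender> is a placeholder for the sending host):

uv -d aja:connection=HDMI:channel=1 <sender>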

RGB over SDI

Capture

When the SDI signal is RGB, you need to pass the RGB parameter to indicate that explicitly:

uv -t aja:RGB

This sets RGBA as the captured codec, which is equivalent to the following (the input signal is then expected to be RGBA):

uv -t aja:codec=RGBA

If you want to capture 10-bit RGB, set the codec to R10k:

uv -t aja:codec=R10k

You can also let the card convert the input RGB signal to YCbCr if you find that reasonable:

uv -t aja:RGB:codec=UYVY # (or v210)

or vice versa:

uv -t aja:YUV:codec=RGBA

In contrast, the following won't work if the input signal is YCbCr (you must set the input color space explicitly, see above):

uv -t aja:codec=RGBA

To sum up: when nothing is specified, the codec is expected to be UYVY and the input therefore YCbCr. To capture RGB, you need to set either the codec option or RGB. You can also combine the input color-space and codec options to convert the color space, provided that the color space on the wire is set correctly.

Display

For SDI output, the card by default respects the incoming signal. Similarly to AJA capture, you can override the default by adding the RGB or YUV option:

uv -d aja:RGB # force RGB SDI output
uv -d aja:YUV # force SDI output in YCbCr

DeckLink

The simplest use is:

uv -t decklink -c <compression> <receiver>

and

uv -d decklink <sender> # display

The capture command uses the first card and expects it to support input format autodetection. If the card doesn't support mode detection, you will either need to pass the mode as an argument:

uv -t decklink:mode=Hi50 -c <compression> <receiver> # for 1920x1080@50i

or you may let UltraGrid detect the correct format:

uv -t decklink:detect-format

There are plenty of other options that may be used, see:

uv -t decklink:help
uv -d decklink:help

DeckLink IP

For SMPTE 2110 DeckLink devices, you can specify Ethernet as the connection:

uv -t decklink:conn=Ethernet

Additionally, you may need to set configuration values such as the IP address. These can be changed either with Blackmagic's BlackmagicDesktopVideoSetup tool or directly with UltraGrid:

uv -d "decklink:nsip=10.0.0.3:nssm=255.255.255.0:nsgw=10.0.0.1:\
noaa=239.255.194.26\:16384:noav=239.255.194.26\:163888"

As with other options set by UltraGrid, the values are set at run time only; the double quotes are necessary here to keep the escaped colon (\:) intact.

You can list some options by:

uv -d decklink:help=FourCC

You may also need an NMOS controller (see the relevant references).

Synchronous A/V output

If strictly synchronous playback is required, there is a synchronized mode that can be enabled for the DeckLink display. The sender needs to be either DeckLink, file, or testcard - other devices don't timestamp the input properly, so it is not guaranteed to work with them. Audio filters are also not recommended because they remove the timestamps (write us if you need any!).

Usage:

sender$ uv -t decklink -s embedded <receiver>
receiver$ uv -d decklink:sync -r embedded

For option documentation you can consult -d decklink:fullhelp.
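That is:

uv -d decklink:fullhelp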

It is presumably incompatible with the :drift_fix option at this time. It also may not perform well if the clock drifts significantly between the sender and receiver DeckLinks (although this may be at least partially compensated for with the provided options - increasing p can help if the sender clock is faster, b in the opposite case).

Note: The synchronized mode adds latency - roughly an additional 100 ms can be expected with default settings.

File

The file capture allows an arbitrary A/V file supported by libavcodec to be played. Usage:

uv -t file:my_movie.mp4 -c libavcodec <receiver>
uv -t file:my_movie.mp4 -c libavcodec -s embedded --audio-codec OPUS <receiver>

For more options, issue the help as usual:

uv -t file:help

See also: Recording and Playback

Screen Capture

see Screen Capture

Webcam

For a generic webcam, use the capture API native to your platform.

Linux

On Linux, use the V4L2 API, e.g.:

uv -t v4l2[:help]

You can specify a device:

uv -t v4l2:device=/dev/video1

If not specified, V4L2 uses the default (usually the last used) settings. Those are the ones marked with * (a star) in -t v4l2:help.

Other options worth mentioning are:

  • codec - the codec to use. May be a plain pixel format or a compressed one. Note that the compressed ones cannot be further processed or recompressed; if that is needed, see the RGB option below.
  • size - the resolution. Can be either one from the list or, for some cameras, an arbitrary one (those cameras have a continuous space from which the resolution can be chosen).
  • fps - frames per second to be used
  • RGB - forces input conversion to RGB. This may be useful when you have a USB 2.0 camera that supports Full HD. The link speed of USB 2.0 is, however, lower than required for an uncompressed 1920x1080@30 stream, so you may want to use MJPG for transport and force decoding to RGB to allow further processing:

uv -t v4l2:codec=MJPG:size=1920x1080:fps=30:RGB -c libavcodec:codec=HEVC <receiver>

macOS

On macOS, use AVFoundation to capture from webcams and other supported devices.

Basic usage:

uv -t avfoundation[:device=<idx>]
uv -t avfoundation:help

Mode specification can be done in two ways (both are sketched below):

  • using a preset - use one of "low", "medium", "high", "VGA" or "HD"
  • setting the mode explicitly - use one of the modes (and fps) from the list shown by -t avfoundation:help
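A possible invocation might look like this (a sketch only - the device index and the preset/mode/fps option names are assumptions; check -t avfoundation:help for the exact syntax):

uv -t avfoundation:device=0:preset=HD
uv -t avfoundation:device=0:mode=5:fps=30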

Windows

On Windows, use the dshow device:

uv -t dshow[:help]

You can specify a device index (obtained with dshow:help) and a mode, e.g.:

uv -t dshow:device=1:mode=5

You can also specify that you want to capture RGB (and have the input converted to it if needed). Otherwise, the codec native to the selected mode is used.
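For instance (a sketch only - the exact spelling of the RGB option is an assumption; consult -t dshow:help):

uv -t dshow:device=1:mode=5:RGB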

Other

Raspberry Pi

Raspberry Pi 4 video decoding

Old /dev/rpivid-hevcmem based solution

This section is about the old, deprecated acceleration. To use it you need a Raspberry Pi kernel old enough to still expose the /dev/rpivid-hevcmem device (version 5.10 probably works) and an UltraGrid build old enough to still implement it (1.9 is the last stable branch containing it).

Requirements

To build and run UltraGrid with rpi4 HW decoding support, a patched ffmpeg, libmmal and libbcm_host are required. On Raspbian the ffmpeg package already contains the required patches. Libmmal and libbcm_host are provided by the libraspberrypi-dev package.
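On Raspbian, installing the dependencies mentioned above might look roughly like this (a sketch; only libraspberrypi-dev is named by this page, the FFmpeg development package name is an assumption):

sudo apt-get install libraspberrypi-dev libavcodec-dev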

Rpi4 display

The hardware video decoder decodes frames into a custom pixel format (SAND). To efficiently display frames in this format, they need to be passed to the GPU using the MMAL API. A special rpi4 UltraGrid video display was implemented for this purpose.

Please note that video displayed in this manner completely side-steps the standard graphical environment, which means that it will not be visible in screenshots, is drawn on top of everything else (even the mouse cursor), and cannot be minimized or moved. The video dimensions and position are, however, controllable using the force-size and position arguments of the rpi4 display.

To use hw. decoding:

uv -d rpi4 --param use-hw-accel

To build and run UltraGrid with rpi4 HW decoding support, a patched ffmpeg is required. On Raspbian the ffmpeg package already contains the required patches.

For best performance it is recommended to display the decoded video using the drm video display. When using this display, no graphical environment (X11, Wayland, etc.) can be running. The drm display also doesn't support any scaling at the moment, so the display device connected to the Raspberry Pi needs to support a video mode appropriate for the received video.
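For example, a receiver using the drm display might be started like this (a sketch only; additional parameters for enabling the HW decoder may be needed, see above):

uv -d drm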

Note: You may need to increase GPU memory to 128 or 256 MB to get rid of possible MMAL errors.

Note: Only video with a subsampling of 4:2:0 is supported.

Platform

The following instructions work with a Raspberry Pi 3. There may be differences with other Raspberry Pi devices.

Capture and encode

You may want to check with vcgencmd which HW codecs you have enabled:

for codec in H263 H264 MPG2 WVC1 MPG4 AGIF MJPA MJPB MJPG WMV9 MVC0; do
    echo -e "$codec:\t$(vcgencmd codec_enabled $codec)"
done

For capture, the module needs to be loaded:

sudo modprobe bcm2835-v4l2

(alternatively you may add bcm2835-v4l2 to /etc/modules).

Display configuration

For display, an accelerated GL driver and a sufficient amount of video memory are needed; add the following to /boot/config.txt (or use "raspi-config -> 7 -> A3,A8" to configure):

[all]
# 64/128 MB may also work
gpu_mem=256
# G3 GL (Full KMS)
dtoverlay=vc4-kms-v3d

Video compression

If not configured by default (not needed for e.g. Raspbian), you need to configure FFmpeg with the following flags to support HW-accelerated encoding and decoding:

./configure --enable-mmal --enable-omx-rpi --enable-omx

Encoding

To use HW acceleration, use the h264_omx encoder:

uv -t testcard:1280:720:30:UYVY -c libavcodec:encoder=h264_omx

Note: There was an issue that the compressed stream contained

Decoding

You need to explicitly specify the hardware-accelerated decoder:

uv --param force-lavd-decoder=h264_mmal -d gl

Note: You may need to increase GPU memory to 128 or 256 MB to get rid of possible MMAL errors.

Capture

Capture can be done with the V4L2 module, either in H.264 or MJPEG.
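For example, capturing MJPEG through V4L2 might look like this (a sketch; the device path is an assumption - see -t v4l2:help):

uv -t v4l2:device=/dev/video0:codec=MJPG <receiver>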

Display

Currently you need to use OpenGL to achieve acceptable performance; SDL is slow.
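For instance, a plain receiver using the OpenGL display (as also used in the decoding example above):

uv -d gl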

Builds

It is recommended to use the 32-bit AppImage for Raspberry Pi even if the platform is 64-bit capable, because 64-bit AppImages are built without Raspberry-specific extensions.

Running AppImage

You need to have FUSE installed for the AppImage to run:

$ sudo apt-get -y install fuse
$ sudo modprobe fuse # may not be needed

Then, make the downloaded AppImage executable:

$ sudo chmod u+x UltraGrid*AppImage

After that, you may normally run that AppImage:

$ ./UltraGrid*AppImage
