Panoramatic 360 degree video

Martin Piatka edited this page Aug 5, 2024 · 1 revision

UltraGrid supports capture and stitching of 360 degree video in the equirectangular projection, as well as its playback through a VR headset or on a classic display.

Stitching

Stitching of the 360 panorama is done by the gpustitch capture module, which uses the gpustitch library and can stitch four 4K camera inputs into an 8K output in real time on consumer NVIDIA GPUs.

The capture rig needs to be calibrated ahead of time to obtain the precise relative orientation of the cameras and the distortion coefficients of the optics. The UltraGrid gpustitch module then uses these to project all cameras into one 360 degree panorama and applies multi-band blending to make the seams between cameras smoother and less noticeable. Note that this approach works best when no objects move close to the camera rig, as parallax will introduce artifacts near the seams.
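The core projection step can be illustrated in a few lines. The following is a minimal sketch, not the gpustitch implementation: it assumes an ideal equidistant fisheye (r = f·θ), no lens distortion, and a camera rotated only in yaw, and maps an output equirectangular pixel back to a coordinate in one camera's fisheye image.

```python
import math

def equirect_to_fisheye(px, py, pano_w, pano_h, cam_yaw_deg,
                        focal_px, img_w, img_h):
    """Map an equirectangular output pixel to fisheye image coordinates
    for one camera (idealized model; yaw-only rotation, no distortion)."""
    # panorama pixel -> longitude/latitude
    lon = (px / pano_w) * 2 * math.pi - math.pi
    lat = math.pi / 2 - (py / pano_h) * math.pi
    # unit ray in world space (x right, y up, z forward)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    # rotate the ray into the camera frame (camera looks along +z)
    yaw = math.radians(cam_yaw_deg)
    xc = x * math.cos(yaw) - z * math.sin(yaw)
    zc = x * math.sin(yaw) + z * math.cos(yaw)
    yc = y
    # equidistant fisheye projection: radius = focal length * angle from axis
    theta = math.acos(max(-1.0, min(1.0, zc)))
    r = focal_px * theta
    phi = math.atan2(yc, xc)
    u = img_w / 2 + r * math.cos(phi)
    v = img_h / 2 - r * math.sin(phi)
    return u, v
```

A panorama pixel that lies exactly on a camera's optical axis maps to the center of that camera's image; the real stitcher additionally applies the per-camera distortion polynomial and blends overlapping cameras.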

Camera rig requirements

  • Fisheye type lenses
  • Cameras arranged in a horizontal ring
  • Sufficient overlap between images from neighboring cameras
  • (Optional) a way to ensure that cameras capture in sync

An example rig which satisfies these requirements:

  • 4x Blackmagic Micro Studio Camera 4K
  • Laowa 4mm f/2.8 Fisheye lenses
  • 360RIZE 360Helios 4

Rig calibration

The gpustitch module requires a rig specification file which contains information about the camera rig. The spec file uses TOML syntax and contains an array of tables called cameras, elements of which contain the following key/value pairs:

  • width - width of the input image in px
  • height - height of the input image in px
  • focal_len - focal length of the camera in pixels (see below)
  • yaw, pitch, roll - the rotation angles of the camera in degrees
  • distortion - an array of distortion coefficients (see below)
  • x_offset and y_offset - offset of the fisheye center from the image center
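A minimal spec for a four-camera ring might look like the following. The keys are the ones listed above; all numeric values are illustrative placeholders, not a real calibration:

```toml
[[cameras]]
width = 3840
height = 2160
focal_len = 1180.0                      # in pixels (see the focal length formula below)
yaw = 0.0
pitch = 0.0
roll = 0.0
distortion = [0.01, -0.02, 0.03, 0.98]  # a, b, c, and 1 - (a + b + c)
x_offset = 0.0
y_offset = 0.0

[[cameras]]
# ... three more entries, typically with yaw = 90, 180, 270 for a horizontal ring
```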

Focal length

The focal length of the camera in pixels can be calculated as: focal length in millimeters * (picture width in px / sensor width in millimeters)
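As a quick sanity check, the formula can be evaluated directly. The 4 mm lens matches the example rig above; the 13 mm sensor width here is an assumed illustrative value, not a manufacturer specification:

```python
def focal_len_px(focal_len_mm, image_width_px, sensor_width_mm):
    """Convert a lens focal length in millimeters to the pixel value
    expected by the focal_len key in the rig spec."""
    return focal_len_mm * (image_width_px / sensor_width_mm)

# e.g. a 4 mm fisheye, 3840 px wide image, (assumed) 13 mm wide sensor
print(focal_len_px(4.0, 3840, 13.0))  # ~1181.5
```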

Distortion

The distortion coefficients can be obtained by using Hugin with photos from each camera. The values of interest are the a, b, c lens parameters. The fourth coefficient is computed as 1 - (a + b + c).
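A small helper makes the relationship concrete. Completing the coefficients this way follows the Panotools/Hugin convention, where the radial polynomial r' = (a·r³ + b·r² + c·r + d)·r leaves the radius r = 1 unchanged:

```python
def distortion_coeffs(a, b, c):
    """Complete Hugin's a, b, c with the fourth coefficient d = 1 - (a + b + c),
    giving the four-element distortion array for the rig spec."""
    return [a, b, c, 1.0 - (a + b + c)]

def apply_distortion(r, coeffs):
    # Panotools-style radial polynomial: r' = (a*r^3 + b*r^2 + c*r + d) * r
    a, b, c, d = coeffs
    return (a * r**3 + b * r**2 + c * r + d) * r
```

With this normalization the polynomial maps r = 1 to exactly 1, so the correction vanishes at the reference radius.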

Displaying

The stitched equirectangular panoramas can then be displayed in a VR headset using the openxr_gl UltraGrid display module, or in a regular window using the pano_gl module.
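As a rough sketch, following UltraGrid's usual -t (capture) / -d (display) command-line conventions, the sender and receiver invocations might look like this; the exact gpustitch options depend on your build, so check the module's help output rather than taking these literally:

```shell
# Sender: list what the gpustitch capture module accepts on this build,
# then capture/stitch and send (module options omitted here)
uv -t gpustitch:help
uv -t gpustitch receiver.example.org

# Receiver: view in a VR headset ...
uv -d openxr_gl

# ... or in a regular window on a classic display
uv -d pano_gl
```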