
Model intra-frame events as deltas from frame start time #36

Open
repentsinner opened this issue Sep 1, 2022 · 8 comments

Comments

@repentsinner

repentsinner commented Sep 1, 2022

For VFX and OSVP realtime use, we would like to know intra-frame timing of the sensor exposure.

In VFX I believe this is helpful to characterize horizontal motion skew caused by rolling shutter delays.

In OSVP this helps to ensure that the (LED) display system is active while the shutter is open/active to reduce banding artifacts.

Per Tucker's comments, it does seem helpful to use the 'metadata' terminology for static data, and 'data' for dynamic fields that are expected to change meaningfully over time.

Dynamic data

  • Frame start time (PTP timestamp or more likely genlock sync, dynamic)

Static metadata

  • Offset to first line exposure start (static, float, μs)
  • Line exposure duration [exposure period] (static, float, μs)
  • Shutter skew [offset from start of first line exposure to start of last line exposure] (static, float, μs)
  • Number of lines (static, integer, count)
  • Line exposure direction (static, enum: 'bottom-to-top', 'top-to-bottom', 'left-to-right', 'right-to-left'). (Note: we should confirm whether this is relative to the sensor, relative to the world, or something else. We also need to confirm with the sensor vendors whether this list is complete, or whether some other rolling shutter exposure scheme exists.)

Note that this does not model multi-exposure "HDR" modes.
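As a sketch of how the proposed fields might be grouped in the toolkit, here is one possible shape (all names and types are my own invention, not from any spec):

```python
from dataclasses import dataclass
from enum import Enum

class LineDirection(Enum):
    BOTTOM_TO_TOP = "bottom-to-top"
    TOP_TO_BOTTOM = "top-to-bottom"
    LEFT_TO_RIGHT = "left-to-right"
    RIGHT_TO_LEFT = "right-to-left"

@dataclass(frozen=True)
class RollingShutterMetadata:
    """Static rolling-shutter metadata; all durations in microseconds."""
    exposure_offset_us: float  # offset from frame start to first line exposure start
    line_exposure_us: float    # exposure period of each line
    shutter_skew_us: float     # first-line to last-line exposure-start offset
    line_count: int            # number of lines
    direction: LineDirection   # line exposure direction
```

The dynamic frame start time (PTP timestamp or genlock-relative) would travel separately from this static block.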

Example

2160p23.98 180° rolling shutter example (made up numbers):

41700μs total frame period =

  • 0μs offset to exposure start
  • 20850μs line exposure duration
  • 4320μs shutter skew
  • 2160 lines
  • top-to-bottom

=> total integration time of 20850μs + 4320μs => 25170μs
=> start delay between lines 4320μs / 2160 lines => 2μs
=> shutter closed time 41700μs - 25170μs => 16530μs
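The arithmetic above can be captured in a small helper (a sketch only; the function and parameter names are made up):

```python
def derived_timing(frame_period_us, line_exposure_us, shutter_skew_us, line_count):
    """Derive the example's computed quantities from the static fields."""
    total_integration_us = line_exposure_us + shutter_skew_us
    per_line_delay_us = shutter_skew_us / line_count  # start delay between lines
    shutter_closed_us = frame_period_us - total_integration_us
    return total_integration_us, per_line_delay_us, shutter_closed_us

# Example values from above:
# derived_timing(41700.0, 20850.0, 4320.0, 2160) → (25170.0, 2.0, 16530.0)
```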

@repentsinner
Author

Note that I totally made up the example numbers and sketched the general concept of what's going on here from an LED perspective. If you have sensor expertise, please recommend corrections.

@umathurred

We should agree on what "frame start time" means, or decide that each manufacturer can define their own as long as it's used consistently.
If we want to be uniform, I suggest the start of frame readout, rather than the start of exposure, as the frame start time.

@repentsinner
Author

I am trying to capture that, from an IC-VFX perspective, we want sensor exposures and LED display 'lit' time to line up relative to system/house sync.

You're right we likely need more definition there: what does the camera and display do relative to an incoming sync signal vs. what data/metadata is being generated in response.

@palemieux
Member

To confirm, the only important information is the offset from frame start, right?

If so, why is absolute time important?

@repentsinner
Author

Just trying to incorporate today's discussion re: PTP time into this data. From my perspective (ICVFX), we should likely only need to know offsets relative to a sync pulse.

@umathurred

Is the absolute time important so that PTP can replace timecode, or are there other techniques for that?

@MadlyFX

MadlyFX commented Sep 2, 2022

There are definitely several approaches to this, I think. If you know all the sensor parameters, and you know the exposure offset from a sync pulse (both as Ritchie said), then you can model the current status of the scan locally, even in an online context. The toolkit might have a static function that takes a struct of these values and returns the current line being scanned and how long it's been scanning for.

Alternatively, the camera could perhaps send a 1D array of 0.0–1.0 float values, sized to the vertical resolution, representing the exposure time of each line. This isn't far off from how some camera APIs send histograms in realtime.
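A minimal sketch of that static-function idea, using default values from the worked example above (all names hypothetical; note that with a rolling shutter many lines expose concurrently, so this returns the most recently started line):

```python
def scan_state(t_us, exposure_offset_us=0.0, line_exposure_us=20850.0,
               shutter_skew_us=4320.0, line_count=2160):
    """Given time t_us since the sync pulse, return (line, elapsed_us) for the
    most recently started line, or None if no new line exposure is in progress."""
    per_line_delay_us = shutter_skew_us / line_count  # start delay between lines
    rel = t_us - exposure_offset_us
    if rel < 0:
        return None  # exposure has not started yet this frame
    # index of the most recently started line, clamped to the last line
    line = min(int(rel // per_line_delay_us), line_count - 1)
    elapsed = rel - line * per_line_delay_us
    if elapsed > line_exposure_us:
        return None  # that line has already finished exposing (shutter closed)
    return line, elapsed
```

For example, 5μs after the sync pulse (with a 2μs per-line start delay) the newest exposing line is line 2, which has been exposing for 1μs.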

@palemieux
Member

@MadlyFX Any example data available?


4 participants