Model intra-frame events as deltas from frame start time #36
Note that I totally made up the example numbers and the general concept of what's going on here from an LED perspective. If you have sensor expertise, please recommend corrections.
We should agree on what "frame start time" means, or decide that each manufacturer can define their own as long as it's used consistently.
I am trying to capture that, from an IC-VFX perspective, we want sensor exposures and LED display 'lit' time to line up relative to system/house sync. You're right that we likely need more definition there: what the camera and display do relative to an incoming sync signal vs. what data/metadata is generated in response.
To confirm, the only important information is the offset from frame start, right? If so, why is absolute time important?
Just trying to incorporate the discussion from today re: PTP time in this data. From my perspective (ICVFX) we should likely only need to know offsets relative to a sync pulse.
Is the absolute time important for PTP to replace timecode, or are there other techniques for that?
There are definitely several approaches to this, I think. If you know all the sensor parameters and you know the exposure offset from a sync pulse (both as Ritchie said), then you can model the current status of the scan locally, even in an online context. The toolkit might have a static function that takes a struct of these values and returns the current line being scanned and how long it's been scanning for. Alternatively, the camera could perhaps send a 1D array of 0.0-1.0 float values that is the size of the vertical resolution, representing the exposure time of each line. This isn't far off from how some camera APIs send histograms in realtime.
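A minimal sketch of what that "static function" idea could look like. All names, units, and the struct layout are assumptions for illustration, not part of any existing toolkit API; the model is the simple linear rolling-readout one discussed in this thread (line i starts its exposure i × line_delay after frame start).

```python
from dataclasses import dataclass

# Hypothetical parameter struct; all fields and units (microseconds) are
# assumptions made for this sketch.
@dataclass
class SensorTiming:
    frame_period_us: float  # total frame period
    exposure_us: float      # per-line integration time
    line_delay_us: float    # start-delay between successive lines
    lines: int              # vertical resolution

def scan_state(timing: SensorTiming, t_since_sync_us: float):
    """Given elapsed time since the frame-start sync pulse, return
    (current line, how long that line has been exposing), under a
    simple linear rolling-shutter model."""
    # Wrap into the current frame.
    t = t_since_sync_us % timing.frame_period_us
    # Line i begins exposing at i * line_delay_us after frame start.
    line = min(int(t // timing.line_delay_us), timing.lines - 1)
    exposing_for = t - line * timing.line_delay_us
    # A line never integrates longer than the exposure time.
    return line, min(exposing_for, timing.exposure_us)
```

For example, with a 2μs per-line delay, 101μs after the sync pulse line 50 has been exposing for 1μs. This is purely a local model; it assumes the sync pulse marks the start of line 0's exposure, which is exactly the "frame start time" definition question raised above.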
@MadlyFX Any example data available?
For VFX and OSVP realtime use, we would like to know intra-frame timing of the sensor exposure.
In VFX I believe this is helpful to characterize horizontal motion skew caused by rolling shutter delays.
In OSVP this helps to ensure that the (LED) display system is active while the shutter is open/active to reduce banding artifacts.
Per Tucker's comments, it does seem helpful to use the 'metadata' terminology for static data, and 'data' for dynamic fields that are expected to meaningfully change over time.
Dynamic data
Static metadata
Note that this does not model multi-exposure "HDR" modes.
Example
2160p23.98 180° rolling shutter example (made up numbers):
41700μs total frame period
=> total integration time: 20850μs (180° exposure) + 4320μs (rolling readout offset) => 25170μs
=> start delay between lines: 4320μs / 2160 lines => 2μs per line
=> shutter closed time: 41700μs - 25170μs => 16530μs
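The arithmetic above can be checked directly. This is just a recomputation of the made-up example numbers (values in microseconds); the variable names are illustrative only.

```python
# 2160p23.98, 180° rolling shutter, made-up numbers from the example above.
frame_period = 41700            # total frame period in μs (~1/23.98 s)
lines = 2160                    # vertical resolution
exposure = frame_period / 2     # 180° shutter => half the frame period: 20850.0
readout_offset = 4320           # rolling readout span: delay from first to last line start

line_delay = readout_offset / lines            # start delay between lines: 2.0 μs
total_integration = exposure + readout_offset  # window in which any line is exposing: 25170.0 μs
shutter_closed = frame_period - total_integration  # 16530.0 μs

print(line_delay, total_integration, shutter_closed)
```

Note that "total integration time" here means the span during which at least one line is exposing (first line's start to last line's end), not the per-line exposure, which is why the 4320μs readout offset is added to the 20850μs exposure.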