EPNP (Poser)
The lighthouse and sensor system fits nicely into the PNP problem domain.
From the light data, we can derive the angle on each sweep plane from (roughly) the center of the lighthouse to the sensor. The two planes give us a pair of angles, which together describe a line in space that passes through both the sensor and the lighthouse. Since both planes are modeled as intersecting at a single point, those angles typically fall in the range [-FOV/2, FOV/2] -- although the fact that they are centered is somewhat arbitrary and set that way for convenience.
This tracks loosely with a pinhole camera model, but not exactly: in a typical application of PNP to pinhole cameras, you are given pixel coordinates for each correspondence, whereas here we have angles. This actually makes the math simpler.
Whereas with a pinhole camera you need to calibrate a camera matrix, there is no sensor or lens in our case, which means there are no pixel values per se. However, if you imagine a plane exactly 1 meter (or whatever unit your 3D points are in) in front of the lighthouse, you can solve for where the line given by the angles intersects that plane.
Recall that
tan(angle) = opposite / adjacent
tan(angle) * adjacent = opposite
and since the adjacent side -- the segment from the center of the lighthouse to the center of the plane -- was set at 1 meter, the position for each coordinate is simply:
px = tan(ang_x)
py = tan(ang_y)
Note that these aren't technically pixel values, but they can effectively be used as such. Also note that because we placed our plane 1 m out, the focal length is 1 and the camera matrix is simply the identity.
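The angle-to-"pixel" conversion above can be sketched as follows; the check at the end confirms that, with the plane at 1 unit, the tangent of the sweep angles is exactly the classic perspective projection x/z, y/z (the sensor coordinates here are hypothetical, chosen just for the sanity check):

```python
import math

def sweep_angles_to_pseudo_pixel(ang_x, ang_y):
    """Intersect the ray defined by the two sweep angles with an
    imaginary plane 1 unit in front of the lighthouse. With the
    adjacent side fixed at 1, tan(angle) = opposite / 1."""
    return math.tan(ang_x), math.tan(ang_y)

# Sanity check against a plain perspective projection: a sensor at
# (x, y, z) in the lighthouse frame subtends angles atan2(x, z) and
# atan2(y, z), and its pseudo-pixel is (x/z, y/z).
x, y, z = 0.3, -0.2, 2.0
px, py = sweep_angles_to_pseudo_pixel(math.atan2(x, z), math.atan2(y, z))
assert abs(px - x / z) < 1e-12 and abs(py - y / z) < 1e-12
```

This is why the pseudo-pixels can be fed to any off-the-shelf PNP solver unchanged: they already live on the normalized image plane.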
Typical PNP implementations will also allow you to model radial distortion and the like. While the lighthouses are not idealized systems, I somewhat doubt that the distortions seen in camera lenses track particularly closely with those of a lighthouse, so I recommend not using those terms at all.
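Putting it together, a minimal sketch of how the inputs to a PNP solver would look: an identity camera matrix, all-zero distortion coefficients, and the pseudo-pixels from the tangents of the angles. The sensor positions and pseudo-pixel values below are hypothetical placeholders, and the solver call is shown as a comment since it assumes OpenCV is installed:

```python
import numpy as np

# The imaginary plane at 1 unit gives a focal length of 1, so the
# camera matrix is identity and no distortion is modeled.
camera_matrix = np.eye(3)
dist_coeffs = np.zeros(5)  # explicitly zero: don't model lens distortion

# Hypothetical correspondences: known sensor positions on the tracked
# object (object coordinates) and matching pseudo-pixels tan(ang_x), tan(ang_y).
object_points = np.array([[0.00, 0.00, 0.00],
                          [0.05, 0.00, 0.00],
                          [0.00, 0.05, 0.00],
                          [0.03, 0.03, 0.02]])
image_points = np.array([[0.150, -0.100],
                         [0.175, -0.098],
                         [0.151, -0.075],
                         [0.165, -0.085]])

# With OpenCV available, the lighthouse-to-object pose would follow from:
#   ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
#                                 camera_matrix, dist_coeffs,
#                                 flags=cv2.SOLVEPNP_EPNP)
```

Passing an all-zero distortion vector tells the solver to skip the distortion model entirely, which matches the recommendation above.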