Using Pylon capture to track an LED #997
Replies: 4 comments 2 replies
-
Hey,
Can you please include the Bonsai workflow you are currently using? Additionally, can you include details of your debugging experiment, such as:
* How are you planning on tracking the LED? (Maybe include a short video clip?)
* How are you generating the TTLs?
* How are you making sure the number of TTLs sent is the correct one?
* Have you managed to acquire the correct number of frames using the same TTL strategy but recording with the camera's provided software?
Cheers,
B
-
* How are you planning on tracking the LED? (Maybe include a short video clip?)
Bonsai is being used to track the LEDs with the attached workflow. Essentially, we use Bonsai to keep only the red pixels, then find the largest red region and compute its centroid. That data is then sent to Open Ephys via OSC outputs that are read by the Tracking plugin. We also have Bonsai writing a CSV with the position and timestamps.
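For reference, the pixel-processing step can be sketched outside Bonsai as a minimal NumPy example (illustrative only; the real workflow uses Bonsai's vision operators, and this sketch skips the largest-region selection and takes the centroid of all red pixels):

```python
import math
import numpy as np

def red_led_centroid(frame, red_min=200, other_max=100):
    """Centroid (x, y) of bright-red pixels in an RGB frame, or
    (nan, nan) when nothing passes the threshold -- which is exactly
    the kind of result that fills a position CSV with NaNs."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    mask = (r >= red_min) & (g <= other_max) & (b <= other_max)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return (math.nan, math.nan)
    return (float(xs.mean()), float(ys.mean()))

# Synthetic 100x100 frame with a small red blob centred at x=30, y=40
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[38:43, 28:33, 0] = 255
print(red_led_centroid(frame))  # → (30.0, 40.0)
```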
My observations are that most of the CSV rows for position are zeros or NaNs, though there is one row per captured frame. I also cannot work out how to interpret the `packet_timestamps.csv` file: the intervals between entries are inconsistent, and no choice of units (ns, µs, ms, s) makes sense.
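As a quick sanity check on the CSVs, something like the following quantifies how many rows are usable and whether the timestamp intervals cluster around 1/fps (the stand-in arrays below are made up; the real data would be loaded from Bonsai's CsvWriter output):

```python
import numpy as np

# Hypothetical stand-ins for the position and timestamp columns
positions = np.array([[np.nan, np.nan],
                      [0.0, 0.0],
                      [31.5, 40.2],
                      [np.nan, np.nan]])
timestamps = np.array([0.0, 0.021, 0.039, 0.061])  # seconds, one per frame

nan_rows = np.isnan(positions).any(axis=1)
zero_rows = (positions == 0).all(axis=1)
valid = ~(nan_rows | zero_rows)
print(f"valid fixes: {valid.sum()} / {len(positions)}")

# If the timestamps are sane, the inter-frame intervals should cluster
# tightly around 1/fps (20 ms at 50 Hz).
dt = np.diff(timestamps)
print(f"median interval: {np.median(dt)*1e3:.1f} ms, "
      f"spread: {dt.max() - dt.min():.3f} s")
```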
When I captured frames from PylonViewer (the camera manufacturer's software), I noticed that the default settings (with a frame rate of 50 Hz) produced numerous errors: "Fatal error: The buffer was incompletely grabbed. This can be caused by performance problems of the network hardware used" (https://www.baslerweb.com/en/sales-support/knowledge-base/frequently-asked-questions/what-does-the-error-code-3774873620-0xe1000014-mean/15246/). We eliminated the errors by increasing the inter-packet delay, as Basler recommends. This reduced the lag observed during recording, but did not significantly decrease the NaN or zero entries in Bonsai's position CSV.
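For context, a back-of-the-envelope bandwidth estimate shows why bursty GigE traffic can overwhelm some network hardware at these settings, and why spreading packets out with an inter-packet delay helps. The pixel format here is an assumption (Bayer 8-bit at 1 byte/pixel; a YUV 4:2:2 format would double the figure):

```python
# Rough GigE Vision bandwidth check for an assumed 1280x1024 sensor
# at 50 fps with an assumed 1-byte-per-pixel (Bayer 8-bit) format.
width, height, bytes_per_px = 1280, 1024, 1
fps = 50
payload_bps = width * height * bytes_per_px * fps * 8  # bits per second
gige_bps = 1e9
print(f"payload: {payload_bps/1e6:.0f} Mbit/s "
      f"({100 * payload_bps / gige_bps:.0f}% of a gigabit link)")
```

Even at ~50% average utilisation, frames are transmitted in bursts, so without an inter-packet delay the instantaneous rate can exceed what the NIC or switch buffers can absorb.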
* How are you generating the TTLs?
We have a Basler acA1300-75gc that outputs TTLs on a GPIO line feeding directly into the Open Ephys I/O connector board (see the attached .pfs file). We see the TTLs in the LFP display as highlighted regions, so we're sure we're receiving them.
* How are you making sure the number of TTLs sent is the correct one?
The TTL pulses recorded by Open Ephys are evenly spaced in time, consistent with the frame rate I chose in the camera settings (.pfs) file.
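This spacing check can be made explicit with a small script (the event-time array below is synthetic; in practice it would come from the Open Ephys event timestamps):

```python
import numpy as np

def ttl_rate(event_times_s, tol=0.01):
    """Return the median TTL rate (Hz) and whether the pulses are
    evenly spaced (max jitter below `tol` of the median interval)."""
    dt = np.diff(np.asarray(event_times_s, dtype=float))
    median_dt = float(np.median(dt))
    regular = float(dt.max() - dt.min()) < tol * median_dt
    return 1.0 / median_dt, regular

# Synthetic 50 Hz pulse train, 100 pulses
times = np.arange(100) / 50.0
fps, ok = ttl_rate(times)
print(round(fps, 3), ok)  # → 50.0 True
```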
* Have you managed to acquire the correct number of frames using the same TTL strategy but recording with the camera's provided software?
The camera's provided software (PylonViewer) does not have a way to see the TTLs that are output from the GPIO line, and I don't know how to get this information into the computer. The GPIO is a ±5 V digital signal delivered through a bare wire without any connector. The only interface I have is the Open Ephys I/O board, where I soldered the signal and ground to a BNC plug, so I can currently only see the TTLs via Open Ephys.
Lilliana M. Sanchez, M.S.
Doctoral Student Researcher
The University of New Mexico
Department of Psychology
Albuquerque, NM, 87131
researchgate.net/profile/Lilliana_Sanchez
On Aug 8, 2022, at 12:26 AM, brunocruz wrote:
-
Heya. Regarding the CSV writer, it would be really helpful if you could include your current workflow.
-
I believe this is what you are looking for. To answer your question: it is not the number of frames or the video duration; rather, the physical field of view (what the camera can see) is halved.
Thank you,
Lilliana
On Aug 10, 2022, at 3:18 AM, brunocruz wrote:
Additionally, regarding "However, with that module, the video (uncropped) was only half of what the camera was capturing in PylonViewer": do you mean when looking at the video duration, or the number of frames?
-
I am using Bonsai to track an LED. Most of the coordinates output by the tracking algorithm were NaNs. Also, far fewer frames are output than there should be, given the frame rate and the number of TTLs we received. I've narrowed the problem down to the PylonCapture module itself.
I've seen it recommended that the CameraCapture module could help. However, with that module, the (uncropped) video showed only half of what the camera was capturing in PylonViewer. Since the camera can't be connected to Bonsai and PylonViewer at the same time, I can't use PylonViewer to control the camera and emit a TTL. I've got the frame rate down to 60 Hz, so bandwidth shouldn't be much of an issue.
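One way to pin down the frame deficit is to compare the recorded frame count against two independent expectations, the TTL count and duration × fps (a sketch with made-up numbers; the function and its arguments are illustrative):

```python
def frame_deficit(n_frames_recorded, n_ttls, fps, duration_s):
    """Compare the frames actually written by Bonsai against two
    independent expectations: the TTL count and duration * fps."""
    expected_from_clock = round(duration_s * fps)
    return {
        "recorded": n_frames_recorded,
        "expected_from_ttls": n_ttls,
        "expected_from_clock": expected_from_clock,
        "dropped_vs_ttls": n_ttls - n_frames_recorded,
    }

# e.g. a 10 s recording at 60 Hz where Bonsai only kept 412 frames
print(frame_deficit(412, 600, 60, 10.0))
```

If the two expectations agree with each other but not with the recorded count, the camera and TTL side are consistent and the drops are happening in the capture/writing path.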
Do you have any suggestions for how to work around the issue?
My settings file is tracking_ephys_working.xml.zip