RTP streaming issues with the DJI Mini SE #1
I'm aware of the project, but haven't checked it out recently. It's been a couple of years, but I think I can catch up when I have free time.
Well well well... I've been encountering this same error working on the DJI side too. Same stack: a project I'm working on has RTSP, and we see some choppiness coming in from the video on the DJI side. (I'm working on this same issue in one of my DJI projects, actually, so I'll clue you in...) Here's what you can do:
Unfortunately I can't share source due to the proprietary stuff I'm working on :/ The best I can do is share clues from the above.
Hi @roguestarslade, I forked your repo and added code for RTP streaming in a minimal commit: RosettaDrone@9d31891. This code is supposed to work fine for other models, but not for the Mini.
What I found strange is that the sample code obtains a keyframe using this magical
I updated the code to enable RTP streaming with unicast (multicast didn't work for us). In our code, we split the NALs to solve the problem of sending too-big UDP packets, which otherwise get fragmented automatically and received without headers. In this commit, I added a boolean "splitNALs" that can be toggled from the debugger to disable the NAL splitting at runtime.
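For illustration only, splitting an Annex B H.264 buffer at its 00 00 01 / 00 00 00 01 start codes can be sketched as below. This is a simplified stand-in for the `splitNALs` logic being discussed, not the actual code from the commit; class and method names are made up.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: split an Annex B H.264 buffer into individual NAL
// units by scanning for 00 00 01 / 00 00 00 01 start codes. Each returned
// chunk is one complete NAL unit, with its start code stripped.
public class NalSplitter {
    public static List<byte[]> split(byte[] buf) {
        List<byte[]> nals = new ArrayList<>();
        int start = -1; // index of the first byte after the current start code
        for (int i = 0; i + 2 < buf.length; i++) {
            if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
                if (start >= 0) {
                    // drop a trailing zero that belongs to a 4-byte start code
                    int end = (i > 0 && buf[i - 1] == 0) ? i - 1 : i;
                    nals.add(java.util.Arrays.copyOfRange(buf, start, end));
                }
                start = i + 3;
                i += 2; // skip past the start code we just matched
            }
        }
        if (start >= 0) {
            nals.add(java.util.Arrays.copyOfRange(buf, start, buf.length));
        }
        return nals;
    }
}
```

Each resulting unit can then be packetized individually, which avoids handing oversized multi-NAL buffers straight to a UDP socket.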
Hey @belveder79, check this out.
hmmm... @roguestarslade, my assumption is that at this point (Line 168 in fe6fa10) the stream from the drone was received correctly, which means the buffer contains valid NAL units (and only complete NAL units). What we do next is hand that buffer over to a streaming library; this library creates a new RTP stream out of it and wraps it into RTSP, so the underlying library is responsible for not overshooting the packet size that can be reliably used over TCP or UDP. No custom code is required to split NAL units... if the splitting in the streaming library itself didn't work (or the reassembly on the receiving end), you would never get a frame at all.

Now, an obvious issue is that if TCP is not used, packets might get lost or arrive out of order, which is a network issue you can't really do anything about. Assuming your receiving device is actually close to the stream destination, or you have a good-quality connection between the receiving device (the DJI app) and your final endpoint, issues like that are unlikely to occur, imho.

What is less obvious (and I did not dive into that too much, to be honest) is the case where my assumption about the status of the stream at the position mentioned above is wrong. Say there are valid NAL units, but also incomplete ones, because for whatever reason the buffer is just a snapshot of the underlying receive buffer from DJI. Whatever comes after the last valid NAL unit would then be discarded, and you would lose some data (namely, exactly the unit that falls on the border between two calls handing the message over to the streaming library). That would be pretty unusual behaviour that would never occur if you got the stream straight from an encoder, but who knows? (See DJI-Android-VideoStreamDecodingSample/android-videostreamdecodingsample/jni/dji_video_jni.c, Line 181 in fe6fa10.)

I will have a look at this... probably this is what happens, at least on our side.
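For context on what such a streaming library does internally when a NAL unit exceeds the payload size usable over UDP: per RFC 6184 it fragments the unit into FU-A packets, each carrying an FU indicator and FU header with start/end bits so the receiver can reassemble the original unit. A minimal sketch of that fragmentation, assuming a bare NAL unit as input (this is not the code of any library mentioned in this thread):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of RFC 6184 FU-A fragmentation: split one H.264 NAL
// unit that is larger than maxPayload into FU-A payloads. Each fragment is
// prefixed with an FU indicator (NRI bits + type 28) and an FU header
// (start/end bits + original NAL type).
public class FuAFragmenter {
    public static List<byte[]> fragment(byte[] nal, int maxPayload) {
        List<byte[]> out = new ArrayList<>();
        if (nal.length <= maxPayload) { // fits in a single packet: send as-is
            out.add(nal);
            return out;
        }
        byte fuIndicator = (byte) ((nal[0] & 0x60) | 28); // NRI + type 28 (FU-A)
        byte nalType = (byte) (nal[0] & 0x1F);
        int offset = 1; // skip the original 1-byte NAL header
        while (offset < nal.length) {
            int chunk = Math.min(maxPayload - 2, nal.length - offset);
            byte[] pkt = new byte[chunk + 2];
            pkt[0] = fuIndicator;
            byte fuHeader = nalType;
            if (offset == 1) fuHeader |= (byte) 0x80;           // S (start) bit
            if (offset + chunk == nal.length) fuHeader |= 0x40; // E (end) bit
            pkt[1] = fuHeader;
            System.arraycopy(nal, offset, pkt, 2, chunk);
            out.add(pkt);
            offset += chunk;
        }
        return out;
    }
}
```

The point of the comment above stands: if this layer were broken, a receiver could never reassemble any complete frame, so partial corruption usually points elsewhere.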
Yes, the frames are correctly enqueued, dequeued, decoded and presented on the screen. Even when you discard 59 frames out of 60, you still get a perfect video on the screen. Have you tried my fork? What aircraft model are you using?
No, I did not, as I was just on the project about the streaming... What do you mean by discarding 59 out of 60 frames? Are you talking about NAL units?
We did some tests simply ignoring the callback that receives the buffer from the SDK (e.g. processing the data only once every 60 calls), and the video still showed almost perfectly on the screen. My understanding is that this buffer contains multiple NAL units and that they must be split before sending them via UDP.
Sorry, I think there is a misunderstanding, or we are talking past each other, because what you are saying does not make any sense to me... Of course you need to split NAL units, even individual ones, so that they can be put into RTP packets meeting the requirements for reliable network transfer. However, you cannot just skip calls to DJI-Android-VideoStreamDecodingSample/android-videostreamdecodingsample/jni/dji_video_jni.c (Line 114 in fe6fa10) and claim to get a perfect video; otherwise the data would have to be redundant, which it certainly is not...
We are parsing the same way as the example. We split and send the NALUs in the message-dequeue code, just before the buffer is handed to the MediaCodec decoder for on-screen rendering. I'm not saying the data is redundant; I'm confirming that the video buffer is received correctly, that it is tolerant, and that whatever problem we experience on the RTP receiver is not caused by missing or dropped packets, but by some kind of corruption we are probably introducing in the NALU-splitting process, which may be incompatible with the specific format of the Mini SE (other models work fine). Please note I don't know how you are streaming RTP and haven't seen your code.
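One way to localize this kind of splitter corruption is to log the H.264 NAL unit type (the low 5 bits of the first byte after the start code) of every unit the splitter produces, and compare the sequence between a working model and the Mini SE. A minimal, hypothetical helper for that (names are made up, not from either repo):

```java
// Hypothetical debugging helper: report the H.264 NAL unit type encoded in
// the low 5 bits of the first byte of a NAL unit. Logging these types for
// every unit a splitter emits makes it easy to spot where splitting breaks
// the stream (e.g. SPS/PPS missing, or garbage types appearing).
public class NalTypeProbe {
    public static String nalTypeName(byte header) {
        switch (header & 0x1F) {
            case 1:  return "non-IDR slice";
            case 5:  return "IDR slice (keyframe)";
            case 6:  return "SEI";
            case 7:  return "SPS";
            case 8:  return "PPS";
            default: return "type " + (header & 0x1F);
        }
    }
}
```

If the Mini SE buffers show unit types that never appear on other models, that would support the format-difference theory.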
Ah, ok, now I get it... Quickly browsing through your code, I realized that you never call what does the
Are you receiving the stream correctly? What model are you testing?
We can receive the stream from a Mavic 2 Pro and a Mini 2.
Thanks. Can you please confirm you are using my fork (https://github.com/kripper/DJI-Android-VideoStreamDecodingSample) and which gstreamer command line you used to receive the stream?
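For reference, a typical GStreamer command line for receiving raw RTP/H.264 over UDP looks like the following. The port and payload type here are assumptions and must match whatever the sender uses; this is not necessarily the pipeline anyone in this thread ran.

```shell
# Receive RTP/H.264 over UDP, depacketize, decode, and display.
# port=5600 and payload=96 are assumed values; match them to the sender.
gst-launch-1.0 udpsrc port=5600 \
  caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=false
```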
@roguestarslade, can you please confirm whether you are referring to my fork and which gstreamer command line you used to receive the stream? Thanks.
@roguestarslade Just in case, I added detailed instructions on how to test the RTP streaming:
Hi @roguestarslade,
Have you tried Rosetta Drone?
https://github.com/The1only/rosettadrone
I came to your fork because I'm working on fixing the video stream decoding for the Mini.
We are receiving the decoded buffer on
onDataRecv()
and then send it via RTP. But I believe we are doing something wrong in this process:
RosettaDrone/rosettadrone#27
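For anyone reproducing this: "sending via RTP" means each UDP payload must be prefixed with a 12-byte RTP header (RFC 3550) carrying an increasing sequence number and timestamp. A minimal sketch, assuming the dynamic payload type 96 commonly used for H.264 (this is illustrative, not the code from either repo):

```java
import java.nio.ByteBuffer;

// Hypothetical sketch: build a minimal 12-byte RTP header (RFC 3550) for an
// H.264 payload. The sequence number must increase per packet and the
// timestamp per frame (90 kHz clock for video).
public class RtpHeader {
    public static byte[] build(int seq, long timestamp, int ssrc, boolean marker) {
        ByteBuffer b = ByteBuffer.allocate(12);
        b.put((byte) 0x80);                        // V=2, P=0, X=0, CC=0
        b.put((byte) ((marker ? 0x80 : 0) | 96));  // M bit + payload type 96
        b.putShort((short) seq);                   // sequence number
        b.putInt((int) timestamp);                 // 90 kHz media timestamp
        b.putInt(ssrc);                            // stream source identifier
        return b.array();
    }
}
```

A receiver uses the sequence numbers to detect loss and reordering, which is exactly the failure mode discussed above for UDP transport.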