Configure CameraRtmpLiveStreamer to write to a file in parallel to streaming to an RTMP server? #63
Is there a way to configure the CameraRtmpLiveStreamer to write the stream to a file? Maybe by exposing the `muxer` property, so we could pass in `writeToFile = true`?
Hi, I have been thinking about this feature for a while and still can't make up my mind about it, for several reasons: 1/ live streaming is already a lot for a device to handle (especially for long lives). Asking more of the device could be overwhelming (especially for a low-cost device). As I focus on video and audio quality more than on features, I guess I won't implement this soon. Do you plan to record on all devices?
@ThibaultBee thank you very much for your input on the topic. The idea is to have the recorded file as a backup in case the streaming didn't go well. The recorded file can then be watched by the interested parties. So, yes, we want to record on all devices. It will be one and the same mid-range device.
I understand the idea, but I don't have time to think about it or to develop it. There are loads of things I want to tackle before. And you can already do it on your own from
Hello @ThibaultBee, I'm trying to implement the logic using
Hi, I don't know if you have read this: https://github.com/ThibaultBee/StreamPack/blob/main/DEVELOPER_README.md Also, I have an old branch where I worked on a muxer based on MediaMuxer. See https://github.com/ThibaultBee/StreamPack/blob/experimental/mediamuxer/core/src/main/java/io/github/thibaultbee/streampack/internal/muxers/mediamuxer/MediaMuxerEndpoint.kt (Unfortunately, I do not remember the state of the MediaMuxer branch.)
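For anyone landing here, this is roughly what a MediaMuxer-based endpoint boils down to. A minimal sketch, assuming the encoders already produce compressed buffers; `FileEndpoint` and its callbacks are illustrative names, not StreamPack's actual interfaces:

```kotlin
import android.media.MediaCodec
import android.media.MediaFormat
import android.media.MediaMuxer
import java.nio.ByteBuffer

// Hypothetical file endpoint: receives already-encoded frames and writes
// them to an MP4 container with Android's MediaMuxer.
class FileEndpoint(outputPath: String) {
    private val muxer = MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
    private var videoTrack = -1
    private var started = false

    // Call once the encoder reports its output format (INFO_OUTPUT_FORMAT_CHANGED).
    fun onVideoFormat(format: MediaFormat) {
        videoTrack = muxer.addTrack(format)
        muxer.start()
        started = true
    }

    // Call for each encoded buffer dequeued from the video encoder.
    fun onVideoFrame(buffer: ByteBuffer, info: MediaCodec.BufferInfo) {
        if (started) muxer.writeSampleData(videoTrack, buffer, info)
    }

    fun release() {
        if (started) muxer.stop()
        muxer.release()
    }
}
```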
I also need to support this functionality. As far as I can tell, there are a couple of limitations to this approach, IIUC:
* The audio and video config must necessarily be the same for both streaming and recording. This is not desirable. In our scenarios, the ability to record is a workaround for limited network bandwidth: users can only stream at very limited bitrates with lower resolution, but if we can record at a higher resolution, the user still has a high-res copy for later manipulation.
* The stream and the recording must be turned on and off conjointly, whereas for our users it is more useful to toggle them independently. In particular, we allow the user to change the resolution of their RTMP stream on the fly, which of course requires re-cycling the RTMP connection. This creates a gap in the livestream. We would prefer not to have a gap in the recording, if possible.
I am taking a different approach, which is to enable multiple ICameraStreamers to share the same CameraSource, starting by making ISurfaceSource keep a list of encoder surfaces. I realize that this will result in separate simultaneous encodings, and that may not work on some or all devices.
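For reference, a minimal sketch of that multi-surface idea with the camera2 API, assuming each Surface comes from a separate encoder's createInputSurface(); the helper functions are hypothetical:

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.view.Surface

// Sketch: one camera feeding several encoder input surfaces at once.
// Note: many devices cap the number and size of concurrent camera outputs,
// so this may not be feasible everywhere.
fun createMultiTargetSession(
    camera: CameraDevice,
    encoderSurfaces: List<Surface>,
    callback: CameraCaptureSession.StateCallback
) {
    // All target surfaces must be declared when the session is created...
    camera.createCaptureSession(encoderSurfaces, callback, null)
}

fun buildRepeatingRequest(camera: CameraDevice, encoderSurfaces: List<Surface>) =
    camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD).apply {
        // ...and each one added as a target of the repeating request.
        encoderSurfaces.forEach { addTarget(it) }
    }.build()
```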
This is exactly what I expect to implement, but it is not an easy task. Recording should not suffer from a bad live stream.
It is possible to run multiple encoders of the same type at the same time. I am under the impression that every type of encoder can run multiple sessions, and as encoders come with the SoC, it will work on all devices. I did something like this about 4 years ago.
The issue with having multiple encodings is not an encoder limitation. It is the heat. Running (encoder * GPU * CPU) * 2 + modem + camera + screen will make the phone heat up a lot. To protect itself, the phone activates an internal security mechanism, CPU throttling, and the result is lower performance (missing frames, ...). The problem already exists with very long lives on a phone. That's mostly why I haven't developed this feature. I don't like either choice.
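A minimal sketch of what two simultaneous encoder sessions could look like with MediaCodec, one configured for a constrained stream and one for a high-res local recording; the resolutions and bitrates are made-up examples:

```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat

// Two H.264 encoder sessions with independent settings, as would be needed
// for a low-bitrate stream plus a high-resolution local recording.
fun createEncoder(width: Int, height: Int, bitrate: Int): MediaCodec {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height).apply {
        setInteger(MediaFormat.KEY_BIT_RATE, bitrate)
        setInteger(MediaFormat.KEY_FRAME_RATE, 30)
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)
        setInteger(
            MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
        )
    }
    return MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC).apply {
        configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
    }
}

val streamEncoder = createEncoder(1280, 720, 1_500_000)   // constrained uplink
val recordEncoder = createEncoder(1920, 1080, 8_000_000)  // local high-res copy
// Each encoder's createInputSurface() is then attached to the camera session.
```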
If there is a way to recognize when heat-motivated throttling is occurring, I can encourage my users to make good choices, i.e. buy a special case with a fan.
Never tested, but https://developer.android.com/reference/android/os/PowerManager.html#getCurrentThermalStatus() could help.
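A minimal sketch of how that could be wired up, assuming API 29+; treating SEVERE as the warning threshold is a guess, not a documented rule:

```kotlin
import android.content.Context
import android.os.Build
import android.os.PowerManager

// Watch the thermal status (API 29+) and warn the user before
// throttling starts degrading the encoders.
fun watchThermalStatus(context: Context, onWarning: (Int) -> Unit) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) return
    val pm = context.getSystemService(Context.POWER_SERVICE) as PowerManager
    pm.addThermalStatusListener { status ->
        // SEVERE and above is roughly where frame drops tend to start.
        if (status >= PowerManager.THERMAL_STATUS_SEVERE) onWarning(status)
    }
}
```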
After a great deal of learning and hacking while trying to get this working, I have come to a reset point. There are a couple of problems, one internal, one external:
After encountering this, I tried a couple of things. The most optimistic was to re-enable the "does not have a surface" video encoder pathway, in the hope that I could send frames to the encoder the old-fashioned way, in software, as is done with audio encoding. But the CameraSource does not support getFrame, so, yeah, that won't fly. I am taking a step back and am going to try a different approach, which addresses both of the above:
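For context, a rough sketch of that ByteBuffer ("no surface") input path with MediaCodec; `queueFrame` is a hypothetical helper, and it only works if the source can hand over raw frames, which is exactly what CameraSource lacks:

```kotlin
import android.media.MediaCodec

// Feed one raw frame into an encoder configured without an input surface:
// the app copies the frame into the codec's input buffer by hand.
fun queueFrame(encoder: MediaCodec, frame: ByteArray, ptsUs: Long) {
    val index = encoder.dequeueInputBuffer(10_000) // 10 ms timeout
    if (index >= 0) {
        val buffer = encoder.getInputBuffer(index) ?: return
        buffer.clear()
        buffer.put(frame)
        encoder.queueInputBuffer(index, 0, frame.size, ptsUs, 0)
    }
}
```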
I welcome your thoughts on all of the above, and thank you for all your hard work in getting StreamPack to do what it does. I will speak with my employer about making a well-deserved contribution once we ship the Android app. As a related aside, I have already shipped an app on iOS that does dual output with independent resolution settings. It uses HaishinKit.swift, which supports this directly. I certainly don't understand how it does what it does, because I did not need to modify it. Our users absolutely love the feature, which is why I am investing the time and energy in getting it working before shipping our Android version.
Hi, wah, this is a very long message. Indeed, using a lot of surfaces directly on the camera won't be compatible with many devices. I haven't commented the internal code, sorry about that.
Haha, thanks, and yes, I talk a lot. In the startup world, it's pretty rare that a problem consumes an entire week of my life. Most PRs take, like, 2 days at most.
@cdiddy77 Did you have any success in saving videos locally?