Abstract
This project is a research effort, seeking to understand the tangible synergies available between protocols / projects / infrastructure components running (ultimately) on Ethereum as their base layer.
Much has been made of the benefits of co-locating the contracts (i.e. the root logic, the core axioms, the DNA) of protocols on a single paradigm of blockchain network (e.g. the EVM), sometimes known as "protocol-level composability".
It remains to be seen, however, whether this inevitably weighs too heavily somewhere in such a system, or whether such co-location might actually empower it.
A concept is also presented to you, for your entertainment: the "auto-archiving livestreaming server", which is capable of 1) receiving livestreamed content (ultimately from a camera and microphone), 2) serving livestreamed content to happy viewers, and 3) automatically (automagically) archiving absolutely everything (if enabled; default=disabled).
The hack idea then, because this is ultimately a hack project, is to hack go-livepeer so that it can archive content segments to Swarm. All these segments of video are flying around the Livepeer network, through -broadcaster, -orchestrator and -transcoder nodes - there's got to be a way to get them stored somehow.
Motivation
Livepeer is streaming, Swarm is storage, both run on Ethereum - the synergies ought to be obvious. When people think about livestreaming, on Twitch or YouTube Live or Facebook Live or wherever, it is almost expected that the stream will also be stored. Livepeer, being a livestreaming platform, does just that: livestream... but not store. The storage must be provided via a separate infrastructural building block: one which is optimised for storing content. Livestreaming on Livepeer is ephemeral from the ground up - the "impermanent web", if you will. What it could benefit from is a sibling standing by to "permanentise" content whenever requested.
A big motivation is also to try to make this a crypto-only system: a "closed loop" of partnering service providers, all paying each other using the same system:
Broadcasters paying Orchestrators using Probabilistic Micropayments
Broadcasters paying "Bees" using storage-incentives
...all cleared on EVM-compatible networks, without a cent or a credit card in sight.
Another motivation I have is for the sake of the end-user. That underserved soul! The person performing the content, the person watching the content. You've got to imagine that if you make them maintain 2 (two) different addresses / keypairs / sets of words / credit cards to maintain their onchain identities and make onchain payments, then this is precisely 100% more complicated than if they must maintain only 1 (one) identity.
So the aim, effectively, is to create a closed-loop, crypto-only base infrastructure for a new media platform, where consumers can pay creators, with the infrastructure operators of the EVM-based protocols also earning along the way. A crypto win-win-win.
Proposed Solution
There's got to be a way to tap into this somewhere elegant and insert some lines of code, either:
a) in the -broadcaster process, where the .ts segments are written out to disk into ~/.lpData, add lines to either:
i) squirt a .ts file into an endpoint served by a co-located bee server - integrating with whatever API it provides (a clean integration, no AWS-type adapter), just a POST-like fire-and-forget (see the Go sketch after this list). Such an architectural approach could be ported to MistServer as-and-when.
ii) import appropriate code from https://github.com/ethersphere/bee and run some kind of bee-light server as part of go-livepeer, calling whatever "add to Swarm" function exists. This could be a pure golang implementation.
or, to also include any transcoded renditions:
b) in the -broadcaster process, where a .m3u8 HLS stream is served, push that content using either i) or ii) above.
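Here's a minimal sketch of option i), assuming bee's HTTP API on its default port 1633 and its /bzz upload endpoint with the Swarm-Postage-Batch-Id header, and assuming a funded postage batch ID is already in hand (economics below). The archiveSegment helper is hypothetical, not existing go-livepeer code:

```go
package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
	"os"
)

// beeAPI assumes a co-located bee node with its HTTP API on the default port.
const beeAPI = "http://localhost:1633"

// archiveSegment is a hypothetical helper: fire-and-forget upload of one
// finished .ts segment to Swarm via bee's /bzz endpoint. batchID is a
// pre-purchased postage batch (see the economics discussion below).
func archiveSegment(path, batchID string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return fmt.Errorf("read segment: %w", err)
	}
	req, err := http.NewRequest(http.MethodPost, beeAPI+"/bzz", bytes.NewReader(data))
	if err != nil {
		return err
	}
	req.Header.Set("Content-Type", "video/mp2t") // MPEG-TS
	req.Header.Set("Swarm-Postage-Batch-Id", batchID)
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return fmt.Errorf("upload to bee: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusCreated {
		return fmt.Errorf("bee returned %s", resp.Status)
	}
	return nil // the Swarm reference in the response body would be logged/indexed here
}

func main() {
	// Fire-and-forget: a Swarm hiccup must never stall the broadcast path.
	if err := archiveSegment("segment_001.ts", os.Getenv("SWARM_BATCH_ID")); err != nil {
		log.Printf("archive failed (stream continues): %v", err)
	}
}
```

The key design choice is that archiving stays off the critical path: errors are logged and dropped, never propagated back into segment handling.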
As for the economics - we still need to find a way for the livepeer -broadcaster to pay for the "postage stamps" so that the bee server will store a .ts segment.
go-livepeer already has an RPC endpoint onto Arbitrum. With some kind of cross-chain bridge (see this), it ought to be possible for the broadcaster (B) to elegantly auto-pay for auto-archiving, by signing its own "please store this" transactions. Definitely an elegant and neat implementation. Of course, if both protocols were on the same network, this part ought to be utterly trivial...
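As a sketch of what "auto-pay" could look like even today, without any bridge: a bee node can buy postage batches itself, signing the on-chain transaction with its own key, funded from its own xBZZ balance. The buyBatch helper below is hypothetical and assumes bee's debug API on its default port 1635 and its POST /stamps/{amount}/{depth} endpoint:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// beeDebugAPI assumes bee's debug API on its default port.
const beeDebugAPI = "http://localhost:1635"

// buyBatch is a hypothetical helper asking the local bee node to buy a
// postage batch on-chain: amount is the per-chunk balance (in PLUR), and
// the batch holds 2^depth chunks. The bee node signs and submits the
// transaction itself, so the broadcaster operator only has to keep that
// one keypair funded.
func buyBatch(amount string, depth int) (string, error) {
	url := fmt.Sprintf("%s/stamps/%s/%d", beeDebugAPI, amount, depth)
	resp, err := http.Post(url, "application/json", nil)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var out struct {
		BatchID string `json:"batchID"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.BatchID, nil
}

func main() {
	batchID, err := buyBatch("10000000", 20) // values illustrative only
	if err != nil {
		panic(err)
	}
	fmt.Println("new postage batch:", batchID)
}
```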
Implementation Tasks and Considerations
Find the places in go-livepeer -broadcaster code where:
.ts files, representing segments of source video, are written to disk
.ts files, representing source, plus any transcoded segments of video, are served on a web-server
Add code to put the segments into Swarm, however you think is easiest (one non-invasive option is sketched below).
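Before touching go-livepeer internals at all, a lazy-but-effective first hack could be a sidecar process watching ~/.lpData for freshly written .ts files. A sketch assuming the fsnotify library, with the actual Swarm upload stubbed out:

```go
package main

import (
	"log"
	"os"
	"path/filepath"
	"strings"

	"github.com/fsnotify/fsnotify"
)

func main() {
	home, err := os.UserHomeDir()
	if err != nil {
		log.Fatal(err)
	}
	watcher, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer watcher.Close()

	// Watch the broadcaster's data dir. In practice segments land in
	// per-stream subdirectories, each of which would need adding too.
	if err := watcher.Add(filepath.Join(home, ".lpData")); err != nil {
		log.Fatal(err)
	}

	for event := range watcher.Events {
		// Reacting to Create alone risks reading half-written files; a real
		// version would wait for the write to finish (e.g. on the next
		// segment's arrival) before uploading.
		if event.Op&fsnotify.Create != 0 && strings.HasSuffix(event.Name, ".ts") {
			log.Printf("would archive %s to Swarm here", event.Name)
		}
	}
}
```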
Testing Tasks and Considerations
Exercise starting / stopping streams a lot, to make sure there's no memory being leaked, no threads (or goroutines) left unshut-down / un-gc'd, and that the keepalive logic handles time-outs etc. intelligently.
Then there's of course the load... the elephant in the room... how much content, in terms of bytes, can be pumped into a Swarm endpoint? Suggest starting with 144p at 10fps - as low a bitrate as possible (a test-source command follows below) - and seeing how it goes. If it goes well, gradually increase the bitrate.
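For a reproducible lowest-bitrate source, something like ffmpeg's built-in test pattern should do - assuming the -broadcaster is ingesting RTMP on its default port 1935; the "movie" stream name is illustrative:

```
ffmpeg -re -f lavfi -i testsrc=size=256x144:rate=10 \
  -c:v libx264 -b:v 100k -g 20 -f flv rtmp://localhost:1935/movie
```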
Known Unknowns
It is unknown whether anyone else thinks this is a good idea to try, or whether it's only Chris that thinks this.
Alternatives
As a species, we just keep using YouTube, and perhaps gradually migrate to Twitch when YouTube becomes too censor-happy.
Additional Context
I've tried to come at this from multiple different angles, but this one feels most natural so far. Something about how go-livepeer and bee are both in golang makes me think it's the right idea. Both are also chain-aware.
Similar concepts may be possible if / when MistServer also has a connection to an RPC endpoint on Arbitrum - and in some ways, that could be a nice "first lift" integration, perhaps even before integrating with Orchestrators for Transcoding. WDYT @Thulinma?
But for other ways to apply pressure to this union, I've got these for you, all in one place:
ethersphere/bee#3042
ethersphere/storage-incentives#26
ethersphere/beekeeper#264
Hi @chrishobcroft, we're now trying to clean up some GH Issues. This one seems like a big chunk of research / work, so I'm closing it. Feel free to post it as a Livepeer Forum post (if you still find it valid). Thanks!