Help Setting Up (How to verify SRT input is working) #1660
-
I want to preface that I have never used OME before, so this could be a very dumb question, and I may have set it up wrong. I am using the default configuration; I will post the Server.xml file below, as well as the Docker run command I used to start OME.

I'm trying to stream AAC/H.264 samples from a Node.js app to OME. My goal is to get a Low-Latency HLS stream out of it. The best way I could figure out was to manually create the TS packets myself and then pipe that data into FFmpeg, which then sends it to the SRT port. I run my OME demo server as a Docker container on a remote Ubuntu 20.04 machine that I access over my VPN.

My program doesn't crash and it doesn't give me any errors, but I have no way of knowing whether OME is actually receiving the live stream. I've tried navigating to https://{ip}:3334/app/test/llhls.m3u8, but I don't think I have my certs set up right (I use the Docker container

Here is my Server.xml file (same for both edge and origin conf):

<?xml version="1.0" encoding="UTF-8"?>
<Server version="8">
<Name>OvenMediaEngine</Name>
<!-- Host type (origin/edge) -->
<Type>origin</Type>
<!-- Specify IP address to bind (* means all IPs) -->
<IP>*</IP>
<PrivacyProtection>false</PrivacyProtection>
<!--
To get the public IP address(mapped address of stun) of the local server.
This is useful when OME cannot obtain a public IP from an interface, such as AWS or docker environment.
If this is successful, you can use ${PublicIP} in your settings.
-->
<StunServer>stun.l.google.com:19302</StunServer>
<Modules>
<!--
Currently OME only supports h2 like all browsers do. Therefore, HTTP/2 only works on TLS ports.
-->
<HTTP2>
<Enable>true</Enable>
</HTTP2>
<LLHLS>
<Enable>true</Enable>
</LLHLS>
<!-- P2P works only in WebRTC and is an experimental feature -->
<P2P>
<!-- disabled by default -->
<Enable>false</Enable>
<MaxClientPeersPerHostPeer>2</MaxClientPeersPerHostPeer>
</P2P>
</Modules>
<!-- Settings for the ports to bind -->
<Bind>
<!-- Enable this configuration if you want to use API Server -->
<!--
<Managers>
<API>
<Port>8081</Port>
<TLSPort>8082</TLSPort>
<WorkerCount>1</WorkerCount>
</API>
</Managers>
-->
<Providers>
<!-- Pull providers -->
<RTSPC>
<WorkerCount>1</WorkerCount>
</RTSPC>
<OVT>
<WorkerCount>1</WorkerCount>
</OVT>
<!-- Push providers -->
<RTMP>
<Port>1935</Port>
<WorkerCount>1</WorkerCount>
</RTMP>
<SRT>
<Port>9999</Port>
<WorkerCount>1</WorkerCount>
</SRT>
<MPEGTS>
<!--
Listen on port 4000~4005 (<Port>4000-4004,4005/udp</Port>)
This is just a demonstration to show that you can configure the port in several ways
-->
<Port>4000/udp</Port>
</MPEGTS>
<WebRTC>
<Signalling>
<Port>3333</Port>
<TLSPort>3334</TLSPort>
<WorkerCount>1</WorkerCount>
</Signalling>
<IceCandidates>
<IceCandidate>*:10000/udp</IceCandidate>
<!--
If you want to stream WebRTC over TCP, specify IP:Port for TURN server.
This uses the TURN protocol, which delivers the stream from the built-in TURN server to the player's TURN client over TCP.
For detailed information, refer https://airensoft.gitbook.io/ovenmediaengine/streaming/webrtc-publishing#webrtc-over-tcp
-->
<TcpRelay>*:3478</TcpRelay>
<!-- TcpForce is an option to force the use of TCP rather than UDP in WebRTC streaming. (You can omit ?transport=tcp accordingly.) If <TcpRelay> is not set, playback may fail. -->
<TcpForce>true</TcpForce>
<TcpRelayWorkerCount>1</TcpRelayWorkerCount>
</IceCandidates>
</WebRTC>
</Providers>
<Publishers>
<OVT>
<Port>9000</Port>
<WorkerCount>1</WorkerCount>
</OVT>
<LLHLS>
<!--
OME only supports h2, so LLHLS works over HTTP/1.1 on non-TLS ports.
LLHLS works with higher performance over HTTP/2,
so it is recommended to use a TLS port.
-->
<Port>3333</Port>
<!-- If you want to use TLS, specify the TLS port -->
<TLSPort>3334</TLSPort>
<WorkerCount>1</WorkerCount>
</LLHLS>
<WebRTC>
<Signalling>
<Port>3333</Port>
<TLSPort>3334</TLSPort>
<WorkerCount>1</WorkerCount>
</Signalling>
<IceCandidates>
<IceCandidate>*:10000-10005/udp</IceCandidate>
<!--
If you want to stream WebRTC over TCP, specify IP:Port for TURN server.
This uses the TURN protocol, which delivers the stream from the built-in TURN server to the player's TURN client over TCP.
For detailed information, refer https://airensoft.gitbook.io/ovenmediaengine/streaming/webrtc-publishing#webrtc-over-tcp
-->
<TcpRelay>*:3478</TcpRelay>
<!-- TcpForce is an option to force the use of TCP rather than UDP in WebRTC streaming. (You can omit ?transport=tcp accordingly.) If <TcpRelay> is not set, playback may fail. -->
<TcpForce>true</TcpForce>
<TcpRelayWorkerCount>1</TcpRelayWorkerCount>
</IceCandidates>
</WebRTC>
</Publishers>
</Bind>
<!--
Enable this configuration if you want to use API Server
<AccessToken> is a token for authentication, and when you invoke the API, you must put "Basic base64encode(<AccessToken>)" in the "Authorization" header of HTTP request.
For example, if you set <AccessToken> to "ome-access-token", you must set "Basic b21lLWFjY2Vzcy10b2tlbg==" in the "Authorization" header.
-->
<!--
<Managers>
<Host>
<Names>
<Name>*</Name>
</Names>
<TLS>
<CertPath>path/to/file.crt</CertPath>
<KeyPath>path/to/file.key</KeyPath>
<ChainCertPath>path/to/file.crt</ChainCertPath>
</TLS>
</Host>
<API>
<AccessToken>ome-access-token</AccessToken>
<CrossDomains>
<Url>*.airensoft.com</Url>
<Url>http://*.sub-domain.airensoft.com</Url>
<Url>http?://airensoft.*</Url>
</CrossDomains>
</API>
</Managers>
-->
<VirtualHosts>
<!-- You can use wildcard like this to include multiple XMLs -->
<VirtualHost include="VHost*.xml" />
<VirtualHost>
<Name>default</Name>
<!--Distribution is a value that can be used when grouping the same vhost distributed across multiple servers. This value is output to the events log, so you can use it to aggregate statistics. -->
<Distribution>ovenmediaengine.com</Distribution>
<!-- Settings for multi ip/domain and TLS -->
<Host>
<Names>
<!-- Host names
<Name>stream1.airensoft.com</Name>
<Name>stream2.airensoft.com</Name>
<Name>*.sub.airensoft.com</Name>
<Name>192.168.0.1</Name>
-->
<Name>*</Name>
</Names>
<!--
<TLS>
<CertPath>path/to/file.crt</CertPath>
<KeyPath>path/to/file.key</KeyPath>
<ChainCertPath>path/to/file.crt</ChainCertPath>
</TLS>
-->
</Host>
<!--
Refer https://airensoft.gitbook.io/ovenmediaengine/signedpolicy
<SignedPolicy>
<PolicyQueryKeyName>policy</PolicyQueryKeyName>
<SignatureQueryKeyName>signature</SignatureQueryKeyName>
<SecretKey>aKq#1kj</SecretKey>
<Enables>
<Providers>rtmp,webrtc,srt</Providers>
<Publishers>webrtc,hls,llhls,dash,lldash</Publishers>
</Enables>
</SignedPolicy>
-->
<!--
<AdmissionWebhooks>
<ControlServerUrl></ControlServerUrl>
<SecretKey></SecretKey>
<Timeout>3000</Timeout>
<Enables>
<Providers>rtmp,webrtc,srt</Providers>
<Publishers>webrtc,hls,llhls,dash,lldash</Publishers>
</Enables>
</AdmissionWebhooks>
-->
<!-- <Origins>
<Properties>
<NoInputFailoverTimeout>3000</NoInputFailoverTimeout>
<UnusedStreamDeletionTimeout>60000</UnusedStreamDeletionTimeout>
</Properties>
<Origin>
<Location>/app/stream</Location>
<Pass>
<Scheme>ovt</Scheme>
<Urls><Url>origin.com:9000/app/stream_720p</Url></Urls>
</Pass>
<ForwardQueryParams>false</ForwardQueryParams>
</Origin>
<Origin>
<Location>/app/</Location>
<Pass>
<Scheme>ovt</Scheme>
<Urls><Url>origin.com:9000/app/</Url></Urls>
</Pass>
</Origin>
<Origin>
<Location>/edge/</Location>
<Pass>
<Scheme>ovt</Scheme>
<Urls><Url>origin.com:9000/app/</Url></Urls>
</Pass>
</Origin>
</Origins> -->
<!-- Settings for applications -->
<Applications>
<Application>
<Name>app</Name>
<!-- Application type (live/vod) -->
<Type>live</Type>
<OutputProfiles>
<!-- Enable this configuration if you want hardware acceleration using a GPU -->
<HardwareAcceleration>false</HardwareAcceleration>
<OutputProfile>
<Name>bypass_stream</Name>
<OutputStreamName>${OriginStreamName}</OutputStreamName>
<Encodes>
<Audio>
<Bypass>true</Bypass>
</Audio>
<Video>
<Bypass>true</Bypass>
</Video>
<Audio>
<Codec>opus</Codec>
<Bitrate>128000</Bitrate>
<Samplerate>48000</Samplerate>
<Channel>2</Channel>
</Audio>
<!--
<Video>
<Codec>vp8</Codec>
<Bitrate>1024000</Bitrate>
<Framerate>30</Framerate>
<Width>1280</Width>
<Height>720</Height>
<Preset>faster</Preset>
</Video>
-->
</Encodes>
</OutputProfile>
</OutputProfiles>
<Providers>
<OVT />
<WebRTC />
<RTMP />
<SRT />
<MPEGTS>
<StreamMap>
<!--
Set the stream name of the client connected to the port to "stream_${Port}"
For example, if a client connects to port 4000, OME creates a "stream_4000" stream
<Stream>
<Name>stream_${Port}</Name>
<Port>4000,4001-4004</Port>
</Stream>
<Stream>
<Name>stream_4005</Name>
<Port>4005</Port>
</Stream>
-->
<Stream>
<Name>stream_${Port}</Name>
<Port>4000</Port>
</Stream>
</StreamMap>
</MPEGTS>
<RTSPPull />
<WebRTC>
<Timeout>30000</Timeout>
</WebRTC>
</Providers>
<Publishers>
<AppWorkerCount>1</AppWorkerCount>
<StreamWorkerCount>8</StreamWorkerCount>
<OVT />
<WebRTC>
<Timeout>30000</Timeout>
<Rtx>false</Rtx>
<Ulpfec>false</Ulpfec>
<JitterBuffer>false</JitterBuffer>
</WebRTC>
<LLHLS>
<ChunkDuration>0.2</ChunkDuration>
<SegmentDuration>6</SegmentDuration>
<SegmentCount>10</SegmentCount>
<CrossDomains>
<Url>*</Url>
</CrossDomains>
</LLHLS>
</Publishers>
</Application>
</Applications>
</VirtualHost>
</VirtualHosts>
</Server>

Here is my docker run command (shell script):

sudo docker rm -f oven-media-engine
HOST_IP=$(curl ipv4.icanhazip.com)
docker run -d --name oven-media-engine \
-v ~/OvenMediaEngine/mounts/logs/:/var/log/ovenmediaengine/ \
-v ~/OvenMediaEngine/mounts/ome-origin-conf/:/opt/ovenmediaengine/bin/origin_conf/ \
-v ~/OvenMediaEngine/mounts/ome-edge-conf/:/opt/ovenmediaengine/bin/edge_conf/ \
-v ~/nginx/sslconf/certs/:/cert/ \
--network nginx-net \
-e LETSENCRYPT_HOST=ome.{mydomain}.com,www.ome.{mydomain}.com \
-e VIRTUAL_PORT=3334 \
-e OME_HOST_IP=$HOST_IP \
--expose 3333 \
--expose 3334 \
--expose 9999 \
-p 1935:1935 \
-p 9999:9999 \
-p 9999:9999/udp \
-p 9000:9000 \
-p 3333:3333 \
-p 3334:3334 \
-p 3478:3478 \
-p 10000-10009:10000-10009/udp \
airensoft/ovenmediaengine:latest

Here is my ffmpeg command (spawned and stdin piped from Node.js):
I would appreciate any help. I've been down in the depths of live streaming for six months now, and I think this might be the cleanest solution I could come up with to achieve LL-HLS from raw audio/video samples. (Every piece of documentation uses a file like an .mp4 as input, or an RTSP server, etc.; I need a solution that works explicitly on raw AAC/H.264 audio/video samples.) I also apologize in advance if this question should be posted elsewhere.
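For anyone in the same spot, one way to check whether OME actually registered an incoming stream is to query its REST API. This is only a sketch: it assumes the <Managers>/<API> blocks (commented out in the Server.xml above) have been enabled on port 8081 with the default "ome-access-token", and "your.server.ip" is a placeholder.

```shell
# Build the Basic auth value OME expects: base64 of the access token.
TOKEN=$(printf 'ome-access-token' | base64)
echo "Authorization: Basic ${TOKEN}"   # prints: Authorization: Basic b21lLWFjY2Vzcy10b2tlbg==

# Uncomment once the API is enabled; a live SRT ingest to /app should
# show up in the returned stream list:
# curl -s -H "Authorization: Basic ${TOKEN}" \
#   "http://your.server.ip:8081/v1/vhosts/default/apps/app/streams"
```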
Replies: 3 comments 15 replies
-
You are doing great. Victory is near!

For TLS, you do not seem to have any certs defined in the Server.xml file. Check your configuration and un-comment/add those certs, or, if you are using a proxy, forward to the regular port rather than the TLS port. Also be sure that OME has permission to actually read the cert files.

For the SRT -> HLS stream conversion, I would first do a sanity check by using OBS to stream SRT to the OME instance. Try playing back LLHLS to see whether it is working, and then try your own solution. With either outcome we will get more clues from the resulting OME log file, if you can please provide it. You may also find it useful to enable a more verbose log level by modifying the Logger.xml file.

Good luck and keep us posted!
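If OBS isn't handy, an ffmpeg test pattern works for the same sanity check. A sketch, assuming the default SRT port 9999 from the config above and OME's percent-encoded streamid convention; the host and app/stream names are placeholders:

```shell
# OME's SRT provider expects streamid to be the full srt:// URL with
# ':' and '/' percent-encoded. A minimal encoder for those two characters:
encode() { printf '%s' "$1" | sed -e 's/:/%3A/g' -e 's;/;%2F;g'; }
STREAMID=$(encode "srt://your.server.ip:9999/app/test")
echo "srt://your.server.ip:9999?streamid=${STREAMID}"

# Uncomment to push a test pattern with AAC audio over SRT:
# ffmpeg -re -f lavfi -i testsrc2=size=1280x720:rate=30 \
#   -f lavfi -i sine=frequency=440 \
#   -c:v libx264 -preset veryfast -tune zerolatency -c:a aac \
#   -f mpegts "srt://your.server.ip:9999?streamid=${STREAMID}"

# Then fetch the playlist over the non-TLS port, which sidesteps the
# cert question entirely:
# curl -s "http://your.server.ip:3333/app/test/llhls.m3u8"
```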
-
Your construction of the fetch call doesn't seem to actually include the headers you've specified. Give that a check? (i.e. headers: headers)
… On Jul 15, 2024, at 4:17 PM, traviszuleger ***@***.***> wrote:
I might have found a different issue. Specifically with Node.js, all requests to Oven Media Engine's API result in an Unauthorized error; OME isn't seeing the Authorization header that I am sending.
/**
 * @param {string} virtualHost
 * @param {string} applicationName
 * @param {string} accessToken
 * @param {boolean} tls
 */
function OvenMediaEngineAPI(virtualHost, applicationName, accessToken = `ome-access-token`, tls = false) {
    const baseURL = tls ? SSL_URL : URL;
    return {
        /**
         * @param {POST_MultiplexChannelCreateBody} body
         * Body to send to Oven Media Engine.
         * Reference the virtual host or the application name by using `{{VHOST}}` or `{{APP}}` (case insensitive)
         * @returns
         */
        async createMultiplexChannel(body) {
            const jsonBody = JSON.stringify(body)
                .replace(/\{\{VHOST\}\}/gi, virtualHost)
                .replace(/\{\{APP\}\}/gi, applicationName);
            const authHeader = `Basic ${Buffer.from(accessToken).toString('base64')}`;
            const url = `${baseURL}/v1/vhosts/${virtualHost}/apps/${applicationName}/multiplexChannels`;
            console.log(`POST ${url} { Authorization: ${authHeader} }`);
            const headers = new Headers();
            headers.append("Content-Type", "application/json");
            headers.append("Authorization", authHeader);
            const res = await fetch(url, {
                body: jsonBody,
                method: `POST`,
                headers
            });
            if (!res.ok) {
                // renamed from `body` to avoid shadowing the parameter above
                const errorBody = await res.text();
                console.log(`${res.status}: ${res.statusText} - ${errorBody}`);
                switch (res.status) {
                    case 400: throw new Error(`Bad Request.`);
                    case 401: throw new Error(`Unauthorized`);
                    case 404: throw new Error(`Not Found`);
                    case 409: throw new Error(`Conflict`);
                    case 500: throw new Error(`Internal Server Error`);
                    case 502: throw new Error(`Bad Gateway`);
                    default: throw new Error(`${res.status}: ${res.statusText}`);
                }
            }
            return await res.json();
        },
        async getAllMultiplexChannels() {
        }
    }
}
The above code results in this:
401: Unauthorized - {"message":"[HTTP] Authorization header is required to call API (401)","statusCode":401}
If I make the same exact request using Postman, I get a 200 OK response. Either Node.js handles these Authorization headers in some odd way (and I cannot find anything online explaining why), or I am missing something. I just don't see why the same HTTP request works from Postman but not from Node.js.
Any thoughts?
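One way to isolate whether Node's fetch is dropping the header is to replay the same request with curl. A sketch, with the host/port as placeholders and the request body left as a file of your own (the channel schema isn't shown here):

```shell
# The exact Authorization value OME documents for the default token.
AUTH="Basic $(printf 'ome-access-token' | base64)"
echo "$AUTH"   # prints: Basic b21lLWFjY2Vzcy10b2tlbg==

# Replay the POST outside Node; if this succeeds where fetch fails,
# the problem is on the Node side rather than in OME:
# curl -s -X POST \
#   -H "Authorization: ${AUTH}" -H "Content-Type: application/json" \
#   -d @multiplex-channel.json \
#   "http://your.server.ip:8081/v1/vhosts/default/apps/app/multiplexChannels"
```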
-
@bchah Hello again, it's been a while. I had to focus on different aspects of the project with our current implementation, and we've come across another roadblock, so I proposed that I re-visit Oven Media Engine... I still have a couple of questions. I am running this container with

The project I am working on is in Node.js, and I have gotten it to a point where it calls the API and is able to create the mux channel. In my tests, I create a mux channel for

After that, the server receives AAC and H264 packets and pipes that data into their respective

Two issues:

I hope these questions aren't too much. I am just hoping you know something, or know another contributor who may know more about what I can do to get this to work.
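For debugging the mux-channel path, the unfinished getAllMultiplexChannels stub in the code above can be exercised from the shell first. A sketch, assuming the list endpoint shares the path used by the create call, with host/port/token as placeholders:

```shell
# Same auth scheme as the create call; host and port are placeholders.
AUTH="Basic $(printf 'ome-access-token' | base64)"
URL="http://your.server.ip:8081/v1/vhosts/default/apps/app/multiplexChannels"
echo "GET ${URL}"

# Uncomment against a live server; the response should list the channels
# created earlier so you can confirm they exist and check their state:
# curl -s -H "Authorization: ${AUTH}" "${URL}"
```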