
Crash on incoming audio (0 Hz setting) #577

Open
denis-obukhov opened this issue Feb 5, 2025 · 7 comments
Assignees
Labels
bug Something isn't working

Comments

@denis-obukhov

denis-obukhov commented Feb 5, 2025

Describe the bug
Sometimes when I subscribe to a remote publication I get this crash:

           AURemoteIO.cpp:1091  failed: 1701737535 (enable 2, outf< 2 ch,      0 Hz, Float32, deinterleaved> inf< 2 ch,      0 Hz, Float32, deinterleaved>)
           AURemoteIO.cpp:1091  failed: 1701737535 (enable 2, outf< 2 ch,      0 Hz, Float32, deinterleaved> inf< 2 ch,      0 Hz, Float32, deinterleaved>)
           AVAEInternal.h:71    required condition is false: [AVAudioEngineGraph.mm:2161:_Connect: (IsFormatSampleRateAndChannelCountValid(format))]
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'
*** First throw call stack:
(0x19131a5fc 0x18e8b1244 0x1914676e0 0x1abf7ec50 0x1ac0419a0 0x1ac045964 0x10706a490 0x1070671fc 0x107066bec 0x1071cb58c 0x1071472b0 0x106e6ee20 0x106e6f244 0x106e6c24c 0x106eeb958 0x106f7743c 0x106f76334 0x106f779a8 0x21bffe7d0 0x21bffe480)
libc++abi: terminating due to uncaught exception of type NSException

Both listener and streamer are my local iOS devices connected to Xcode. I stream opus audio only.

As far as I understand, this happens because the audio format reports a 0 Hz sample rate.
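For what it's worth, one way to check that theory is to log the audio session's hardware format before any audio starts. This is a diagnostic sketch (not part of the SDK); a sample rate of 0 here would match the "0 Hz" in the crash log:

```swift
import AVFoundation

// Diagnostic sketch: inspect the session's current hardware format.
// If sampleRate prints 0, the engine will be handed an invalid format.
let session = AVAudioSession.sharedInstance()
print("category: \(session.category.rawValue)")
print("sampleRate: \(session.sampleRate) Hz")
print("outputChannels: \(session.outputNumberOfChannels)")
```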

SDK Version
v2.1.0
LiveKit Cloud

iOS/macOS Version
iOS 18.3

Xcode Version
Version 16.2 (16B5100e)

Steps to Reproduce
It happens randomly; I never saw it with the older version of the SDK I used (2.0.8). Moving between background and foreground probably affects this.
What I'm doing is basically this:

public func room(_ room: Room, participant: RemoteParticipant, didPublishTrack publication: RemoteTrackPublication) {
    // ... some logic
    Task {
        try await publication.set(subscribed: true)
    }
}

Expected behavior
No crash

[screenshot attached]
@denis-obukhov denis-obukhov added the bug Something isn't working label Feb 5, 2025
@hiroshihorie
Member

Thanks for the report, will investigate this one today.
Please let me know if you notice any other clues to reproduce.

@hiroshihorie
Member

Hmm I can't reproduce it, but I have seen this when the AudioSession config was not correct.
Do you do any custom audio session category modification?

@denis-obukhov
Author

@hiroshihorie Yeah. I have this modified config initially copied from v2.0.8. Now, I see that it has been changed inside the SDK.

AudioManager.shared.customConfigureAudioSessionFunc = { newState, oldState in
    DispatchQueue.liveKitWebRTC.async { [weak self] in
        guard let self else { return }

        // prepare config
        let configuration = LKRTCAudioSessionConfiguration.webRTC()
        configuration.category = AVAudioSession.Category.playAndRecord.rawValue
        configuration.mode = AVAudioSession.Mode.voiceChat.rawValue
        configuration.categoryOptions = [
            .allowBluetooth,
            .duckOthers,
            .defaultToSpeaker
        ]

        var setActive: Bool?

        if newState.trackState != .none, oldState.trackState == .none {
            // activate audio session when there is any local/remote audio track
            setActive = true
        } else if newState.trackState == .none, oldState.trackState != .none {
            // deactivate audio session when there are no more local/remote audio tracks
            setActive = false
        }

        // configure session
        let session = LKRTCAudioSession.sharedInstance()
        session.lockForConfiguration()
        // always unlock
        defer { session.unlockForConfiguration() }

        do {
            if let setActive {
                try session.setConfiguration(configuration, active: setActive)
            } else {
                try session.setConfiguration(configuration)
            }
        } catch {
            self.print("Failed to configure audio session with error: \(error)")
        }
    }
}

@hiroshihorie
Member

hiroshihorie commented Feb 6, 2025

Ok, that won't work; it must be sync, not async.
Do you need the custom func anyway? If not, I recommend not setting a custom func and letting the SDK handle it automatically.

The new audio engine will now precisely invoke the method right before it starts,
so by the time the func returns it must be already configured.

It's too late if it's async and the engine will fail to start.
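To illustrate the sync requirement, the custom func posted earlier in this thread can be made synchronous by dropping the DispatchQueue dispatch entirely, so the session is fully configured before the func returns (a sketch based on that snippet, with the assumption that running on the SDK's calling thread is acceptable):

```swift
// Sketch: same configuration as before, but run synchronously so the
// session is configured by the time the func returns.
AudioManager.shared.customConfigureAudioSessionFunc = { newState, oldState in
    let configuration = LKRTCAudioSessionConfiguration.webRTC()
    configuration.category = AVAudioSession.Category.playAndRecord.rawValue
    configuration.mode = AVAudioSession.Mode.voiceChat.rawValue
    configuration.categoryOptions = [.allowBluetooth, .duckOthers, .defaultToSpeaker]

    var setActive: Bool?
    if newState.trackState != .none, oldState.trackState == .none {
        setActive = true // first audio track appeared
    } else if newState.trackState == .none, oldState.trackState != .none {
        setActive = false // last audio track went away
    }

    let session = LKRTCAudioSession.sharedInstance()
    session.lockForConfiguration()
    defer { session.unlockForConfiguration() }

    do {
        if let setActive {
            try session.setConfiguration(configuration, active: setActive)
        } else {
            try session.setConfiguration(configuration)
        }
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```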

@denis-obukhov
Author

denis-obukhov commented Feb 6, 2025

@hiroshihorie Got it! Thanks. But I do need to always have the playAndRecord category in order to work correctly with Apple's Push To Talk framework, which I use alongside LiveKit. Moreover, Apple says that I don't need to activate the audio session on my own; the framework does it automatically and then calls channelManager(_:didActivate:).
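For context, the PushToTalk callback referred to above looks roughly like this. PTTDelegate is a hypothetical type for illustration (not from this thread), shown without full PTChannelManagerDelegate conformance:

```swift
import PushToTalk
import AVFoundation

// Hypothetical delegate for illustration. PushToTalk activates the
// audio session itself and then calls channelManager(_:didActivate:),
// so the app never needs to call setActive(true) for PTT audio.
final class PTTDelegate: NSObject {
    func channelManager(_ channelManager: PTChannelManager,
                        didActivate audioSession: AVAudioSession) {
        // The session is already active here; configure-only work goes here.
    }

    func channelManager(_ channelManager: PTChannelManager,
                        didDeactivate audioSession: AVAudioSession) {
        // The framework has already deactivated the session.
    }
}
```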

@hiroshihorie
Member

hiroshihorie commented Feb 6, 2025

If you simply want playAndRecord all the time:

  1. Disable the SDK's automatic configuration by calling AudioManager.shared.set(engineObservers: []). (The custom config func will also not be invoked.)
  2. Simply configure your audio session at app startup:
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat)
    try session.setActive(true)

Would that work for you ?
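As a complete call site, step 2 above might look like this (a sketch; the category options are an assumption carried over from the config earlier in the thread, and the function name is hypothetical):

```swift
import AVFoundation

// Sketch: one-time audio session setup at app launch, with error
// handling added around the calls suggested above.
func configureAudioSessionAtLaunch() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .voiceChat,
                                options: [.allowBluetooth, .defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("Audio session configuration failed: \(error)")
    }
}
```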

@denis-obukhov
Author

Not really. Meanwhile, it works fine for me on v2.0.19 using the customConfigureAudioSessionFunc approach.

Moreover, with AudioManager.shared.set(engineObservers: []) and the audio session activated at launch, I've just encountered this crash again, this time on an iPad Air (5th Gen) running iOS 18.2.1, without any switch to the background state: just a freshly launched app with Xcode connected. It reproduces every time, right after subscribing to an incoming publication.

The crash:

           AURemoteIO.cpp:1091  failed: 1701737535 (enable 2, outf< 2 ch,      0 Hz, Float32, deinterleaved> inf< 2 ch,      0 Hz, Float32, deinterleaved>)
           AURemoteIO.cpp:1091  failed: 1701737535 (enable 2, outf< 2 ch,      0 Hz, Float32, deinterleaved> inf< 2 ch,      0 Hz, Float32, deinterleaved>)
           AVAEInternal.h:71    required condition is false: [AVAudioEngineGraph.mm:2161:_Connect: (IsFormatSampleRateAndChannelCountValid(format))]
