How to Include AVPlayer Audio in SCNRecorder Recordings? #67

Open · giomurru opened this issue Nov 16, 2024 · 1 comment

Comments

@giomurru

Hi there,

I'm encountering an issue with SCNRecorder while trying to record an SCNView. The view displays a video mapped as a texture; the video is played by an AVPlayer, and its visuals and audio are perfectly synchronized during playback.
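
For context, here is roughly how the video is mapped as a texture (a minimal sketch; videoURL, the plane size, and the node setup are illustrative, and this relies on SceneKit accepting an AVPlayer as material contents):

    import AVFoundation
    import SceneKit

    // Play the video with AVPlayer; videoURL is a placeholder for the asset URL.
    let player = AVPlayer(url: videoURL)

    // SceneKit renders an AVPlayer assigned as material contents,
    // so the video frames become the plane's diffuse texture.
    let plane = SCNPlane(width: 1.6, height: 0.9)
    plane.firstMaterial?.diffuse.contents = player

    let videoNode = SCNNode(geometry: plane)
    sceneView.scene?.rootNode.addChildNode(videoNode)
    player.play()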

However, when I use SCNRecorder to record the scene, only the video is captured — the audio is missing.

From my exploration, I noticed that SCNRecorder includes an AudioEngine class for playing soundtracks. However, that approach seems tailored to a custom player that doesn't support video playback, and manually synchronizing separate audio playback with the video being rendered by AVPlayer feels overly complex and error-prone.

I'm wondering: is it feasible to extend SCNRecorder so it can record audio from an AVPlayer instance that is already in use?

Any insights, suggestions, or pointers would be greatly appreciated!

@v-grigoriev (Collaborator) commented Nov 23, 2024

Hi @giomurru, I believe this can be done by slightly modifying SCNRecorder.

In general, we can't do anything with AVPlayer itself.

I see that SCNView provides us with an audioEngine reference.

If SCNView uses that engine to play any audio, you can try attaching to it: install a tap on the engine's main mixer node and forward the captured buffers to the recorder. The code below sketches this; it requires modifying SCNRecorder as described in the comments.

    // Requires import AVFoundation and import CoreMedia.
    sceneView.prepareForRecording()

    guard let recorder = sceneView.recorder else { return }
    recorder.useAudioEngine = true

    // You need to make audioFormat public to access it.
    // One option is the tapped node's own format:
    // sceneView.audioEngine.mainMixerNode.outputFormat(forBus: 0)
    recorder.audioInput.audioFormat = /* determine your format */

    // Tap the main mixer so every buffer the engine renders is handed to us.
    sceneView.audioEngine.mainMixerNode.installTap(
      onBus: 0,
      bufferSize: 4096,
      format: nil  // nil uses the node's output format
    ) { [weak recorder] (buffer, time) in
      guard let recorder else { return }

      do {
        let sampleBuffer = try Self.createAudioSampleBuffer(from: buffer, time: time)

        // You need to define a new audioEngine(_: AVAudioEngine, didOutputAudioSampleBuffer: CMSampleBuffer) handler.
        recorder.audioInput.audioEngine(sceneView.audioEngine, didOutputAudioSampleBuffer: sampleBuffer)
      } catch {
        // Handle errors on your own.
      }
    }

    // When you stop recording, remove the tap:
    // sceneView.audioEngine.mainMixerNode.removeTap(onBus: 0)
    
    // Wraps an AVAudioPCMBuffer from the tap into a CMSampleBuffer that
    // SCNRecorder's audio input can consume.
    static func createAudioSampleBuffer(from buffer: AVAudioPCMBuffer, time: AVAudioTime) throws -> CMSampleBuffer {
      let audioBufferList = buffer.mutableAudioBufferList
      let streamDescription = buffer.format.streamDescription.pointee
      let timescale = CMTimeScale(streamDescription.mSampleRate)
      let format = try CMAudioFormatDescription(audioStreamBasicDescription: streamDescription)
      let sampleBuffer = try CMSampleBuffer(
        dataBuffer: nil,
        formatDescription: format,
        numSamples: CMItemCount(buffer.frameLength),
        sampleTimings: [
          CMSampleTimingInfo(
            duration: CMTime(value: 1, timescale: timescale),
            presentationTimeStamp: CMTime(
              seconds: AVAudioTime.seconds(forHostTime: time.hostTime),
              preferredTimescale: timescale
            ),
            decodeTimeStamp: .invalid
          )
        ],
        sampleSizes: []
      )
      try sampleBuffer.setDataBuffer(fromAudioBufferList: audioBufferList)
      return sampleBuffer
    }
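
For completeness, the new handler could look roughly like this inside SCNRecorder's audio input type. This is a speculative sketch: apart from the AVFoundation and CoreMedia types, the names below (in particular the output callback) are assumptions about SCNRecorder's internals, not its actual API.

    // Speculative sketch, to be added to SCNRecorder's audio input type.
    // It mirrors the microphone capture path: forward the engine's buffers
    // to whatever consumer already receives CMSampleBuffers.
    func audioEngine(_ audioEngine: AVAudioEngine,
                     didOutputAudioSampleBuffer sampleBuffer: CMSampleBuffer) {
      // output is a hypothetical callback standing in for SCNRecorder's
      // real buffer consumer (e.g. an asset writer input feed).
      output?(sampleBuffer)
    }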
