
AssetWriterInput Code=-11800 "The operation could not be completed" #43

omarojo opened this issue Oct 22, 2015 · 2 comments


omarojo commented Oct 22, 2015

I'm trying to substitute the CMSampleBuffer that comes out of the native camera in the method

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

with my own generated CMSampleBufferRef, which I build from a GPUImage output. But I get an error when the Kickflip AVAssetWriterInput tries to append the sample buffer:

writer error Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x133a94f60 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}

// THIS IS MY CODE

broadcastOutputTarget = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(1920, 1080) resultsInBGRAFormat:YES];
[filterChain.output addTarget:broadcastOutputTarget];
self.streamImage = [[UIImage alloc] init];
__block ToyViewController *safeSelf = self;

__weak GPUImageRawDataOutput *weakRawOutput = broadcastOutputTarget;
[broadcastOutputTarget setNewFrameAvailableBlock:^{
    GLubyte *outputBytes = [weakRawOutput rawBytesForImage];
    NSInteger bytesPerRow = [weakRawOutput bytesPerRowInOutput];

    // Wrap the raw BGRA bytes in a CVPixelBuffer (no copy is made,
    // so the bytes must stay valid until the buffer is consumed).
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                   1920,
                                                   1080,
                                                   kCVPixelFormatType_32BGRA,
                                                   outputBytes,
                                                   bytesPerRow,
                                                   NULL, NULL, NULL,
                                                   &pixelBuffer);

    CMVideoFormatDescriptionRef videoInfo = NULL;
    OSStatus status = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);

    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
    timingInfo.duration = frameDuration;
    timingInfo.presentationTimeStamp = nextPTS;

    //NSLog(@"TIME: %f", CMTimeGetSeconds(timingInfo.presentationTimeStamp));

    CMSampleBufferRef sampleBuffer = NULL;
    status = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                pixelBuffer,
                                                true, NULL, NULL,
                                                videoInfo,
                                                &timingInfo,
                                                &sampleBuffer);
    NSLog(@"BUFFER : %@", sampleBuffer);

    // THIS IS WHERE I SEND THE BUFFER TO THE ENCODER
    [safeSelf.broadCastRecorder captureGenerateVideoOutput:sampleBuffer];

    // Increment presentation time
    nextPTS = CMTimeAdd(frameDuration, nextPTS);

    // Release everything created in this block
    CFRelease(sampleBuffer);
    CFRelease(videoInfo);
    CVPixelBufferRelease(pixelBuffer);
}];

And here is my custom captureGenerateVideoOutput: method, which I integrated into my Recorder class (pretty much a clone of KFRecorder.h/m):

- (void)captureGenerateVideoOutput:(CMSampleBufferRef)sampleBuffer {
    if (!_hasScreenshot) {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        NSString *path = [self.hlsWriter.directoryPath stringByAppendingPathComponent:@"thumb.jpg"];
        NSData *imageData = UIImageJPEGRepresentation(image, 0.7);
        [imageData writeToFile:path atomically:NO];
        _hasScreenshot = YES;
    }
    if (_h264Encoder)
        [_h264Encoder encodeSampleBuffer:sampleBuffer];
}
@jonasandero

Did you ever figure this one out? I'm frustrated to be having the same problem.

@omarojo
Author

omarojo commented Feb 8, 2017 via email
