ios - Replacing CMSampleBuffer imageBuffer with new data


I'm working with Apple's own example here:

Applying Matte Effects to People in Images and Video

My goal is simple: I want to save the filtered video content to a file as a video.

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        //1. Grab the pixelbuffer frame from the camera output
        guard let pixelBuffer = sampleBuffer.imageBuffer else { return }

        //2. create a new CIImage from the pixelBuffer that segments a person from the background
        guard let image = processVideoFrame(pixelBuffer, sampleBuffer: sampleBuffer) else { return }

        //3. @todo (need help here) - we want to somehow create a new CMSampleBuffer from the new image / frame
        let updatedBuffer = image.convertToCMSampleBuffer()

        //4. WORKS - saving to the current recording session works, but this still passes the original, unfiltered sampleBuffer
        self.assetWriterHelper?.captureOutput(output, didOutput: sampleBuffer, from: connection)
    }
}

Their example doesn't cover this part (saving the actual filtered video), and I've tried various solutions, but nothing seems to work.

Anyone have an idea how you would go about saving the modified frame/image into the current recording session?

Thank you!


1 Answer

Answered by Hamid Yusifli:

You can use the CIContext render(_:to:) function to render your image into a pixel buffer:

func render(
    _ image: CIImage,
    to buffer: CVPixelBuffer
)
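
For example, here is a minimal sketch of how the rendered image could be turned into a new CMSampleBuffer for the asset writer. It assumes the filtered CIImage keeps the camera frame's dimensions; the shared ciContext and the helper name makeSampleBuffer(from:timingFrom:) are illustrative, not part of Apple's sample or the question's code:

import AVFoundation
import CoreImage

// Reuse one CIContext; creating a new one per frame is expensive.
let ciContext = CIContext()

func makeSampleBuffer(from image: CIImage,
                      timingFrom original: CMSampleBuffer) -> CMSampleBuffer? {
    // 1. Create a destination pixel buffer matching the filtered image's size.
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault,
                              Int(image.extent.width),
                              Int(image.extent.height),
                              kCVPixelFormatType_32BGRA,
                              nil,
                              &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    // 2. Render the filtered CIImage into the pixel buffer.
    ciContext.render(image, to: buffer)

    // 3. Copy the timing information from the original camera frame.
    var timing = CMSampleTimingInfo()
    guard CMSampleBufferGetSampleTimingInfo(original, at: 0, timingInfoOut: &timing) == noErr else {
        return nil
    }

    // 4. Build a format description for the new pixel buffer and wrap
    //    everything in a fresh CMSampleBuffer.
    var formatDescription: CMVideoFormatDescription?
    guard CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                       imageBuffer: buffer,
                                                       formatDescriptionOut: &formatDescription) == noErr,
          let format = formatDescription else { return nil }

    var newSampleBuffer: CMSampleBuffer?
    guard CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                   imageBuffer: buffer,
                                                   formatDescription: format,
                                                   sampleTiming: &timing,
                                                   sampleBufferOut: &newSampleBuffer) == noErr else {
        return nil
    }
    return newSampleBuffer
}

In captureOutput you would then pass the result of makeSampleBuffer(from:timingFrom:) to your assetWriterHelper instead of the original sampleBuffer. For production use, drawing into buffers vended by a CVPixelBufferPool (rather than calling CVPixelBufferCreate for every frame) avoids the per-frame allocation cost.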