Scale CMSampleBuffer Size Without Exceeding the Memory Limit of a Broadcast Extension


I am developing an app that sends pixel buffers from a Broadcast Upload Extension to OpenTok. When I run my broadcast extension, it hits its memory limit within seconds. I have been looking for ways to reduce the size of the CMSampleBuffers by scaling them down, and I ended up first converting them to CIImage, then scaling them, and then converting them back to CVPixelBuffer to send to the OpenTok servers. Unfortunately, the extension still crashes even though I reduced the pixel buffer size. My code follows:

First I convert the CMSampleBuffer to a CVPixelBuffer in the processSampleBuffer function of my SampleHandler, then pass the CVPixelBuffer to my function along with the timestamp. There I convert the CVPixelBuffer to a CIImage and scale it using a CIFilter (CILanczosScaleTransform). After that, I generate a new pixel buffer from the CIImage using a pixel buffer pool and a CIContext, and then send the new buffer to the OpenTok servers via videoCaptureConsumer.
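For context, the relevant part of processSampleBuffer is roughly this (simplified; I only handle the .video case):

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        // Extract the pixel buffer and presentation timestamp from the sample buffer.
        guard sampleBufferType == .video,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        self.processPixelBuffer(pixelBuffer: pixelBuffer, timeStamp: ts)
    }

And here is processPixelBuffer: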

    func processPixelBuffer(pixelBuffer: CVPixelBuffer, timeStamp ts: CMTime) {

        // Wrap the source buffer in a CIImage and scale it down.
        let sourceImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let ciImage = self.scaleFilterImage(inputImage: sourceImage,
                                                  withAspectRatio: 1.0,
                                                  scale: CGFloat(kVideoFrameScaleFactor)) else { return }

        let newWidth = Int(ciImage.extent.size.width)
        let newHeight = Int(ciImage.extent.size.height)

        // Rebuild the pool and the reusable output buffer only when the
        // output size changes.
        if self.pixelBufferPool == nil ||
            self.pixelBuffer == nil ||
            CVPixelBufferGetWidth(self.pixelBuffer!) != newWidth ||
            CVPixelBufferGetHeight(self.pixelBuffer!) != newHeight {

            self.destroyPixelBuffers()
            self.updateBufferPool(newWidth: newWidth, newHeight: newHeight)

            guard let pool = self.pixelBufferPool,
                  CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &self.pixelBuffer) == kCVReturnSuccess
            else { return }
        }

        guard let outputBuffer = self.pixelBuffer else { return }

        // Render the scaled image into the reusable output buffer
        // (not back into the source buffer) and hand it to OpenTok.
        context?.render(ciImage, to: outputBuffer)

        self.videoCaptureConsumer?.consumeImageBuffer(outputBuffer,
                                                      orientation: .up,
                                                      timestamp: ts,
                                                      metadata: nil)
    }

If the pixelBufferPool is nil or the size of the output pixel buffer no longer matches, I rebuild the pool:

    private func updateBufferPool(newWidth: Int, newHeight: Int) {

        // Request IOSurface backing so buffers can be shared across processes.
        let pixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: UInt(self.videoFormat),
            kCVPixelBufferWidthKey as String: newWidth,
            kCVPixelBufferHeightKey as String: newHeight,
            kCVPixelBufferIOSurfacePropertiesKey as String: [:]
        ]

        CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, pixelBufferAttributes as NSDictionary?, &pixelBufferPool)
    }
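
destroyPixelBuffers is not shown above; a minimal sketch of what it does, assuming the reusable buffer and the pool are the only things to tear down:

    private func destroyPixelBuffers() {
        // Drop the reusable buffer and flush retained buffers from the pool
        // so their memory is actually returned before a new pool is created.
        self.pixelBuffer = nil
        if let pool = self.pixelBufferPool {
            CVPixelBufferPoolFlush(pool, [.excessBuffers])
        }
        self.pixelBufferPool = nil
    }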

This is the function I use to scale the CIImage:

func scaleFilterImage(inputImage:CIImage, withAspectRatio aspectRatio:CGFloat, scale:CGFloat) -> CIImage? {
    scaleFilter?.setValue(inputImage, forKey:kCIInputImageKey)
    scaleFilter?.setValue(scale, forKey:kCIInputScaleKey)
    scaleFilter?.setValue(aspectRatio, forKey:kCIInputAspectRatioKey)
    return scaleFilter?.outputImage
}
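
For reference, scaleFilter and context are created once as properties and reused for every frame; roughly like this (the .cacheIntermediates option is an assumption on my part — it tells Core Image not to cache intermediate results, which helps in memory-constrained extensions):

    // Created once and reused; recreating a CIContext per frame is itself
    // a common cause of memory growth in extensions.
    private let scaleFilter = CIFilter(name: "CILanczosScaleTransform")
    private let context: CIContext? = CIContext(options: [.cacheIntermediates: false])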
My question is: why does it still keep crashing, and is there another way to reduce the CVPixelBuffer size without hitting the memory limit?

I would appreciate any help with this. Swift or Objective-C, I am open to all suggestions.

1 Answer

Answer from Flex Monkey:

I don't know whether Core Image copies or references the data when it generates a CIImage from a CVPixelBuffer. However, Accelerate's vImage library offers a no-copy solution that may help. The vImageBuffer_InitForCopyFromCVPixelBuffer function (combined with the kvImageNoAllocate flag) initializes a vImage_Buffer that shares data with a CVPixelBuffer. Note that you need to lock the Core Video pixel buffer with CVPixelBufferLockBaseAddress first.

You can take an even more direct route by initializing a vImage_Buffer using:

let buffer = vImage_Buffer(data: data,
                           height: vImagePixelCount(height),
                           width: vImagePixelCount(width),
                           rowBytes: bytesPerRow)

Pass the result of CVPixelBufferGetBaseAddress(_:) for data, CVPixelBufferGetHeight(_:) and CVPixelBufferGetWidth(_:) for the dimensions, and CVPixelBufferGetBytesPerRow(_:) for bytesPerRow. You'll still need to lock the CVPixelBuffer while the vImage_Buffer is in use.

You can create source and destination vImage buffers that reference the source and destination Core Video pixel buffers.

vImage provides scale functions for the different pixel formats: https://developer.apple.com/documentation/accelerate/vimage/vimage_operations/image_scaling. The scale is implied by the ratio between the source and destination sizes.
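
Putting those pieces together, here is a sketch of the whole no-copy scale, assuming both Core Video buffers are single-plane 32BGRA (which vImageScale_ARGB8888 treats as four 8-bit channels):

    import Accelerate
    import CoreVideo

    func scale(from source: CVPixelBuffer, into destination: CVPixelBuffer) {
        // Lock both buffers while vImage reads from / writes to their memory.
        CVPixelBufferLockBaseAddress(source, .readOnly)
        CVPixelBufferLockBaseAddress(destination, [])
        defer {
            CVPixelBufferUnlockBaseAddress(destination, [])
            CVPixelBufferUnlockBaseAddress(source, .readOnly)
        }

        // Wrap the existing pixel memory; no pixel data is copied.
        var src = vImage_Buffer(data: CVPixelBufferGetBaseAddress(source),
                                height: vImagePixelCount(CVPixelBufferGetHeight(source)),
                                width: vImagePixelCount(CVPixelBufferGetWidth(source)),
                                rowBytes: CVPixelBufferGetBytesPerRow(source))
        var dst = vImage_Buffer(data: CVPixelBufferGetBaseAddress(destination),
                                height: vImagePixelCount(CVPixelBufferGetHeight(destination)),
                                width: vImagePixelCount(CVPixelBufferGetWidth(destination)),
                                rowBytes: CVPixelBufferGetBytesPerRow(destination))

        // The scale factor is implied by the source/destination sizes.
        vImageScale_ARGB8888(&src, &dst, nil, vImage_Flags(kvImageNoFlags))
    }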