How to create NSImage from pixel array in Xcode for macOS app


Developing apps for iOS, it is simple to extract pixels from a UIImage, manipulate them, and then reconstruct a UIImage from them. For example:

CGColorSpaceRef csr = CGColorSpaceCreateDeviceRGB();
// pax is the raw RGBA pixel buffer; devlar and devalt are its width and height in pixels
CGContextRef ctx = CGBitmapContextCreate(pax, devlar, devalt, 8, devlar * 4,
                                         csr, kCGImageAlphaNoneSkipLast);
CGImageRef rim = CGBitmapContextCreateImage(ctx);
UIImage *gen = [UIImage imageWithCGImage:rim scale:1.0 orientation:UIImageOrientationUp];

But when developing a macOS app things are different: the data structures and methods are different, and I don't understand how to do the same thing there.


1 Answer

Answered by Rob:

The process is exactly the same, except that rather than the UIImage convenience initializer imageWithCGImage:, you call the NSImage initializer initWithCGImage:size::

NSImage *image;
CGImageRef cgImage = CGBitmapContextCreateImage(context);

if (cgImage) {
    // NSImage expects a size in points; using the pixel dimensions gives a 1:1 mapping
    NSSize size = NSMakeSize(width, height);
    image = [[NSImage alloc] initWithCGImage:cgImage size:size];

    CGImageRelease(cgImage);
}
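
If it helps to see the whole pipeline in one place, here is a minimal, self-contained sketch of the macOS version, from a raw pixel buffer to an NSImage. The function name and the pixels/width/height parameters are placeholders for illustration, not anything from the original post.

#import <Cocoa/Cocoa.h>

NSImage *MakeImageFromPixels(uint8_t *pixels, size_t width, size_t height) {
    NSImage *image = nil;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Wrap the raw buffer in a bitmap context: 8 bits per component,
    // 4 bytes per pixel, alpha byte present but ignored (RGBA, skip last).
    CGContextRef context = CGBitmapContextCreate(pixels, width, height, 8,
                                                 width * 4, colorSpace,
                                                 kCGImageAlphaNoneSkipLast);
    if (context) {
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        if (cgImage) {
            // NSImage takes a size in points; here it matches the pixel size.
            image = [[NSImage alloc] initWithCGImage:cgImage
                                                size:NSMakeSize(width, height)];
            CGImageRelease(cgImage);
        }
        CGContextRelease(context);
    }
    CGColorSpaceRelease(colorSpace);
    return image;
}

Under ARC the returned NSImage is managed automatically, but the Core Graphics objects created along the way still need the explicit release calls shown above.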