I'm experimenting with a small Python app that uses OpenCL to alpha-blend a number of images, but I'm probably doing something wrong, either with the math or with the colorspace when loading the images.
[LEFT - the result from my code] [RIGHT - the result from Photoshop]

So, here's the kernel code:
program_source = """
__kernel void alpha_blend(__global uchar* merged_image, __global uchar* image, int width, int height) {
    int gid_x = get_global_id(0);
    int gid_y = get_global_id(1);
    if (gid_x < width && gid_y < height) {
        int index = (gid_y * width + gid_x) * 4;
        float alpha = (float)image[index + 3] / 255.0f;
        merged_image[index]     = (uchar)(image[index]     * alpha + merged_image[index]     * (1.0f - alpha));
        merged_image[index + 1] = (uchar)(image[index + 1] * alpha + merged_image[index + 1] * (1.0f - alpha));
        merged_image[index + 2] = (uchar)(image[index + 2] * alpha + merged_image[index + 2] * (1.0f - alpha));
        merged_image[index + 3] = (uchar)(alpha * 255.0f + (1.0f - alpha) * merged_image[index + 3]);
    }
}
"""
and here's how I load the images:
image_arrays = [np.asarray(Image.open(path).convert('RGBA')) for path in image_paths]
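PIL gives back the arrays as height × width × 4 uint8, so I pass shape[1] as width and shape[0] as height. A quick synthetic check (no real image needed) that the kernel's flat indexing lines up with that layout:

```python
import numpy as np

def kernel_index(arr, x, y):
    """Flat byte offset the kernel computes for pixel (x, y),
    assuming a contiguous H x W x 4 uint8 array."""
    height, width, channels = arr.shape
    assert channels == 4 and arr.dtype == np.uint8
    return (y * width + x) * 4  # same as the kernel's index expression

rgba = np.zeros((4, 6, 4), dtype=np.uint8)  # height=4, width=6
rgba[2, 3] = [10, 20, 30, 40]               # pixel at x=3, y=2
flat = np.ascontiguousarray(rgba).ravel()   # byte layout the kernel sees
i = kernel_index(rgba, x=3, y=2)
```

`flat[i:i+4]` should be the RGBA values of that pixel, so at least the width/height ordering isn't the problem.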
Any help would be much appreciated!