Thursday, 19 September 2013

How does this color matrix make image black and white?

I am learning about image processing and came across an implementation that
turns an image black and white by applying a color matrix. Below is the
relevant code from Craig's Utility Library:
ColorMatrix TempMatrix = new ColorMatrix();
TempMatrix.Matrix = new float[][]{
    new float[] {.3f,  .3f,  .3f,  0, 0},  // how the red input contributes to R', G', B'
    new float[] {.59f, .59f, .59f, 0, 0},  // how the green input contributes to R', G', B'
    new float[] {.11f, .11f, .11f, 0, 0},  // how the blue input contributes to R', G', B'
    new float[] {0,    0,    0,    1, 0},  // alpha passes through unchanged
    new float[] {0,    0,    0,    0, 1}   // translation row (no constant offset)
};
Bitmap NewBitmap = TempMatrix.Apply(Image);
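For reference, I assume Apply ends up doing roughly what the standard
System.Drawing route does when you draw an image through an ImageAttributes
that carries the matrix. This is my own sketch of that, not code from
Craig's Utility Library, and it uses the built-in
System.Drawing.Imaging.ColorMatrix rather than the library's class:

using System.Drawing;
using System.Drawing.Imaging;

static class MatrixHelper
{
    // Sketch only: apply a 5x5 color matrix to an image via GDI+.
    public static Bitmap ApplyMatrix(Image source, ColorMatrix matrix)
    {
        Bitmap result = new Bitmap(source.Width, source.Height);
        using (Graphics g = Graphics.FromImage(result))
        using (ImageAttributes attributes = new ImageAttributes())
        {
            // Attach the matrix so every pixel drawn below is transformed by it.
            attributes.SetColorMatrix(matrix);
            g.DrawImage(source,
                new Rectangle(0, 0, source.Width, source.Height),
                0, 0, source.Width, source.Height,
                GraphicsUnit.Pixel, attributes);
        }
        return result;
    }
}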
I know from this article that the matrix operates on a color vector of the
form RGBA plus w (red, green, blue, alpha, and a translation component). So,
from my understanding, the color matrix multiplies each pixel's red, green,
and blue values by the weights {.3f, .59f, .11f}. However, I believe I am
missing the last step.

Where do the constants .3f, .59f, and .11f come from, and how does this
matrix multiplication make the image black and white? A worked single-pixel
example of the multiplication follows below.
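Here is my own arithmetic check of that multiplication (not the library's
code): multiplying one pixel's color vector (r, g, b, a, 1) by the 5x5
matrix above by hand, to see what the transform produces for a sample color.

using System;
using System.Drawing;

class PixelCheck
{
    static void Main()
    {
        Color input = Color.FromArgb(255, 200, 100, 50); // arbitrary sample pixel

        float r = input.R / 255f, g = input.G / 255f, b = input.B / 255f;

        // Row-vector (r, g, b, a, 1) times the matrix:
        // R' = .3f*r + .59f*g + .11f*b, and the G' and B' columns hold the
        // same weights, so all three outputs come out identical.
        float gray = .3f * r + .59f * g + .11f * b;

        Console.WriteLine($"R' = G' = B' = {gray:F3}");
    }
}

The sketch prints the same value for R', G', and B', i.e. the transformed
pixel is a shade of gray, which seems to be the piece I am trying to pin down.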
