Regarding color comparison between image and reference spectrum

Hello All,
I need some advice on comparing colors from an image to a reference spectrum, with the ultimate goal of converting the fringe colors into thickness information.
What I have so far:
1. Using thin-film equations, I am able to generate a spectrum which relates color to thickness; the x-axis is in nanometers (a sketch of one such model follows this list). File =
2. Generate a decent experimental image that shows a similar fringe pattern. File =
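For reference, a minimal sketch of the kind of model step 1 describes, assuming a simple two-beam, equal-amplitude interference model for a free-standing film in air with a pi phase shift at the top interface (the actual equations used may differ):
  % Relative reflectance of a thin film vs. wavelength (two-beam model).
  n      = 1.33;                 % assumed film refractive index
  d      = 500;                  % assumed film thickness in nm
  lambda = 380:5:780;            % visible wavelengths in nm
  % With a pi phase shift at the top interface only, the equal-amplitude
  % two-beam model gives R proportional to sin^2(2*pi*n*d/lambda).
  R = sin(2*pi*n*d ./ lambda).^2;
  plot(lambda, R); xlabel('Wavelength (nm)'); ylabel('Relative reflectance');
Sweeping d and converting each resulting R to a color gives the thickness-to-color scale.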
To match the color fringes, I wrote a script that does the following (a sketch of these steps is below):
1. Scan the RGB image and store all (X,Y){RGB} info.
2. For each pixel, calculate the Lab values.
3. Find the difference (deltaE) of each of these Lab values with the reference spectrum I have generated, to find the closest match.
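A minimal sketch of those steps, assuming the reference spectrum is stored as an M-by-3 Lab array ref_lab with a matching M-by-1 vector thickness (file and variable names are placeholders; rgb2lab needs the Image Processing Toolbox and pdist2 the Statistics Toolbox):
  rgbImage = imread('fringes.png');        % hypothetical experimental image
  labImage = rgb2lab(rgbImage);            % step 2: per-pixel Lab values
  [rows, cols, ~] = size(labImage);
  pix = reshape(labImage, [], 3);          % step 1: one row per pixel
  dE  = pdist2(pix, ref_lab);              % step 3: CIE76 deltaE to every reference entry
  [minDE, idx] = min(dE, [], 2);           % closest reference color per pixel
  thicknessMap = reshape(thickness(idx), rows, cols);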
The problem I am facing is that the match is not always good. For example, in my reference spectrum at 500 nm, the Lab value is (18, 29, -29) for purple. However, in my RGB image the same color is (34, 48, -42), which leads to a deltaE of 28.
Since my overall goal is to analyze a video sequence showing changes in the thin film, using k-means to segment the colors doesn't work well, as the number of colors changes over the video sequence.
Any advice/help would be appreciated.
Thanks,

Accepted Answer

Image Analyst on 24 Mar 2014
I don't understand how you're calibrating this. It seems like you used some analytical equations to generate some theoretical colors. How do you know they would match an actual real-world image?
Now let's assume you snap an image of some colored interference fringes. Who's to say that the color you observe means that the thickness of the film is what it is from your synthetic calibration guide image? For example, let's say that you snap an image of a film that is 140 nm thick. According to your "calibration" image, it should be blue. But I also see blue at 330 and 540 nm. Have you taken into account the spectral emittance of the light source when creating your calibration scale? And have you taken into account the spectral responsivities of the camera? If you're going to use purely theoretical equations rather than ad hoc real-world measurements, then you need to make sure everything is correct. Don't just use some book formulas. I think your best bet is to have some film in a step wedge of known thicknesses. Just look at what you got rather than some theoretical formulas.
Of course k-means is not going to be useful; it doesn't take long thinking about it to realize that. Your best bet is to just calculate the delta E between the first image and subsequent images.
  2 Comments
Saad Bhamla on 24 Mar 2014
Thanks for the response Image Analyst.
I agree with your idea of a calibration scale. However, it is more challenging to do experimentally (creating a step wedge of exactly 200 nm?).
I believe I have accounted for the spectral emittance of the light source: I used the emission spectrum for the white LED source as provided by the manufacturer and then converted it using the CIE color matching functions.
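For what it's worth, a sketch of that conversion step, assuming S is the LED emission spectrum, R the film reflectance, and cmf an N-by-3 matrix of the CIE 1931 color matching functions, all sampled at the same N wavelengths (xyz2lab is in newer Image Processing Toolbox releases; older ones use makecform/applycform):
  XYZ = (S(:) .* R(:))' * cmf;        % integrate S*R against xbar, ybar, zbar
  XYZ = XYZ / (S(:)' * cmf(:,2));     % normalize so a perfect reflector has Y = 1
  lab = xyz2lab(XYZ);                 % Lab entry for this thickness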
- I have not yet figured out how to take into account the spectral responsivity of the camera - do you have any suggestions on this? Would this mean taking images of a standard color calibration target such as a ColorChecker Passport?
Can you elaborate more on your last comment about calculating the deltaE between consecutive images?
At this point, my attempt is to create an approximate conversion from wavelength to thickness.
Thanks for your suggestions.
Image Analyst on 24 Mar 2014
Convert the first image into LAB. Then, for every subsequent image, convert it to LAB and compute the Delta E on a pixel-by-pixel basis. Set some threshold for how much change you want to take notice of; for example, the Delta E must be more than 3, or whatever. That will tell you which pixels changed from the first frame. If you want, you can compute Delta E from the prior frame, or from a frame 30 frames back, instead of from the very first frame.
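In code, that's roughly the following, where firstFrame and currentFrame are placeholder names and rgb2lab is from the Image Processing Toolbox:
  lab1 = rgb2lab(firstFrame);                % reference frame in LAB
  lab2 = rgb2lab(currentFrame);              % later frame in LAB
  deltaE = sqrt(sum((lab2 - lab1).^2, 3));   % CIE76 Delta E per pixel
  changedMask = deltaE > 3;                  % pixels that changed noticeably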
Regarding the camera: if you don't have a calibration, then you don't know what colors you'll get. Even if you knew the spectral responsivities of the different colored pixels, you can get different RGB values from the same scene. Let's say you're looking at blue and you have (0, 120, 220). Just cut the exposure time in half and now you have (0, 60, 110). Use a fixed, uncalibrated formula for converting RGB into LAB and suddenly you have a totally different LAB, and thus a different Delta E, even though all you did was change the exposure time. If you use a flash it's even worse, because flashes aren't consistent. Run my lab demo - it might help - it's attached.
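You can check that exposure effect numerically with rgb2lab, which assumes sRGB input in [0, 1]:
  labA = rgb2lab([0 120 220] / 255);   % original exposure
  labB = rgb2lab([0  60 110] / 255);   % half the exposure time
  dE   = norm(labA - labB)             % a large Delta E from exposure alone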
