
I wonder if it would be possible to write an app that does this with a smartphone camera? The idea: take photos of some reference objects of known color under sunlight, then photograph those same objects under the lighting you are testing, and estimate from the differences between the two sets of photos what the CRI must be.

The reference objects could be well-known, easy-to-obtain household items, like a Pepsi can, a plastic bottle of Tide detergent, or a box of Arm & Hammer baking soda. If the app included a reasonably sized database of such items, there would be a good chance an average user already has a suitable set of color references on hand.
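The comparison step could be sketched roughly like this. Everything here is an assumption for illustration: the patch RGB averages are made up, the gray-world normalization is a crude stand-in for exposure/white-balance correction, and the 0-100 scaling is not a real CRI computation (CRI is defined over spectral measurements, not camera RGB).

```python
import math

def normalize(rgb):
    # Crude exposure/white-balance compensation: scale so channels sum to 1,
    # keeping only the chromaticity of the patch.
    s = sum(rgb) or 1.0
    return tuple(c / s for c in rgb)

def render_score(sun_patches, lamp_patches):
    """Average color shift across reference patches, mapped to 0-100."""
    diffs = []
    for sun, lamp in zip(sun_patches, lamp_patches):
        diffs.append(math.dist(normalize(sun), normalize(lamp)))
    # The distance between normalized chromaticities is at most sqrt(2).
    return max(0.0, 100.0 * (1.0 - sum(diffs) / len(diffs) / math.sqrt(2)))

# Made-up average RGBs: a Pepsi-can red and a Tide-bottle orange,
# under sunlight and under the lamp being tested.
sun  = [(180, 30, 40), (230, 120, 20)]
lamp = [(170, 35, 45), (200, 140, 30)]
print(round(render_score(sun, lamp), 1))
```

A real app would also need to locate the reference object in the frame and cope with the phone's own auto white balance, which actively fights this kind of measurement unless raw capture is used.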




I'm afraid that phone camera sensors are filtered into R/G/B channels, so a strong monochromatic orange, for example, may end up under-represented or even absent in the captured image, and will likely be hard to distinguish from a mixture of red and green light.

Maybe modern image processing techniques can figure it out, but this seems like a huge problem to me.
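The underlying issue (metamerism) can be shown with a toy model. The Gaussian channel sensitivities below are invented for illustration, not real camera curves: a monochromatic orange line and a suitably weighted mix of a red line and a green line produce nearly the same three-channel response, even though their spectra are completely different.

```python
import math

def channel(center, width=40.0):
    # Hypothetical broadband sensitivity curve for one color channel.
    return lambda nm: math.exp(-((nm - center) / width) ** 2)

R, G, B = channel(600), channel(530), channel(460)

def response(lines):
    """Sensor triple for a light made of (wavelength, power) spectral lines."""
    return tuple(sum(p * ch(nm) for nm, p in lines) for ch in (R, G, B))

orange = [(590, 1.0)]  # monochromatic orange

# Solve a 2x2 linear system so a 620 nm + 545 nm mixture matches the
# orange's R and G responses exactly (B is near zero for all three lines).
r_o, g_o, _ = response(orange)
a11, a21 = R(620), G(620)
a12, a22 = R(545), G(545)
det = a11 * a22 - a12 * a21
w_red   = (r_o * a22 - g_o * a12) / det
w_green = (a11 * g_o - a21 * r_o) / det
mix = [(620, w_red), (545, w_green)]

print(response(orange))
print(response(mix))  # R and G match; the spectra are entirely different
```

Disentangling such metamers from three channels alone is under-determined, which is why dedicated CRI meters use a spectrometer rather than an RGB sensor.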



