The relentless advancement of imaging and display technology has greatly benefited many industrial and consumer fields. Tools for objectively determining image quality have not progressed as quickly. Several conventional techniques exist for assessing the relative improvement in image quality after an image-enhancing algorithm has been applied to a digital image. Testing of enhancement effects often consists of subjective quality assessments, such as measuring the ability of an automatic target detection program to find a target before and after enhancement. While a particular algorithm may make an image appear substantially better, there is no indication of whether the improvement is significant enough to improve human visual performance. A better assessment technique would remove the human from the loop while still modeling human visual resolution.
Air Force researchers have addressed this problem by developing a novel image quality assessment technique and algorithms that obtain an image, select a specific area of it, and break that area down into pixels of two differing colors. For each corner of the resulting binary image, a distance is calculated from the corner to the nearest tagged pixel in a row or column of pixels. The two shortest of these distances are then used to determine the orientation of the object defined by the tagged pixels. This models the human experience of determining the direction of an “E” on an eye chart. The calculated orientation can be compared to the known orientation and a score applied.
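The published abstract does not specify the distance metric or how ties are resolved, so the sketch below is only one plausible reading of the corner-distance step: it uses Euclidean distance from each corner of the binary patch to the nearest tagged pixel, and takes the two closest corners as indicating the orientation of an elongated object. All function names and the orientation labels are illustrative, not taken from the patent.

```python
import numpy as np

def corner_distances(img):
    """Euclidean distance from each corner of a binary image to the
    nearest tagged (nonzero) pixel. A simplified stand-in for the
    corner-distance calculation described in the patent abstract."""
    h, w = img.shape
    ys, xs = np.nonzero(img)  # coordinates of all tagged pixels
    corners = {
        "top_left": (0, 0),
        "top_right": (0, w - 1),
        "bottom_left": (h - 1, 0),
        "bottom_right": (h - 1, w - 1),
    }
    return {name: float(np.hypot(ys - r, xs - c).min())
            for name, (r, c) in corners.items()}

def estimate_orientation(img):
    """The two corners closest to tagged pixels approximate the axis
    along which the object lies."""
    d = corner_distances(img)
    nearest = set(sorted(d, key=d.get)[:2])  # two shortest distances
    if nearest == {"top_left", "bottom_right"}:
        return "diagonal"
    if nearest == {"top_right", "bottom_left"}:
        return "anti-diagonal"
    if nearest in ({"top_left", "top_right"},
                   {"bottom_left", "bottom_right"}):
        return "horizontal"
    return "vertical"

# A diagonal bar target: the two corners it runs between are closest.
bar = np.eye(5)
print(estimate_orientation(bar))             # diagonal
print(estimate_orientation(np.fliplr(bar)))  # anti-diagonal
```

In the assessment loop described above, the estimated orientation would be compared against the target's known orientation, and agreement across many trials would yield the quality score.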
- This “morphological approach” is more computationally efficient and more accurate than other contemporary methods
- Assesses imagery from a variety of sensors, including visible, near-infrared, short-wave infrared, and thermal
- Assesses imagery that has algorithmically combined (fused) information from multiple sensors
- Variations among different multispectral sensors and among image registration, fusion, and enhancement algorithms may be accurately and automatically assessed in real time
- Published US application number 20160330438 and US patent 9,679,216 available for license
- Potential for collaboration with Air Force researchers