




A color indexing system based on perception (CISBOP)


Gegenfurtner, K. R.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Neumann, D., & Gegenfurtner, K. (2000). A color indexing system based on perception (CISBOP). Poster presented at 3. Tübinger Wahrnehmungskonferenz (TWK 2000), Tübingen, Germany.

Color is an important feature for searching large databases of images, since it is invariant with respect to camera position, object orientation and size, and partial occlusion. Many color-based image indexing systems currently exist (e.g., Flickner et al., 1995), which all basically work by building color histograms in RGB space. Our goal was to construct a color-indexing system based on the known properties of the human color vision system. Our images were chosen from a large, commercially available image database (COREL) consisting of 60,000 digitized photographs. For each image, we built a color histogram by converting the RGB triplet of each pixel into color-opponent coordinates. These luminance, red-green, and yellow-blue coordinates correspond to the color directions found in human color vision (Krauskopf, Williams, & Heeley, 1982). Luminance was averaged for each color value, and the resulting color circle was split into 127 segments. The categories were constructed so that the number of hue categories increased with increasing saturation. Six different rings were used for saturation, with the radius doubling as saturation increases. Thus, there is little discrimination of hues for unsaturated colors, whereas there are 64 different hues at the most saturated level. Two different histograms were built: one using the frequencies with which the different colors occurred, and another using the average luminance level of each color segment. We used a query-by-example strategy for searching. Several different distance measures were evaluated by asking human observers to make similarity judgments. For most images, this search, based on color alone, retrieves images that are perceptually and often semantically similar to the target image.
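The binning scheme described above (an achromatic center plus six saturation rings whose hue resolution doubles per ring, giving 2 + 4 + 8 + 16 + 32 + 64 = 126 chromatic bins and 127 segments in total) could be sketched roughly as follows. This is a minimal illustration, not the poster's implementation: the linear opponent-axis conversion, the achromatic threshold, and the ring radii are hypothetical placeholders, since a proper DKL-space conversion (Krauskopf et al., 1982) depends on display calibration and cone fundamentals, and the poster does not give exact thresholds. The histogram-intersection distance shown is likewise just one plausible candidate among the several measures the poster evaluated.

```python
import math
from collections import Counter

def rgb_to_opponent(r, g, b):
    """Crude linear stand-in for the luminance / red-green / yellow-blue
    opponent axes (the real conversion is calibration-dependent)."""
    lum = (r + g + b) / 3.0
    rg = r - g                 # red-green axis
    yb = (r + g) / 2.0 - b     # yellow-blue axis
    return lum, rg, yb

# Hypothetical ring layout: saturation radius doubles per ring, and the
# number of hue bins doubles per ring up to 64 at the most saturated level.
RING_RADII = [4, 8, 16, 32, 64, 128]
HUES_PER_RING = [2, 4, 8, 16, 32, 64]

def segment_index(rg, yb):
    """Map an opponent-plane point to one of 127 segments
    (bin 0 = achromatic center, bins 1..126 = chromatic)."""
    sat = math.hypot(rg, yb)
    if sat < 1.0:              # hypothetical achromatic threshold
        return 0
    hue = math.atan2(yb, rg) % (2 * math.pi)
    offset = 1
    for radius, n_hues in zip(RING_RADII, HUES_PER_RING):
        if sat <= radius:
            return offset + int(hue / (2 * math.pi) * n_hues) % n_hues
        offset += n_hues
    # Saturation beyond the outermost radius: clamp into the last ring.
    n_hues = HUES_PER_RING[-1]
    return offset - n_hues + int(hue / (2 * math.pi) * n_hues) % n_hues

def color_histogram(pixels):
    """Normalized 127-bin frequency histogram of an iterable of RGB triplets."""
    counts = Counter(segment_index(*rgb_to_opponent(r, g, b)[1:])
                     for r, g, b in pixels)
    total = sum(counts.values()) or 1
    return [counts.get(i, 0) / total for i in range(127)]

def histogram_distance(h1, h2):
    """Histogram intersection distance: 0 for identical histograms."""
    return 1.0 - sum(min(a, b) for a, b in zip(h1, h2))
```

For query-by-example retrieval, each database image would be reduced offline to such a histogram, and the query image's histogram compared against all of them, returning the images with the smallest distance.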