IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 21, NO. 3, MARCH 2012
Toward a Unified Color Space for Perception-Based Image Processing
Ingmar Lissner and Philipp Urban
Abstract—Image processing methods that utilize characteristics of the human visual system require color spaces with certain properties to operate effectively. After analyzing different types of perception-based image processing problems, we present a list of properties that a unified color space should have. Due to contradictory perceptual phenomena and geometric issues, a color space cannot incorporate all these properties. We therefore identify the most important properties and focus on creating opponent color spaces without cross contamination between color attributes (i.e., lightness, chroma, and hue) and with maximum perceptual uniformity induced by color-difference formulas. Color lookup tables define simple transformations from an initial color space to the new spaces. We calculate such tables using multigrid optimization considering the Hung and Berns data of constant perceived hue and the CMC, CIE94, and CIEDE2000 color-difference formulas. The resulting color spaces exhibit low cross contamination between color attributes and are only slightly less perceptually uniform than spaces optimized exclusively for perceptual uniformity. We compare the CIEDE2000-based space with commonly used color spaces in two examples of perception-based image processing. In both cases, standard methods show improved results if the new space is used. All color-space transformations and examples are provided as MATLAB codes on our website.

Index Terms—Color space, hue linearity, perceptual uniformity.
I. INTRODUCTION
Perception-based image processing methods require color spaces with certain properties to operate effectively. Engineers can choose from a variety of color spaces. This requires profound knowledge of the spaces' perceptual characteristics and the requirements of the particular application. Identifying the most frequently required properties and incorporating them into a unified color space would simplify this selection. We focus on 3-D color spaces [1], where each point is defined by three unique color attributes: lightness, chroma, and hue. The axes and the arrangement of the colors define the perceptual properties of the space. Our aim is to create a color space that can be used to improve perception-based image processing methods by simply replacing their working color spaces. We use an empirical approach considering certain physiological aspects of the human visual system. In the following, we analyze the color-space requirements of different perception-based image processing problems. Please note that we only discuss a small subset of potential applications.

Manuscript received May 05, 2011; accepted July 15, 2011. Date of publication August 04, 2011; date of current version February 17, 2012. This work was supported in part by the German Research Foundation. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Xiaolin Wu. The authors are with the Institute of Printing Science and Technology, Technische Universität Darmstadt, 64289 Darmstadt, Germany (e-mail: [email protected]; [email protected]). Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TIP.2011.2163522

A. Lossy Image Compression
In lossy image compression, it is desirable to remove imperceptible details from the images. Two perceptual phenomena have to be considered when selecting an appropriate color space:
1) The contrast sensitivity at high spatial frequencies is higher for the achromatic channel than for the chromatic channels [2], [3]. An individual treatment of these channels is simplified by a working color space with one achromatic and two chromatic axes. In addition, changes in the achromatic channel must not have a perceptible influence on the chromatic channels and vice versa (i.e., absence of cross contamination; a definition with respect to color attributes is given in Definition 3 in the Appendix).
2) Not all colors in a color space are distinguishable. A space with a metric (distance formula) that accurately predicts perceived color differences allows for perception-based color quantization: indistinguishable colors can be reduced to a single color without affecting the appearance of the image. In this context, a color space where the Euclidean metric accurately predicts perceived color differences is particularly beneficial. This implies that equal-distance contours around colors are equally sized spheres. Such an arrangement of colors facilitates efficient quantization because it is straightforward to determine colors below the just noticeable distance or the just tolerable distance. A space with this property is called perceptually uniform (see Definition 1 in the Appendix).

B. Color Gamut Mapping
Color gamut mapping denotes the mapping of an image's colors to colors that are reproducible on a specific device, e.g., a display or a printer.
A typical objective of such a transformation is to minimize the perceived difference between the original and the reproduction. Preserving the relative differences between colors is often more important than preserving their absolute values. In [4], three properties of a color space are identified as particularly advantageous to gamut-mapping algorithms:
1) To preserve the perceived relations between colors to the greatest extent, a controlled and separable adjustment of lightness, chroma, and hue is required, e.g., the ability to
change lightness and chroma while preserving hue. Therefore, a color space with lightness, chroma, and hue axes can be directly used by gamut-mapping algorithms. Changing the value of a predicted attribute must not have a perceptible influence on the others (i.e., absence of cross contamination; see Definition 3 in the Appendix).
2) To maintain the perceived relations between the image's colors, the predicted color attributes of the space should be linearly related to the corresponding perceived attributes. If, for instance, a gamut-mapping algorithm reduces a color's chroma by 50%, the perceived chroma should be reduced by the same amount.
3) Perceptual uniformity is also desirable because it is easier to preserve color-difference ratios if Euclidean distances agree with perceived color differences. In addition, the representation of gamut boundaries in a perceptually uniform color space enables a more meaningful gamut comparison. This is particularly useful for estimating the magnitude of color changes due to gamut differences.

C. Segmentation
Image segmentation algorithms partition an image by grouping pixels with common visual characteristics. In color-related applications, the most important characteristics are the attributes of colors and the differences between colors. Therefore, a color space for image segmentation should possess the following properties: separate lightness, chroma, and hue axes for easy access to color attributes, and perceptual uniformity to allow a simple evaluation of perceived color differences [5], [6].

D. Denoising
Reconstructing the original image from a noisy copy is a common problem in perception-based image processing. The main challenge is to remove high-frequency noise while preserving edges. The contrast sensitivity of the human visual system differs between the achromatic and chromatic channels (see Section I-A). As a result, noise is less noticeable in the chromatic channels [7].
A color space that enables individual denoising of these channels is preferable; this implies separate achromatic and chromatic axes without cross contamination. Some denoising algorithms also utilize perceptual uniformity [8].

E. Characterizing and Optimizing Imaging Systems
The characterization and optimization of imaging systems requires an objective function that often depends on color differences. An example is the optimization of inks to maximize the number of colors that are reproducible by a printer. This can be seen as maximizing the printer's gamut volume in a perceptually uniform color space [9]. Another example is color correction, i.e., the transformation of RGB camera responses to a device-independent color space (e.g., sRGB). The most important objective of such a transformation is to minimize the perceived color errors, which is simplified by using a perceptually uniform color space [10].
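To illustrate why a (near-)uniform space simplifies such objectives: in CIELAB, the perceived error of a color correction can be approximated by the Euclidean distance ΔE*ab. The following is a minimal Python sketch (the paper provides MATLAB code; the helper names here are ours, and the CIELAB conversion shown is the standard one):

```python
import numpy as np

# Reference white (D65, 2-degree observer), used to normalize XYZ.
WHITE = np.array([95.047, 100.0, 108.883])

def xyz_to_lab(xyz, white=WHITE):
    """Convert a CIEXYZ color to CIELAB (approximately perceptually uniform)."""
    t = np.asarray(xyz, float) / white
    eps = (6.0 / 29.0) ** 3
    # Piecewise cube-root nonlinearity of the CIELAB definition.
    f = np.where(t > eps, np.cbrt(t), t / (3 * (6.0 / 29.0) ** 2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return np.array([L, a, b])

def delta_e_ab(lab1, lab2):
    """Euclidean distance in CIELAB: the classic Delta E*ab error measure."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))
```

A characterization objective can then be stated simply as the mean `delta_e_ab` over a set of training patches, because Euclidean distances in the working space approximate perceived errors.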
F. Image Enhancement
Image enhancement denotes the adjustment of an image to meet certain aesthetic criteria (e.g., a highly chromatic blue sky is preferred to a gray sky). Simple access to color attributes is therefore important. In addition, the predicted color attributes of the space should be free of cross contamination and linearly related to perceived attributes. Perceptual uniformity is also desirable.

G. General Considerations
There are various other properties that may be required by image processing methods. Typically, the RGB colors of an input image are transformed to the working color space of the method. The resulting image is then transformed back to an RGB color space for storage or display. This requires the transformation to be invertible. In addition, low computational complexity is beneficial. In applications where color differences are important, they should agree with standardized color-difference formulas (e.g., CMC [11], CIE94 [12], or CIEDE2000 [13]). These formulas are used to define tolerance thresholds in various international and industrial standards. A color space where the Euclidean metric reflects a color-difference formula would therefore be convenient, even if the formula does not perfectly predict perceived color differences.

H. Conclusion and Color-Space Examples
In many perception-based image processing applications, a color space is desirable where:
1) One axis represents the achromatic channel.
2) The access to color attributes (i.e., lightness, chroma, and hue) is simple.
3) Changing a color attribute has no perceptible effects on the other attributes (no cross contamination).
4) The changes of predicted color attributes are linearly related to the changes of perceived attributes.
5) The Euclidean metric agrees with perceived color differences (perceptual uniformity) or, at least, with standardized color-difference formulas.
6) The transformation to standardized RGB color spaces is invertible and has low computational complexity.
Color spaces that possess some of these properties and that are used by perception-based methods incorporate Young and Helmholtz's three-photoreceptor model [1] and Hering's opponent color representation [14] with black–white, blue–yellow, and red–green axes. To account for the nonlinear processing of the human visual system, the spaces are based on nonlinear transformations of intensity-linear signals (e.g., linear RGB, CIEXYZ, or LMS values). Predictions of color attributes can be calculated using a transformation from Cartesian to cylindrical coordinates. Examples of spaces that are commonly used in perception-based image processing include HSL, YCbCr, CIELAB, and IPT [15] (HSL is already represented in cylindrical coordinates). These spaces were designed to possess some of the properties listed above. For instance, the transformation of gamma-adjusted RGB colors to HSL or YCbCr has low computational
complexity, which is particularly important for video processing (e.g., MPEG compression). However, there is cross contamination between the achromatic and chromatic axes [16], and the spaces are not perceptually uniform (see Fig. 8). The CIELAB color space, which is designed to be perceptually uniform, is not free of cross contamination between chroma and hue (particularly in the red and blue regions [17]). In addition, visual experiments illustrate CIELAB’s lack of perceptual uniformity [18], [19]. As a result, the space is insufficient for many color-related applications. IPT is designed to be free of cross contamination between chroma and hue, often referred to as hue linearity. The transformation from LMS signals to IPT is simple and has low computational complexity. However, noise experiments show that there is cross contamination between IPT’s achromatic and chromatic axes [20], and that its perceptual uniformity is only slightly better than that of CIELAB [21]. In summary, the structure of the above spaces is similar, but each space lacks some of the desired properties. In the next section, we investigate the possibility of creating a color space that combines all of these properties. II. PITFALLS AND LIMITATIONS Color perception is far too complex to be modeled comprehensively by a 3-D color space. Particularly, the viewing conditions have a tremendous effect on how we perceive colors. Traditional color spaces assume a fixed set of viewing conditions (illuminant, luminance level, surround, etc.). If the viewing conditions change, the perceptual properties of the spaces change as well. To account for variable viewing conditions, the so-called color appearance models (CAMs) were developed [22]. Using chromatic adaptation and other transformations, these models normalize the stimuli to specific viewing conditions and consider various color appearance phenomena. The results are coordinates in an opponent color space or the corresponding color attributes. 
Depending on the viewing conditions, a CAM maps similar stimuli to different color attributes. The most recent CAM proposed by the CIE is CIECAM02 [23], [24]. CAMs do not consider the arrangement of colors in images and its effect on our perception. This is particularly important when calculating color differences between pixels using color-difference formulas. These formulas are based on experiments with simple color patches on a uniform background. When viewing images, the conditions are usually different (stimulus size, surround, luminance level, illuminant, etc.), and the color-difference predictions do not correlate well with our perception [25]. To address this, image appearance models (IAMs) were developed as an extension of CAMs. Most IAMs transform each pixel into an opponent color space as well, taking into account different viewing conditions and the contrast sensitivity of the human visual system. Examples of IAMs include S-CIELAB [26] and iCAM [27], [28]. S-CIELAB transforms each pixel into CIELAB, whereas iCAM uses IPT. Color differences can be predicted using color-difference formulas [29] or the Euclidean metric if the color space is perceptually uniform. A color space with the properties from Section I-H could be used by CAMs or IAMs to represent the predicted color attributes. However, CAMs and IAMs are still an active research area, and their prediction accuracy can be improved. In addition,
we cannot expect that images are transformed by an IAM beforehand, not to mention that, in most applications, the viewing conditions cannot be accurately defined. Therefore, even if a color space possesses all these properties, it may not perform optimally because the assumed conditions usually differ from the actual conditions. Even if the viewing conditions are perfectly matched, the question remains whether we can arrange colors in a 3-D space to induce the desired properties. Indeed, the literature indicates that some of the properties from Section I-H cannot be realized. For instance, visual experiments show that color-difference perception is most likely not Euclidean, which means that a perceptually uniform color space does not exist [1], [30]. The reason is that a Riemannian space based on experimental data has nonzero Gaussian curvature, which is an intrinsic local property of a space and invariant to isometric (length-preserving) transformations (Theorema Egregium [31], [32]). Since a Euclidean space has zero Gaussian curvature, an isometric embedding of a space with nonzero Gaussian curvature into a Euclidean space is impossible. Urban et al. have shown that the CIELAB-based CMC, CIE94, and CIEDE2000 formulas have nonzero Gaussian curvature as well [33]. The following observation provides additional evidence that a perceptually uniform color space does not exist: Let x and y be colors in a color space with perceived distance d(x, y). Let z be a third color on the geodesic curve (shortest path) between x and y satisfying d(x, z) = d(z, y), where d(x, z) and d(z, y) are perceived differences as well. In a color space that can be isometrically transformed from a Euclidean space, we could calculate d(x, y) = d(x, z) + d(z, y). Unfortunately, experiments show that the distance d(x, y) is judged significantly smaller than the sum of the perceived color differences d(x, z) and d(z, y).
This phenomenon—known as diminishing returns in color-difference perception [34]—makes it impossible to arrange colors such that their Euclidean distances match their perceived differences for small, medium, and large differences at the same time. There are also global color arrangement problems caused by the structure of an opponent color space. If we change the hue of a color with low chroma (close to the gray axis), the perceived color difference is small. If we change the hue of a highly chromatic color by the same amount, the resulting color shift is much more noticeable. This also affects perceived hue differences between colors on neighboring constant-hue lines: These differences vary depending on the chroma level (this phenomenon is called paradox of hue differences) [1], [35]. Therefore, predicted color attributes cannot be linearly related to perceived attributes in a color space without cross contamination. In conclusion, we have to relax some of the requirements. In the next section, we discuss the importance of individual properties and propose a method to combine them in a unified color space. III. CONSTRUCTING THE COLOR SPACE The structure of the new color space should reflect human physiology, which implies an opponent space. This agrees with the structure of the spaces mentioned in Section I-H. Many perception-based image processing algorithms consider the different achromatic and chromatic spatial frequency responses of the human visual system, e.g., the contrast sensitivities. Linear filtering with the contrast sensitivity functions
(to remove imperceptible content from an image) should be performed in a color space that is linearly related to the LMS cone-excitation space [16]. However, such intensity-linear spaces do not provide simple access to color attributes and are far from being perceptually uniform. A crucial property of a unified color space used for further processing is the absence of cross contamination between the achromatic and chromatic axes. Noise-matching as well as noise-visibility experiments (threshold and suprathreshold) show that "CIE luminance, Y, is highly correlated with the most important dimension in noise perception" [16]. Therefore, a good starting point would be a color space with a lightness axis directly derived from CIE luminance [16]. CIELAB is such a space and shows only low cross contamination between the lightness and chromatic components, but it has other shortcomings, as described above. Please note that this design introduces some cross contamination due to the Helmholtz–Kohlrausch effect (i.e., perceived lightness depends not only on luminance but also on chromaticity [22]). If required, the lightness predictor can be modified to account for this effect [22].

Perceptual uniformity is advantageous to almost any perception-based image processing application. Unfortunately, it cannot be realized for different magnitudes of color differences due to the phenomenon of diminishing returns in color-difference perception. This means that we have to choose a distance range for which the perceptual uniformity is optimized. Optimizing the prediction of large color differences would cause small differences to be underestimated by the Euclidean metric. This would complicate the fine-tuning of perception-based image processing algorithms. We therefore believe that perceptual uniformity should be adjusted to small suprathreshold color differences, although medium and large differences are overestimated as a consequence.
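The tradeoff can be illustrated with a toy model (our own, purely hypothetical): assume perceived difference grows as a concave function of coordinate distance, e.g., a square root. Then the directly perceived difference between two distant colors is smaller than the sum of the perceived differences along the path, exactly the diminishing-returns behavior described above:

```python
import numpy as np

# Hypothetical concave response: perceived difference vs. coordinate distance.
perceived = np.sqrt

d_total = 10.0                        # coordinate distance between colors x and y
d_half = d_total / 2.0                # midpoint z on the connecting line

direct = perceived(d_total)           # perceived d(x, y)
via_midpoint = 2 * perceived(d_half)  # d(x, z) + d(z, y)

# Diminishing returns: the direct difference is judged smaller than the sum.
print(direct < via_midpoint)          # True

# Consequently, a space scaled so that SMALL steps are Euclidean will
# overestimate LARGE differences: summing small steps along the path
# yields more than the directly perceived difference.
```

No single rescaling of the axes can make both the small steps and the large difference Euclidean at once, which is why a working range must be chosen.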
Another question is how to incorporate experimental color-discrimination data into the space. The simplest way is to use a color-difference formula as a reference of perceptual uniformity because such formulas are fitted to visual data (e.g., the RIT-DuPont [18], BFD [19], Leeds [36], and Witt [37] data sets) and are designed to predict small suprathreshold color differences. Although their predictions are not perfectly accurate, they significantly improve the perceptual uniformity of all spaces listed in Section I-H. In addition, this approach would be advantageous to some applications, as described in Section I-G. If the prediction performance of existing color-difference formulas is deemed insufficient, it can be improved with additional visual data using Gaussian process regression [38], [39]. In this paper, we only use the standardized CMC, CIE94, and CIEDE2000 formulas for better reproducibility.

Urban et al. created Euclidean color spaces (LABCMC, LAB94, and LAB2000) with minimal isometric disagreement with the corresponding CIELAB-based color-difference formulas [33]. However, these spaces show significant cross contamination between chroma and hue [40], which is undesirable in many applications (e.g., gamut mapping and image enhancement). To construct a unified color space with low cross contamination for perception-based image processing, we need to relax the requirement of perceptual uniformity.
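How far a space's Euclidean metric deviates from a reference formula can be quantified numerically. The sketch below (ours, not from the paper) implements the CIE94 formula with its standard graphic-arts weights and a STRESS-like disagreement index with least-squares scaling; applying it to Euclidean CIELAB distances of sampled color pairs gives a nonzero score that measures CIELAB's disagreement with CIE94:

```python
import numpy as np

def cie94(lab1, lab2, kl=1.0, k1=0.045, k2=0.015):
    """CIE94 color difference (graphic-arts weights) between two CIELAB colors."""
    lab1, lab2 = np.asarray(lab1, float), np.asarray(lab2, float)
    dL = lab1[0] - lab2[0]
    c1 = np.hypot(lab1[1], lab1[2])
    c2 = np.hypot(lab2[1], lab2[2])
    dC = c1 - c2
    da, db = lab1[1] - lab2[1], lab1[2] - lab2[2]
    dH2 = max(da * da + db * db - dC * dC, 0.0)  # squared hue difference
    sc, sh = 1.0 + k1 * c1, 1.0 + k2 * c1        # chroma/hue weighting
    return float(np.sqrt((dL / kl) ** 2 + (dC / sc) ** 2 + dH2 / sh ** 2))

def stress(de, dv):
    """STRESS-like index (0 = perfect agreement up to scale) between two
    sets of distances, using the least-squares optimal scaling factor."""
    de, dv = np.asarray(de, float), np.asarray(dv, float)
    f = np.sum(de * dv) / np.sum(dv * dv)
    return 100.0 * np.sqrt(np.sum((de - f * dv) ** 2) / np.sum((f * dv) ** 2))
```

Because the index is scale invariant, a space whose Euclidean distances are merely a rescaled copy of the formula's predictions scores zero; residual disagreement indicates genuine non-uniformity with respect to the formula.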
Due to the paradox of hue differences, we have to sacrifice the desired linear relation between changes of predicted and perceived color attributes to keep the cross contamination low. In summary, we focus on constructing opponent color spaces with low cross contamination between color attributes (i.e., lightness, chroma, and hue) and maximum perceptual uniformity with respect to standardized color-difference formulas. In the following, we show how to incorporate these global (absence of cross contamination) and local (perceptual uniformity) properties into new color spaces. Instead of fitting the parameters of predefined functions to experimental data, we employ an approach by Urban et al. [33] and use color lookup tables (CLUTs) to transform a common color space into the new spaces with the desired properties. Such transformations can be accurately adjusted to visual data and have extremely low computational complexity, which is why they are extensively used in color management systems (e.g., ICC [41]). We ensure that the resulting CLUT transformations are invertible and show that the inverse transformations can be realized as CLUTs as well. In a previous attempt to create a hue-linear color space with a high degree of perceptual uniformity [40], the hue correction was limited to the blue region of the CIELAB color space. In addition, we did not allow chroma changes during the perceptual-uniformity optimization. An in-depth analysis of various image processing problems and their color-space requirements (see Section I) was also not provided in that paper.

A. Preconditions
Our method requires an initial color space where the lightness predictor is independent of the chromatic predictors. An example of such a space is CIELAB, as described above. A color-difference formula defined on this space is required as well.
Its predictions should correlate well with perceived color differences, and it has to treat lightness differences independently of chroma and hue differences. CMC, CIE94, and CIEDE2000 are examples of such formulas. We also need visual data quantifying constant perceived hue in the selected color space, e.g., the lines of constant perceived hue collected by Hung and Berns in [42]. The data were interpolated and extrapolated to cover the (a*, b*)-plane in CIELAB in [15]. Please note that these constant-hue loci were determined using a cathode-ray tube (CRT) display and may differ for object colors.

B. Lightness Transformation
We transform the CIELAB lightness L* into the new lightness L using the color-difference formula ΔE. Note that the formula must predict lightness differences independently of chroma and hue differences. As a result, the achromatic axis, which is the new lightness axis, depends only on the CIE luminance and remains independent of the chromatic axes. To calculate the new lightness axis L, we integrate the color-difference formula along the lightness dimension [33]. The integration can be approximated by a cumulative sum for a set of lightness values 0 = L*_0 < L*_1 < ... < L*_N = 100:

  L_i = sum_{j=1}^{i} ΔE((L*_{j-1}, 0, 0)^T, (L*_j, 0, 0)^T),   (1)

where L_0 = 0. The results can be stored in a 1-D lookup table with N + 1 entries. Intermediate points are determined by interpolation.
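The cumulative-sum construction can be sketched as follows. This is our own minimal Python illustration (the paper provides MATLAB code): for purely achromatic colors, the CIEDE2000 difference reduces to ΔL'/S_L with the standard lightness weighting S_L, so the sum in (1) needs only that term:

```python
import numpy as np

def sl(lbar):
    """CIEDE2000 lightness weighting S_L at mean lightness lbar."""
    return 1.0 + 0.015 * (lbar - 50.0) ** 2 / np.sqrt(20.0 + (lbar - 50.0) ** 2)

def lightness_lut(n=1001):
    """1-D LUT mapping CIELAB L* in [0, 100] to the new lightness L by a
    cumulative sum of CIEDE2000 differences between neighboring achromatic
    colors, cf. Eq. (1)."""
    lstar = np.linspace(0.0, 100.0, n)
    dl = np.diff(lstar)
    lbar = 0.5 * (lstar[:-1] + lstar[1:])   # mean lightness of each step
    de = dl / sl(lbar)                      # CIEDE2000 for pure lightness steps
    return lstar, np.concatenate(([0.0], np.cumsum(de)))

def new_lightness(lstar_query, lut):
    """Evaluate intermediate points by linear interpolation in the LUT."""
    lstar, L = lut
    return np.interp(lstar_query, lstar, L)
```

The resulting mapping is strictly monotonic, so it is invertible; the inverse is obtained by interpolating the same table with the roles of the columns swapped.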
C. Chroma and Hue Transformation
The chroma and hue transformation is much more complex. We have to maximize the degree of perceptual uniformity while ensuring the absence of cross contamination between chroma and hue. For this purpose, we use an iterative multigrid optimization starting from a grid that covers the plane of constant lightness in the initial color space. The grid's vertices are shifted in each iteration to achieve the desired properties. The initial grid and the grid resulting from the optimization are combined in a lookup table defining the chroma and hue transformation. The concept is similar to the approach in [33]. However, the multigrid optimization was fundamentally modified to incorporate the additional global properties. A key novelty is the combination of an elastic and an inelastic grid to prevent cross contamination between chroma and hue. A step-by-step description of the optimization is provided in the following, starting with the terminology.

1) Terminology: To simplify the notation, we define the set of hue angles H = [0, 2*pi).

Definition 1 (Grid and Vertex): Let I be an index set. Grid G is defined as a transformation

  G: I -> [0, inf) x H, (i, j) -> g_ij = (C_ij, h_ij)^T,   (2)

where g_ij is called a vertex with chroma C_ij and hue angle h_ij.

Definition 2 (Mesh): Let G be a grid defined on the index set I, and let V_ij be the following set of neighboring vertices:

  V_ij = {g_ij, g_(i+1)j, g_i(j+1), g_(i+1)(j+1)}.   (3)

The corresponding mesh M_ij is the union of two triangles T_1 and T_2 satisfying int(T_1) ∩ int(T_2) = ∅, where int(T) denotes the largest open subset of T. Each triangle is defined by three different vertices p_1, p_2, and p_3 of V_ij as follows:

  T = conv{p_1, p_2, p_3}.   (4)

This prevents overlapping triangles, as in Fig. 1(a). A valid mesh is shown in Fig. 1(b). This triangle-based mesh definition is required for Definition 4. The elements of V_ij are the vertices of the mesh M_ij.

Fig. 1. (a) Invalid mesh (overlapping triangles). (b) Valid mesh.

Definition 3 (Valid Grid and Grid Domain): Grid G defined on the index set I is called valid if the following condition is satisfied:

  int(M_ij) ∩ int(M_kl) = ∅ for (i, j) ≠ (k, l),   (5)

where int(M_ij) is the largest open set within mesh M_ij. An example of an invalid grid is shown in Fig. 2(a). Fig. 2(b) shows a valid grid with two meshes. The union of the grid's meshes is called the grid domain and is denoted by

  Ω = ∪_(i,j) M_ij.   (6)

Fig. 2. (a) Invalid grid (overlapping meshes). (b) Valid grid.

Definition 4 (Grid Transformation): The vertices g_ij of the invertible valid grid G can be uniquely mapped to the vertices g'_ij of the valid grid G' defined on the same index set by g_ij -> g'_ij. This induces a family of transformations T: Ω -> Ω' between the grid domains of these grids. Each of the transformations has the following form:

  T(x) = T_ij(x)  if x ∈ M_ij,   (7)

where the mesh transformation T_ij is defined as the following triangular interpolation:

  T_ij(x) = (A_1(x) p'_1 + A_2(x) p'_2 + A_3(x) p'_3) / A,   (8)

where p_1, p_2, and p_3 are vertices of mesh M_ij defining a triangle that contains x, and p'_1, p'_2, and p'_3 are the corresponding transformed vertices (see Fig. 3). A_k(x) is the area of the subtriangle defined by x and the two vertices other than p_k, and A is the area of the triangle defined by p_1, p_2, and p_3. As there are several ways to divide a mesh into triangles, transformation T is not unique. In our calculations, a mesh triangulation is determined in the beginning and used in all subsequent transformations.
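The triangular interpolation in (8) is barycentric interpolation, which can be sketched as follows (our own illustrative Python; function names are ours):

```python
import numpy as np

def barycentric_weights(x, pa, pb, pc):
    """Barycentric weights of point x in triangle (pa, pb, pc): each weight is
    the (signed) area of the subtriangle opposite a vertex, normalized by the
    total triangle area."""
    def area2(p, q, r):  # twice the signed area of triangle (p, q, r)
        return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    a = area2(pa, pb, pc)
    return (area2(x, pb, pc) / a,   # weight of pa
            area2(pa, x, pc) / a,   # weight of pb
            area2(pa, pb, x) / a)   # weight of pc

def mesh_transform(x, tri, tri_mapped):
    """Map x from a source triangle to the corresponding target triangle by
    triangular (barycentric) interpolation, cf. Eq. (8)."""
    pa, pb, pc = [np.asarray(p, float) for p in tri]
    wa, wb, wc = barycentric_weights(np.asarray(x, float), pa, pb, pc)
    qa, qb, qc = [np.asarray(p, float) for p in tri_mapped]
    return wa * qa + wb * qb + wc * qc
```

Mapping a point through the identity triangulation returns the point itself, and the transformation is affine within each triangle, which keeps the overall CLUT transformation continuous across triangle edges.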
Fig. 3. (a) Example of a mesh with a (gray) triangle containing the input point. (b) Corresponding mesh containing the transformed point.
Definition 5 (Nested Grids): Let G be a grid with the index set I_G, and let S be a grid with the index set I_S. G and S are called nested grids if they satisfy the following condition: each mesh of G contains exactly one vertex of S, and each mesh of S contains exactly one vertex of G (9). Since G is larger than S, the outermost vertices of G are not enclosed by any mesh of S.

Comment (Polar and Cartesian Vertices): In this paper, grid vertices are expressed in polar coordinates by default. For grid G, the polar coordinates of a vertex are denoted by g_ij = (C_ij, h_ij)^T, according to Definition 1. The corresponding Cartesian coordinates are denoted by ĝ_ij.

2) Create Initial Grid G: We start with grid G with the index set I_G. The grid domain covers the plane of constant lightness in CIELAB. Please note that the distance between two neighboring vertices should not exceed a certain distance to enable color-difference evaluation with our formula (e.g., for an evaluation with CIE94 and CIEDE2000 [43]). Each vertex is a column vector of the chroma value C_ij and the hue angle h_ij as follows:

  g_ij = (C_ij, h_ij)^T.   (10)

All center vertices have zero chroma but different hue angles, i.e.,

  g_0j = (0, h_0j)^T.   (11)

For example, if the Hung and Berns data serve as a reference of constant perceived hue, the center hue angles h_0j are taken from these data. Grid G is designed such that the vertices of each grid column, i.e., the vertices g_ij with fixed j, lie on lines of constant perceived
Fig. 4. (a) Exemplary grid in CIELAB. (b) Exemplary nested grids (transformed and hue corrected) that are used as initial grids for the multigrid optimization.
hue. Fig. 4(a) shows an example of such a grid in the CIELAB (a*, b*)-plane (with a small number of vertices for the sake of clarity).
3) Constant-Lightness-Plane Transformation: Grid G is transformed to a plane of constant lightness in an approximately perceptually uniform color space (with respect to our color-difference formula). This 2-D transformation of chroma and hue is based on the previous work by Urban et al. [33]. Corresponding transformations for the CMC, CIE94, and CIEDE2000 color-difference formulas are available online [44].
4) Hue Correction: A hue correction is then applied to the transformed grid. Since the vertices of each grid column already lie on a line of constant perceived hue, we can set their hue angles to the same value to bring predicted and perceived hues into agreement. Consequently, the hue angle of each
LISSNER AND URBAN: TOWARD A UNIFIED COLOR SPACE FOR PERCEPTION-BASED IMAGE PROCESSING
vertex $\mathbf{x}_{ij}$ on the $i$th constant-hue line, where $j = 1, \dots, m$, is set to the hue angle of the corresponding center vertex with index $i$ as follows:
$$\varphi(\mathbf{x}_{ij}) = \varphi(\mathbf{x}_{i0}), \quad j = 1, \dots, m, \qquad (12)$$
where $\mathbf{x}_{i0} = T_P(\mathbf{c}_i)$ is the corresponding center vertex and $T_P$ is the transformation to the plane of constant lightness, as mentioned above. Grid $\mathcal{G}_2^{(0)}$ is used as an initial grid in the following optimization. It is hue linear (vertices with the same hue angle have the same perceived hue), but the hue correction decreases its perceptual uniformity. This is why a multigrid optimization is performed to increase the perceptual uniformity of $\mathcal{G}_2^{(0)}$.

5) Create the Stabilizing Grid $\mathcal{G}_1^{(0)}$: Before the multigrid optimization, grid $\mathcal{G}_1^{(0)}$ with vertices $\mathbf{y}_{kl}$ is created to stabilize the optimization. Grids $\mathcal{G}_1$ and $\mathcal{G}_2$ are nested grids according to Definition 5: each mesh of a grid contains exactly one vertex of the other grid. In the beginning, the vertices of $\mathcal{G}_1^{(0)}$ coincide with the centers of gravity of their enclosing meshes. During the multigrid optimization, grid $\mathcal{G}_1$ is frequently regenerated to restore this state. Fig. 4(b) shows an example of two such nested grids.

6) Multigrid Optimization: A multigrid optimization is then performed to make the starting grid more perceptually uniform without sacrificing its hue linearity. Figs. 5 and 6 provide an overview and a pseudocode implementation of the multigrid optimization. A detailed description of the method is given in the following. During the multigrid optimization, the vertices of $\mathcal{G}_1$ and $\mathcal{G}_2$ are shifted to minimize the disagreement between Euclidean and $\Delta E$ distances. The distance $\delta_{kl,ij}$ between vertex $\mathbf{y}_{kl}$ of $\mathcal{G}_1$ and a vertex $\mathbf{x}_{ij}$ of its enclosing mesh is defined as
$$\delta_{kl,ij} = \Delta E\bigl(F^{-1}(\mathbf{y}_{kl}), F^{-1}(\mathbf{x}_{ij})\bigr). \qquad (13)$$
Note that we have to transform the grid vertices back to the color space where our color-difference formula is defined (i.e., CIELAB in our case); $F^{-1}$ denotes this transformation. Although $\Delta E$ requires 3-D arguments, we omit the lightness dimension for reasons of simplicity (the lightness difference is zero). We define the disagreement $d_{kl}$ of point $\mathbf{y}_{kl}$ within its enclosing mesh of grid $\mathcal{G}_2$ as follows:
$$d_{kl} = \sum_{(i,j) \in M_2(k,l)} \bigl(\|\mathbf{y}_{kl} - \mathbf{x}_{ij}\|_2 - \delta_{kl,ij}\bigr)^2, \qquad (14)$$
where $M_2(k,l)$ is the index set of the vertices of the enclosing mesh. Note that $d_{kl}$ is the disagreement between the $\Delta E$ and the Euclidean distances of the vertex and the vertices of its enclosing mesh. Disagreements of vertices whose enclosing mesh touches the grid center are only computed from three vertices (the center vertices of the mesh are not included twice since they correspond to the same point). The disagreement $d_{ij}$ of point $\mathbf{x}_{ij}$ within its enclosing mesh of grid $\mathcal{G}_1$ is determined analogously as follows:
$$d_{ij} = \sum_{(k,l) \in M_1(i,j)} \bigl(\|\mathbf{x}_{ij} - \mathbf{y}_{kl}\|_2 - \delta_{ij,kl}\bigr)^2, \qquad (15)$$
where
$$\delta_{ij,kl} = \Delta E\bigl(F^{-1}(\mathbf{x}_{ij}), F^{-1}(\mathbf{y}_{kl})\bigr). \qquad (16)$$

Our iterative algorithm comprises five steps and is divided into an inner and an outer loop. Only steps a and b are performed in the inner loop, which is executed $n_{\text{in}}$ times before entering the outer loop. Steps c, d, and e are only performed in the outer loop, which is executed $n_{\text{out}}$ times (see Fig. 5). The perceived distances $\delta_{kl,ij}$ and $\delta_{ij,kl}$ remain constant in the inner loop; they are only updated in the outer loop.

a) Shift Vertices of $\mathcal{G}_1$: The vertices of $\mathcal{G}_1$ may be shifted in any direction to reduce the disagreement $d_{kl}$. In every iteration, a shift vector in the steepest descent direction is computed for each vertex as follows:
$$\mathbf{s}_{kl} = -\nabla_{\mathbf{y}_{kl}}\, d_{kl}, \qquad (17)$$
where "$\nabla$" is the gradient operator. Vector $\mathbf{s}_{kl}$ is multiplied with a factor $\lambda_{kl}$, which is chroma dependent to stabilize the method. It is then added to $\mathbf{y}_{kl}$ as follows:
$$\mathbf{y}_{kl} \leftarrow \mathbf{y}_{kl} + \lambda_{kl}\,\mathbf{s}_{kl}, \qquad (18)$$
where
$$\lambda_{kl} = \lambda\bigl(C(\mathbf{y}_{kl})\bigr), \qquad (19)$$
which is subject to the constraint that $\mathcal{G}_1$ and $\mathcal{G}_2$ remain nested grids according to Definition 5. The function $\lambda$ remains constant throughout the optimization. Fig. 5 (1) illustrates how the vertices of $\mathcal{G}_1$ are shifted. The optimization continues with the updated grid.

b) Shift Vertices of $\mathcal{G}_2$ in Chroma Direction: The vertices of $\mathcal{G}_2$ are only shifted in chroma direction; the hue angle must stay unchanged to preserve the hue linearity as follows:
$$\mathbf{x}_{ij} \leftarrow \mathbf{x}_{ij} + \alpha_{ij}\,\mathbf{e}_{\varphi_i}, \qquad (20)$$
where $\mathbf{e}_{\varphi_i}$ is the unit vector in chroma direction and the step size $\alpha_{ij}$ is set to
$$\alpha_{ij} = -\lambda_{ij}\,\frac{\partial d_{ij}}{\partial C}, \qquad (21)$$
subject to the constraint that $\mathcal{G}_1$ and $\mathcal{G}_2$ remain nested grids according to Definition 5. Fig. 5 (2) illustrates how the vertices of $\mathcal{G}_2$ are shifted. The optimization continues with the updated grid.
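The inner-loop updates of steps a and b amount to gradient descent on the squared disagreement between Euclidean and perceived distances. A minimal numerical sketch of such a vertex shift follows; the toy data, function names, and central-difference gradient are illustrative assumptions, not the authors' MATLAB implementation:

```python
import numpy as np

def disagreement(p, mesh, de):
    """Squared disagreement between the Euclidean distances from p to the
    mesh vertices and the (fixed) perceived distances de, cf. Eq. (14)."""
    d = np.linalg.norm(mesh - p, axis=1)
    return float(np.sum((d - de) ** 2))

def shift_vertex(p, mesh, de, step=0.1, eps=1e-6):
    """One steepest-descent step for a flexible-grid vertex: move p against
    a central-difference estimate of the disagreement gradient."""
    grad = np.zeros(2)
    for k in range(2):
        dp = np.zeros(2)
        dp[k] = eps
        grad[k] = (disagreement(p + dp, mesh, de)
                   - disagreement(p - dp, mesh, de)) / (2 * eps)
    return p - step * grad

# Toy example: a unit-square mesh with equal perceived distances. The vertex
# then settles near the mesh center, where all four Euclidean distances
# (sqrt(0.5), about 0.707) come closest to the target 0.7.
mesh = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
de = np.full(4, 0.7)
p = np.array([0.3, 0.2])
for _ in range(200):
    p = shift_vertex(p, mesh, de)
```

In the actual method the gradient is taken of the analytic disagreement and the step is additionally constrained so that the grids remain nested.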
Fig. 5. Overview of the computation of the color-space transformation. Indices indicating the current iteration are not shown for the sake of simplicity.
c) Smooth $\mathcal{G}_2$: To stabilize the method, grid $\mathcal{G}_2$ is linearly smoothed using a 3 × 3 Gaussian filter kernel. This smoothing does not affect the hue angles of the vertices; the grid remains hue linear throughout the optimization.

d) Regenerate $\mathcal{G}_1$: In some cases, the vertices of the flexible grid $\mathcal{G}_1$ may come close to the boundaries of their enclosing meshes of $\mathcal{G}_2$. In accordance with Definition 5, the vertices must not leave the meshes. To allow further movement of the vertices of both grids, $\mathcal{G}_1$ is regenerated after each execution of the inner loop. The vertices of the new grid are located at the centers of gravity of their enclosing meshes as follows:
$$\mathbf{y}_{kl} = \frac{1}{4} \sum_{(i,j) \in M_2(k,l)} \mathbf{x}_{ij}. \qquad (22)$$
After $\mathcal{G}_1$ has been regenerated, the perceived distances $\delta_{kl,ij}$ and $\delta_{ij,kl}$ [see (13) and (16)] have to be recomputed using the current grids $\mathcal{G}_1$ and $\mathcal{G}_2$.
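The regeneration step is a simple averaging of mesh vertices. A sketch, assuming the flexible grid is stored as a (rows × cols × 2) array (the array layout and function name are illustrative):

```python
import numpy as np

def regenerate(flex):
    """Place one stabilizing-grid vertex at the center of gravity of each
    mesh of the flexible grid, cf. Eq. (22). flex has shape (r, c, 2);
    the result has shape (r - 1, c - 1, 2)."""
    return 0.25 * (flex[:-1, :-1] + flex[1:, :-1]
                   + flex[:-1, 1:] + flex[1:, 1:])

# A regular 3x3 grid of vertices yields 2x2 mesh centroids.
g = np.stack(np.meshgrid(np.arange(3.), np.arange(3.), indexing="ij"),
             axis=-1)
centers = regenerate(g)
```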
Fig. 6. Pseudocode implementation of the multigrid optimization.

e) Shift Constant-Hue Lines of $\mathcal{G}_2$: In this optimization step, we allow the angles $\varphi_i$ of the constant-hue lines to change if this reduces the disagreement, as defined in (24):
$$\varphi_i \leftarrow \varphi_i + \beta_i, \qquad (23)$$
where the step size $\beta_i$ is set to
$$\beta_i = -\lambda_\varphi\, \frac{\partial}{\partial \varphi_i} \sum_{j=1}^{m} d_{ij}, \qquad (24)$$
where $\lambda_\varphi$ is a constant step factor. The sum in (24) computes a cumulative disagreement for all vertices on the $i$th constant-hue line.

Termination: In our implementation of the optimization, the numbers $n_{\text{in}}$ and $n_{\text{out}}$ of inner- and outer-loop iterations are fixed. More sophisticated termination conditions can be used as well, such as stopping when the mean disagreement or the maximum disagreement of all vertices of $\mathcal{G}_1$ does not change over a certain number of iterations.

Comments on the algorithm: 1) As already mentioned, the method proposed in this paper is based on an approach by Urban et al. [33]. Our new approach presented here uses an additional hue correction, a hue-preserving multigrid optimization, and a circular grid topology. 2) The following analogy may facilitate a better understanding of the multigrid optimization. Imagine a fully opened hand fan whose slats cover 360°. The stiff slats represent the lines of constant hue that are connected at the center, where a hinge allows the slats to rotate independently of each other. The outermost slats are connected to each other to keep the fan open. A net of springs is then attached to the fan using rings that allow the springs to slide freely along the slats. The nodes of this elastic net represent the vertices of grids $\mathcal{G}_1$ and $\mathcal{G}_2$. A spring is stretched if the $\Delta E$ difference between its endings is smaller than their Euclidean distance. If the $\Delta E$ difference is larger than the Euclidean distance, the spring is compressed. At the beginning of the multigrid optimization, the overall tension of the springs is high. If we allow the springs to release their tension, the nodes move to reduce their energy. The nodes attached to a slat may force the entire slat to rotate, thus changing the angle to the neighboring slats. The nodes can also move along the slats they are attached to, preserving the structure of the fan (hue linearity). A stable state is reached if the sum of all forces in each node is zero. However, there may still be tension in the springs, which indicates a disagreement between $\Delta E$ and Euclidean distances caused by nonzero Gaussian curvature or global constraints (hue linearity).

7) Final Chroma and Hue Transformation: The final chroma and hue transformation $T_{CH}$ is defined by the grid transformation (see Section III-C-1, Definition 4), where $\mathcal{G}_2^{(0)}$ is the initial grid and $\mathcal{G}_2^{(n_{\text{out}})}$ is the final grid after all iterations of the multigrid optimization. If $(a^*, b^*)$ values are to be transformed, the transformation can also be defined in Cartesian coordinates.

D. Final Color-Space Transformation

The final color-space transformation from CIELAB to the new color space is defined by a 1-D and two 2-D lookup tables as follows:
$$(L^*, a^*, b^*)_{\text{CIELAB}} \;\mapsto\; \bigl(T_L(L^*),\; T_{CH}(a^*, b^*)\bigr), \qquad (25)$$
where $T_L$ and $T_{CH}$ are described in Sections III-B and III-C-7, respectively. The inverse transformation from the new color space to CIELAB is defined by the inverse lookup tables. They are well defined because the lightness transformation is strictly monotonically increasing and because the chroma and hue transformations are defined by 2-D invertible grids. The inverse transformation is as follows:
$$(L, a, b) \;\mapsto\; \bigl(T_L^{-1}(L),\; T_{CH}^{-1}(a, b)\bigr)_{\text{CIELAB}}. \qquad (26)$$
The 2-D transformations $T_{CH}$ and $T_{CH}^{-1}$ can be approximated by 2-D regular lookup tables in Cartesian coordinates. This increases the computational performance due to the direct access to a point's enclosing mesh and the possible bilinear interpolation using all four mesh vertices. Such 2-D regular lookup tables can be computed using an initial regular Cartesian grid and the grid obtained by transforming all of its vertices by $T_{CH}$ or $T_{CH}^{-1}$, respectively.

IV. RESULTS AND DISCUSSION

Using the proposed method with CIELAB as an initial color space and the Hung and Berns data as a reference of constant hue, we created the following hue linear color spaces: LABCMC-HL (based on CMC), LAB94-HL (based on CIE94), and LAB2000HL (based on CIEDE2000), where "HL" stands for "hue linear." The method was parametrized as shown in Table I. For the sake of simplicity, we focus on LAB2000HL in this section; the other two spaces are only included in the numerical comparison of the perceptual uniformity (see Table II). We compared LAB2000HL with five other spaces: the HSL color space, the YCbCr color space (used by the JPEG compression method), the CIELAB color space, the hue linear IPT color space [15], and the LAB2000 color space [33], which is approximately perceptually uniform with respect to the CIEDE2000 color-difference formula.
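Returning to the regular 2-D lookup tables of Section III-D, their evaluation combines direct access to the enclosing mesh with bilinear interpolation over its four vertices. A minimal sketch (the node spacing, grid origin, and function names are illustrative assumptions):

```python
import numpy as np

def lut_eval(lut, x, y, x0, y0, step):
    """Evaluate a regular 2-D lookup table at (x, y): direct access to the
    enclosing mesh, then bilinear interpolation over its four vertices.
    lut[i, j] holds the mapped value at node (x0 + i*step, y0 + j*step)."""
    fx, fy = (x - x0) / step, (y - y0) / step
    i, j = int(fx), int(fy)            # enclosing mesh found without search
    u, v = fx - i, fy - j              # position inside the mesh
    return ((1 - u) * (1 - v) * lut[i, j] + u * (1 - v) * lut[i + 1, j]
            + (1 - u) * v * lut[i, j + 1] + u * v * lut[i + 1, j + 1])

# Sanity check with an identity table for the x coordinate:
xs = np.arange(-2., 3.)                # nodes at -2, -1, 0, 1, 2
lut_x = np.tile(xs[:, None], (1, 5))   # table value = node x coordinate
val = lut_eval(lut_x, 0.3, -1.2, -2., -2., 1.)   # -> 0.3
```

For an irregular (optimized) grid, the enclosing mesh would have to be searched, which is exactly the cost the regular approximation avoids.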
Fig. 7. Blue gradients computed in six color spaces. The start and end points of all gradients are the same. The start point lies on the gray axis. All gradient colors are inside the sRGB gamut. The intermediate points sample the line between the start and end points equidistantly in each color space (the hue angle remains constant). The gradients were converted to sRGB for display. In a hue linear color space, the perceived hue remains constant along the gradient. (a) HSL. (b) YCbCr. (c) CIELAB. (d) IPT. (e) LAB2000. (f) LAB2000HL.
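The equidistant constant-hue sampling used for such gradients can be sketched in an LCh-type parameterization (the function name and the hue angle of 290° are illustrative assumptions; the gray start point has no hue of its own, so it takes the end point's hue angle):

```python
import numpy as np

def constant_hue_gradient(start_lch, end_lch, n):
    """Sample n colors equidistantly on the line between the start and end
    points of an LCh-type space. The hue angle is held constant, so in a
    hue linear space the perceived hue does not change along the gradient."""
    L = np.linspace(start_lch[0], end_lch[0], n)
    C = np.linspace(start_lch[1], end_lch[1], n)
    h = np.full(n, end_lch[2])   # constant hue angle along the gradient
    return np.stack([L, C, h], axis=1)

# A gray-to-blue ramp of five samples.
grad = constant_hue_gradient((80., 0., 290.), (40., 60., 290.), 5)
```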
TABLE I PARAMETERS OF THE METHOD
Please note that your perception of the colored illustrations in this section may vary depending on the viewing conditions. Fig. 7 shows blue gradients computed in the above color spaces. The predicted hue is constant along these gradients (according to the hue predictors of the color spaces). The perceived hue, however, changes from blue to purple along the CIELAB and LAB2000 gradients. A slight hue shift in the same direction is also visible for the HSL and YCbCr gradients on a calibrated display. The hue linear IPT and LAB2000HL spaces illustrate the importance of the Hung and Berns hue correction for blue colors. The perceived hue is maintained along these gradients. Fig. 8 shows colors selected from the gradients in Fig. 7. Two color pairs are shown for each color space. The Euclidean distances between the upper and lower pairs are equal in the respective space. All color pairs with this particular Euclidean distance were collected from the gradients, and those with the maximum (upper pair) and minimum (lower pair) CIEDE2000 difference are shown. The perceived distance between the upper and lower pairs must be similar if the respective color space is perceptually uniform. However, the perceived difference of the upper colors
TABLE II STRESS VALUES QUANTIFYING THE DISAGREEMENT BETWEEN EUCLIDEAN DISTANCES IN TEN COLOR SPACES AND PREDICTIONS OF PERCEIVED COLOR DIFFERENCES COMPUTED BY THREE COLOR-DIFFERENCE FORMULAS. UNDERLINED VALUES INDICATE THAT THE COLOR SPACE WAS CREATED USING THE CORRESPONDING COLOR-DIFFERENCE FORMULA AS A REFERENCE OF PERCEPTUAL UNIFORMITY. NOTE THAT THE LABCMC, LAB94, AND LAB2000 SPACES [33] ARE NOT HUE LINEAR. ALL STRESS VALUES WERE COMPUTED ON TWO MILLION RANDOM COLOR PAIRS FROM THE SRGB GAMUT
is significantly smaller for HSL, YCbCr, CIELAB, and IPT [see Fig. 8(a)–(d)]. This illustrates the perceptual nonuniformity of these spaces. Although the perceived color differences depend on the viewing conditions, these discrepancies should be noticeable in all cases. For LAB2000 and LAB2000HL, the perceived differences between the upper and the lower colors are similar [see Fig. 8(e) and (f)]. This is particularly evident when viewing the colors on a calibrated display. Table II provides a comparison of the perceptual uniformity of ten color spaces with respect to CMC, CIE94, and
Fig. 8. Blue color pairs selected in six color spaces. The Euclidean distance between the upper two colors is equal to the distance between the lower two colors in the respective color space. However, the perceived differences between the upper and lower pairs may differ depending on the perceptual uniformity of the space. CIEDE2000 color-difference predictions are shown below the figures. (a) HSL. (b) YCbCr. (c) CIELAB. (d) IPT. (e) LAB2000. (f) LAB2000HL.
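The STRESS values in Table II follow the scale-invariant index of García et al. [45], which first fits an optimal scaling factor between the two distance sets and then measures the remaining residual. A compact sketch:

```python
import numpy as np

def stress(de, dv):
    """Scale-invariant STRESS index between color-difference predictions de
    and Euclidean distances dv; 0 means perfect agreement up to scale."""
    de, dv = np.asarray(de, float), np.asarray(dv, float)
    f = np.sum(de * dv) / np.sum(dv ** 2)   # optimal scaling factor
    return 100.0 * np.sqrt(np.sum((de - f * dv) ** 2)
                           / np.sum((f * dv) ** 2))

print(stress([1., 2., 3.], [2., 4., 6.]))   # proportional distances -> 0.0
print(stress([1., 2., 3.], [3., 2., 1.]))   # strong disagreement -> large
```

The scale invariance matters here because a color space may be a uniformly stretched copy of a perceptually uniform space and should still score zero.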
CIEDE2000. The scale-invariant STRESS index [45] is used to quantify the disagreement between Euclidean distances in the evaluated color spaces and the corresponding color-difference predictions. A STRESS value of zero indicates no disagreement; higher values correspond to higher disagreements. Please note that the color-difference formulas require a D65/10 white point, whereas the XYZ-to-sRGB conversions assume a D65/2 white point, which introduces some error into our calculations. It should be mentioned that the CIEDE2000 predictions do not perfectly agree with perceived color differences: the STRESS value between the highly reliable RIT-DuPont data set [18] and the corresponding CIEDE2000 predictions is 19.47 [46]. In addition, the color-difference predictions are only valid for a specific set of viewing parameters. The parametric factors $k_L$, $k_C$, and $k_H$ are therefore included in the formula to allow an adjustment to the viewing conditions [13], [47]. Despite these limitations, the formula provides a good estimate of perceived color differences: a high STRESS value between CIEDE2000 and Euclidean distances in a color space indicates that the space lacks perceptual uniformity. It is evident that Euclidean distances in LAB2000 agree best with CIEDE2000 color differences. This is not surprising since the space has been designed using CIEDE2000 as a reference of perceptual uniformity. Although the STRESS value for the hue linear LAB2000HL space is significantly higher than the LAB2000 STRESS value (10.64 versus 3.83), it is still much lower than those of the other spaces. A comparison of the perceptual uniformity is also provided in Fig. 9, which shows CIEDE2000 isodistance contours in four of the color spaces.

V. EXAMPLES

To illustrate the performance of the proposed CIEDE2000-based color space, we provide two examples of perception-based image processing: gamut mapping and image segmentation. We chose simple methods that rely on their working color space's perceptual uniformity and on the separability of perceived lightness, chroma, and hue without cross contamination. The parameters of these methods were not adjusted to the evaluated color spaces. The aim is to demonstrate that perception-based image processing algorithms can be improved by changing their working color spaces. Visual experiments were conducted to judge the results.

A. Gamut Mapping

Our gamut-mapping method performs lightness- and hue-preserving chroma compression. The chroma of colors in the upper 50% chroma range of the reproduction gamut
Fig. 9. CIEDE2000 isodistance contours in (a) CIELAB, (b) IPT, (c) LAB2000, and (d) LAB2000HL. Each point on a contour has the same CIEDE2000 distance to the center of the contour. In a perceptually uniform color space, the contours are equally sized circles (assuming that CIEDE2000 accurately reflects perceived color differences). Projections of the sRGB gamut in the respective color spaces are shown in gray.
is linearly compressed. Colors with lower chroma (closer to the gray axis) are not shifted. Color spaces typically used for gamut mapping include hue-corrected CIELAB and IPT. To illustrate the cross contamination of chroma and hue, we used the uncorrected CIELAB space in our example. Because it is difficult to reproduce colors in printed journals [48], the examples are most accurately reflected in the electronic version. We converted our original image to sRGB because sRGB colors can be displayed on most monitors. The display should be calibrated accordingly when viewing the image. The reproduction gamut was computed from the FOGRA27 ICC profile [49]. The gamut-mapped images were calculated in three steps: 1) the original sRGB image was transformed to one of the four evaluated color spaces; 2) the gamut mapping was performed in this color space; and 3) the image was transformed back to sRGB for display.
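The chroma compression itself can be sketched as a piecewise-linear function of chroma: values below 50% of the reproduction-gamut chroma are left untouched, and the upper range is compressed linearly onto the remaining headroom (a sketch; the parameter names are hypothetical):

```python
def compress_chroma(c, c_src_max, c_dst_max):
    """Lightness- and hue-preserving chroma compression: chroma below 50%
    of the reproduction-gamut maximum c_dst_max is left unchanged; the
    upper range is mapped linearly so that the source maximum c_src_max
    lands exactly on the reproduction maximum."""
    knee = 0.5 * c_dst_max
    if c <= knee:
        return c
    return knee + (c - knee) * (c_dst_max - knee) / (c_src_max - knee)

# Source gamut reaches chroma 100, reproduction gamut only 80:
print(compress_chroma(10.0, 100.0, 80.0))   # -> 10.0 (low chroma untouched)
print(compress_chroma(100.0, 100.0, 80.0))  # -> 80.0 (source max on boundary)
```

In practice the two maxima depend on the lightness and hue of the color being mapped, which is where the working color space's hue linearity matters.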
Fig. 10. Results of gamut mapping (lightness- and hue-preserving chroma compression) in four color spaces. The images are split, showing the original image on the left and the gamut-mapped image on the right. Original gamut: sRGB. Reproduction gamut: FOGRA27. The original image is part of Mark Fairchild's HDR Photographic Survey [50], [51]. (a) CIELAB. (b) IPT. (c) LAB2000. (d) LAB2000HL.
The test image is part of Mark Fairchild's HDR Photographic Survey [50], [51]. We chose a predominantly blue image because the cross contamination of chroma and hue is particularly apparent in this CIELAB region, although CIELAB has cross-contamination problems in other regions as well (e.g., the red region). We also increased the chroma of the sky to emphasize changes in hue. Fig. 10 shows the gamut-mapping results. The images are split, showing the original image on the left and the gamut-mapped image on the right. As expected, the gamut mapping in CIELAB introduces a hue shift: blue colors become purple, although the CIELAB hue is maintained [see Fig. 10(a)]. This is particularly evident in the sky. A slight hue shift in the same direction is also noticeable in the LAB2000 results [see Fig. 10(c)]. In contrast, the perceived hue remains constant when the IPT color space is used [see Fig. 10(b)]. There is, however, some cross contamination between lightness and the chromatic components. The LAB2000HL color space does not exhibit such cross contamination in this example [see Fig. 10(d)]. To verify these judgments, we conducted a psychophysical experiment on an EIZO ColorEdge CG301W liquid crystal display. The display's white point was set to D65/2 at a luminance of 120 cd/m². It was characterized using a method proposed by
Day et al. [52]. Our observer panel comprised 13 unbiased observers with normal color vision according to the Ishihara and Farnsworth D-15 tests. Each observer performed 24 image comparisons. For each comparison, three images were shown on the display: the original image in the center of the screen and two gamut-mapped images to its left and its right. The observers were instructed to select the gamut-mapped image that was "more similar to the original." Each of the four gamut-mapped images was compared with each of the other images once (six comparisons). The image positions (left/right) were then swapped, and the images were compared again (another six comparisons). The entire experiment was repeated once, resulting in a total of 24 decisions per observer. Each pair of gamut-mapped images was compared 52 times (4 comparisons × 13 observers). A score was determined for each space depending on how often it was preferred to another space in a comparison. In the following, we consider the outcomes of the comparisons between the LAB2000HL space and the three other investigated spaces (CIELAB, LAB2000, and IPT).
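The significance of such paired-comparison counts can be checked against a fair-coin null hypothesis with the cumulative binomial distribution, as done below for the IPT comparison (37 of 52 decisions):

```python
from math import comb

def binom_tail(n, k):
    """P(X >= k) for X ~ Binomial(n, 1/2): the chance that one space is
    preferred k times or more in n comparisons if both were equally good."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

p = binom_tail(52, 37)   # 37 of 52 decisions in favor of one space
```

The paper reports this tail probability as 0.16% for the LAB2000HL-versus-IPT comparison.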
Fig. 11. All five color patches of the test image in (a) contain text in a color similar to the background color. The CIEDE2000 distance between the gray text and its background (first patch) is the largest of the five patches. The segmentation results for different color spaces are shown in (b)–(g). The algorithm was configured to partition the image into six segments with maximum Euclidean distance in its working color space. Different shades of gray were randomly assigned to the computed segments. (a) Test image. (b) HSL. (c) YCbCr. (d) CIELAB. (e) IPT. (f) LAB2000. (g) LAB2000HL.
Taking IPT as an example, 37 of the 52 decisions (71%) were in favor of LAB2000HL. Assuming a 50% chance of being selected in each comparison, we can compute the probability of a space being favored 37 times (or more often) using the cumulative binomial distribution. The resulting probability is only 0.16%. This indicates that, in our experiment, gamut mapping with LAB2000HL as a working space results in an image that is more similar to the original.

B. Image Segmentation

Image segmentation was performed using k-means clustering with the Euclidean metric as a distance measure. The segmentation method is based on an example from MATLAB's image processing toolbox [53]. The original example uses CIELAB for perception-based segmentation; we tested HSL, YCbCr, CIELAB, IPT, LAB2000, and LAB2000HL. Both HSL and YCbCr are derived from the sRGB color space. In contrast to the original example, we applied k-means clustering to all three dimensions of the color space simultaneously. To create an example for uncalibrated displays, we used an image with ten different colors: five patches containing text in a color similar to the respective background color. All colors are inside the sRGB gamut. The difference between the gray text and its background is most noticeable [see Fig. 11(a)]. The algorithm was configured to partition the image into six segments. The perceived color differences between the texts and backgrounds of the yellow, green, blue, and magenta patches are quite small. Therefore, most human observers would assign these colors to the same segment and choose the gray patch and the gray text as the fifth and sixth segments. Ideally, a perception-based segmentation algorithm computes the same segmentation. Since the clustering method uses the Euclidean distance as a measure of color difference, the only connection to human perception is the underlying color space and its degree of perceptual uniformity.
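A plain k-means clusterer with the Euclidean metric, applied to all three color dimensions at once, can be sketched as follows (a deterministic toy sketch, not MATLAB's toolbox implementation; the initialization and data are illustrative):

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Plain k-means with the Euclidean metric on all three color
    dimensions simultaneously. Deterministic 'first k points'
    initialization keeps this sketch reproducible."""
    centers = points[:k].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

# Two well-separated clusters of "colors" must land in different segments.
pts = np.array([[0.1 * i] * 3 for i in range(5)]
               + [[50.0 + 0.1 * i] * 3 for i in range(5)])
labels = kmeans(pts, 2)
```

Because the clusterer only sees Euclidean distances, whether "well separated" matches *perceived* separation is decided entirely by the working color space.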
The calculation of the resulting images consisted of two steps: 1) the image was transformed into the investigated color space;
and 2) the segmentation was performed in this color space. Different shades of gray were assigned to the segments. The results are shown in Fig. 11(b)–(g). They demonstrate that HSL, YCbCr, CIELAB, and IPT lack perceptual uniformity: colors that were almost indistinguishable were assigned to different segments, whereas clearly distinguishable colors were mapped to the same segment. If a more perceptually uniform color space is used (LAB2000 or LAB2000HL), the segmentation results reflect the expected decisions of most human observers. To verify this assumption, we conducted a psychophysical experiment using the same setup and observer panel as for the previously described gamut-mapping experiment. The image in Fig. 11(a) was presented to the observers, and they were instructed to choose the text they could "distinguish best from the background." All 13 observers chose the gray text.

VI. CONCLUSION

We performed an analysis of various image processing problems related to human color perception. Our goal was to identify color-space properties that are commonly required by color image processing algorithms. Based on this analysis, we proposed a method to compute transformations from an initial color space to spaces with low cross contamination between color attributes (lightness, chroma, and hue) and maximum perceptual uniformity induced by color-difference formulas. We applied the method to CIELAB using the Hung and Berns data as a reference of constant perceived hue and the CMC, CIE94, and CIEDE2000 formulas as measures of perceptual uniformity. The perceptual uniformity of the resulting spaces was evaluated using the above color-difference formulas. Comparisons with other color spaces show that the new spaces are more perceptually uniform than HSL, YCbCr, CIELAB, and IPT. We tested the CIEDE2000-based LAB2000HL space in two typical image processing scenarios. The hue linearity of the space proved beneficial in a gamut-mapping example. Hue
shifts of blue colors toward purple were not observed (no cross contamination between chroma and hue). The high perceptual uniformity of the space was utilized in an image-segmentation example: the algorithm computed a result that agrees with human perception when using LAB2000HL as a working color space. Color-space transformations from CIELAB to LAB2000HL as well as to the LABCMC-HL (based on CMC) and LAB94-HL (based on CIE94) spaces are provided as MATLAB codes on our website [44].

APPENDIX

Definition 1 (Perceptual Uniformity): A color space $\mathcal{S}$ is perceptually uniform if
$$\|\mathbf{x} - \mathbf{y}\|_2 = k\,\Delta E_p(\mathbf{x}, \mathbf{y}) \quad \text{for all } \mathbf{x}, \mathbf{y} \in \mathcal{S}, \qquad (27)$$
where $\|\cdot\|_2$ is the Euclidean distance, $k$ is a positive constant, and $\Delta E_p(\mathbf{x}, \mathbf{y})$ denotes the experimentally determined color difference between $\mathbf{x}$ and $\mathbf{y}$ (perceived color difference).

Definition 2 (Predictors of Color Attributes): Each point in the color space represents perceived color attributes, i.e., lightness $L$, chroma $C$, and hue $H$, with respect to the viewing conditions the space is defined for. An attribute predictor is a function that calculates estimates of a specific perceived color attribute. In the following, the predictors for the perceived color attributes $L$, $C$, and $H$ are denoted as $P_L$, $P_C$, and $P_H$. For example, the CIELAB attribute predictors are typically defined as follows: $P_L = L^*$, $P_C = C^*_{ab} = \sqrt{a^{*2} + b^{*2}}$, and $P_H = h_{ab} = \operatorname{atan2}(b^*, a^*)$.

Definition 3 (Absence of Cross Contamination): The color space is free of cross contamination in the lightness, chroma, and hue components with respect to the predictors $P_L$, $P_C$, and $P_H$ if, for all pairs $\mathbf{x} \neq \mathbf{y}$ in the space, the following implications apply:
$$\begin{aligned}
P_C(\mathbf{x}) = P_C(\mathbf{y}) \wedge P_H(\mathbf{x}) = P_H(\mathbf{y}) &\Rightarrow C(\mathbf{x}) = C(\mathbf{y}) \wedge H(\mathbf{x}) = H(\mathbf{y})\\
P_L(\mathbf{x}) = P_L(\mathbf{y}) \wedge P_H(\mathbf{x}) = P_H(\mathbf{y}) &\Rightarrow L(\mathbf{x}) = L(\mathbf{y}) \wedge H(\mathbf{x}) = H(\mathbf{y})\\
P_L(\mathbf{x}) = P_L(\mathbf{y}) \wedge P_C(\mathbf{x}) = P_C(\mathbf{y}) &\Rightarrow L(\mathbf{x}) = L(\mathbf{y}) \wedge C(\mathbf{x}) = C(\mathbf{y})
\end{aligned} \qquad (28)$$
This implies that, if we change a color such that only one predicted attribute is affected, the other perceived color attributes remain unchanged.

ACKNOWLEDGMENT

The authors thank the editors for their corrections that helped to improve this paper.

REFERENCES

[1] R. G. Kuehni, Color Space and Its Divisions, 1st ed. New York: Wiley, 2003. [2] P. G. J. Barten, Contrast Sensitivity of the Human Eye and Its Effects on Image Quality. Bellingham, WA: SPIE Press, 1999. [3] K. T. Mullen, “The contrast sensitivity of human colour vision to red-green and blue-yellow chromatic gratings,” J. Physiol., vol. 359, no. 1, pp. 381–400, Feb. 1985.
[4] J. Morovič, Color Gamut Mapping. Hoboken, NJ: Wiley, 2008. [5] L. Lucchese and S. K. Mitra, “Colour image segmentation: A state-of-the-art survey,” Proc. Indian Nat. Sci. Acad. A, vol. 67, no. 2, pp. 207–222, 2001. [6] L. Shafarenko, H. Petrou, and J. Kittler, “Histogram-based segmentation in a perceptually uniform color space,” IEEE Trans. Image Process., vol. 7, no. 9, pp. 1354–1358, Sep. 1998. [7] G. M. Johnson and M. D. Fairchild, “The effect of opponent noise on image quality,” in Proc. SPIE/IST Electron. Imag. Conf., San Jose, CA, 2005, vol. 5668, pp. 82–89. [8] K. Huang, Z. Wu, G. S. K. Fung, and F. H. Y. Chan, “Color image denoising with wavelet thresholding based on human visual system model,” Signal Process., Image Commun., vol. 20, no. 2, pp. 115–127, Feb. 2005. [9] Y. Chen, R. S. Berns, and L. A. Taplin, “Extending printing color gamut by optimizing the spectral reflectance of inks,” in Proc. IS&T/SID 12th Color Imag. Conf., Scottsdale, AZ, 2004, pp. 163–169. [10] P. Urban and R.-R. Grigat, “Metamer density estimated color correction,” J. Signal, Image Vid. Process., vol. 3, no. 2, pp. 171–182, Jun. 2009. [11] Method for Calculation of Small Colour Differences, British Standards Institution, Tech. Rep., BS 6923, 1988. [12] Industrial Colour-Difference Evaluation Central Bur. CIE. Vienna, Austria, 1995, CIE Publication No. 116, Tech. Rep. [13] Improvement to Industrial Colour-Difference Evaluation, Central Bur. CIE. Vienna, Austria, 2001, CIE Publication No. 142, Tech. Rep. [14] E. Hering, Outlines of a Theory of the Light Sense. Cambridge, MA: Harvard Univ. Press, 1964. [15] F. Ebner and M. Fairchild, “Development and testing of a color space (IPT) with improved hue uniformity,” in Proc. IS&T/SID 6th Color Imag. Conf., Scottsdale, AZ, 1998, pp. 8–13. [16] G. M. Johnson, X. Song, E. D. Montag, and M. D. Fairchild, “Derivation of a color space for image color difference measurement,” Color Res. Appl., vol. 35, no. 6, pp. 387–400, Dec. 2010. [17] R. G.
Kuehni, “Hue uniformity and the CIELAB space and color difference formula,” Color Res. Appl., vol. 23, no. 5, pp. 314–322, Oct. 1998. [18] R. S. Berns, D. H. Alman, L. Reniff, G. D. Snyder, and M. R. Balonon-Rosen, “Visual determination of suprathreshold color-difference tolerances using probit analysis,” Color Res. Appl., vol. 16, no. 5, pp. 297–316, Oct. 1991. [19] M. R. Luo and B. Rigg, “Chromaticity-discrimination ellipses for surface colours,” Color Res. Appl., vol. 11, no. 1, pp. 25–42, Spring 1986. [20] X. Song, G. M. Johnson, and M. D. Fairchild, “Minimizing the perception of chromatic noise in digital images,” in Proc. IS&T/SID 12th Color Imag. Conf., Scottsdale, AZ, 2004, pp. 340–346. [21] S. Y. Zhu, M. R. Luo, and G. H. Cui, “New uniform color spaces,” in Proc. IS&T/SID 10th Color Imag. Conf., Scottsdale, AZ, 2002, pp. 61–65. [22] M. D. Fairchild, Color Appearance Models, 2nd ed. Reading, MA: Wiley, 2006. [23] A Colour Appearance Model for Colour Management Systems: CIECAM02, Central Bur. CIE. Vienna, Austria, 2004, CIE Publication No. 159, Tech. Rep. [24] N. Moroney, M. D. Fairchild, R. W. Hunt, C. Li, M. R. Luo, and T. Newman, “The CIECAM02 color appearance model,” in Proc. IS&T/SID 10th Color Imag. Conf., Scottsdale, AZ, 2002, pp. 23–27. [25] J. H. Xin, H. L. Shen, and C. C. Lam, “Investigation of texture effect on visual colour difference evaluation,” Color Res. Appl., vol. 30, no. 5, pp. 341–347, Oct. 2005. [26] X. Zhang and B. A. Wandell, “A spatial extension of CIELAB for digital color image reproduction,” in Proc. Soc. Inf. Disp. Symp. Tech. Dig., 1996, vol. 27, pp. 731–734. [27] M. D. Fairchild and G. M. Johnson, “The iCAM framework for image appearance, image differences, and image quality,” J. Electron. Imag., vol. 13, no. 1, pp. 126–138, Jan. 2004. [28] J. Kuang, G. M. Johnson, and M. D. Fairchild, “iCAM06: A refined image appearance model for HDR image rendering,” J. Vis. Commun. Image Represent., vol. 18, no. 5, pp. 406–414, 2007. [29] G.
Johnson and M. Fairchild, “A top down description of S-CIELAB and CIEDE2000,” Color Res. Appl., vol. 28, no. 6, pp. 425–435, Dec. 2003. [30] D. B. Judd, “Ideal color space: Curvature of color space and its implications for industrial color tolerances,” Palette, vol. 29, pp. 25–31, 1968. [31] J. J. Stoker, Differential Geometry. New York: Wiley, 1969.
[32] H. W. Guggenheimer, Differential Geometry. New York: McGraw-Hill, 1963. [33] P. Urban, D. Schleicher, M. R. Rosen, and R. S. Berns, “Embedding non-Euclidean color spaces into Euclidean color spaces with minimal isometric disagreement,” J. Opt. Soc. Amer. A, Opt. Image Sci., vol. 24, no. 6, pp. 1516–1528, Jun. 2007. [34] D. L. MacAdam, “Nonlinear relations of psychometric scale values to chromaticity differences,” J. Opt. Soc. Amer., vol. 53, no. 6, pp. 754–757, Jun. 1963. [35] D. B. Judd, “Ideal color space: The super-importance of hue differences and its bearing on the geometry of color space,” Palette, vol. 30, pp. 21–28, 1968. [36] D. H. Kim and J. H. Nobbs, “New weighting functions for the weighted CIELAB colour difference formula,” in Proc. AIC, 1997, pp. 446–449. [37] K. Witt, “Geometric relations between scales of small colour differences,” Color Res. Appl., vol. 24, no. 2, pp. 78–92, Apr. 1999. [38] I. Lissner and P. Urban, “Improving color-difference formulas using visual data,” in Proc. 5th Eur. CGIV, Joensuu, Finland, 2010, pp. 483–488. [39] I. Lissner and P. Urban, “Upgrading color-difference formulas,” J. Opt. Soc. Amer. A, Opt. Image Sci., vol. 27, no. 7, pp. 1620–1629, Jul. 2010. [40] I. Lissner and P. Urban, “How perceptually uniform can a hue linear color space be?,” in Proc. 18th Color Imag. Conf., San Antonio, TX, 2010, pp. 97–102. [41] International Color Consortium (ICC), File Format for Color Profiles (Version 4.0.0), 2002. [Online]. Available: http://www.color.org. [42] P. Hung and R. S. Berns, “Determination of constant hue loci for a CRT gamut and their predictions using color appearance spaces,” Color Res. Appl., vol. 20, no. 5, pp. 285–295, Oct. 1995. [43] Parametric Effects in Colour-Difference Evaluation, Central Bur. CIE. Vienna, Austria, 1993, CIE Publication No. 101, Tech. Rep. [44] MATLAB Color Space Transformations. [Online]. Available: http://www.idd.tu-darmstadt.de/color/papers. [45] P. A. García, R. Huertas, M.
Melgosa, and G. Cui, “Measurement of the relationship between perceived and computed color differences,” J. Opt. Soc. Amer. A, Opt. Image Sci. Vis., vol. 24, no. 7, pp. 1823–1829, Jul. 2007. [46] M. Melgosa, R. Huertas, and R. S. Berns, “Performance of recent advanced color-difference formulas using the standardized residual sum of squares index,” J. Opt. Soc. Amer. A, Opt. Image Sci. Vis., vol. 25, no. 7, pp. 1828–1834, Jul. 2008. [47] M. R. Luo, G. Cui, and B. Rigg, “The development of the CIE 2000 colour-difference formula: CIEDE2000,” Color Res. Appl., vol. 26, no. 5, pp. 340–350, Oct. 2001.
[48] M. J. Vrhel and H. J. Trussell, “Problems in publishing accurate color in IEEE journals,” IEEE Trans. Image Process., vol. 11, no. 4, pp. 373–376, Apr. 2002. [49] FOGRA27: Characterization data for commercial offset printing (ISO 12647-2:2004) on gloss or matt coated paper (PT 1/2) using a screening corresponding to 60/cm, 2003. [50] M. D. Fairchild, The HDR photographic survey. [Online]. Available: http://www.cis.rit.edu/fairchild/HDR.html. [51] M. D. Fairchild, “The HDR photographic survey,” in Proc. IS&T/SID 15th Color Imag. Conf., Albuquerque, NM, 2007, pp. 233–238. [52] E. A. Day, L. Taplin, and R. S. Berns, “Colorimetric characterization of a computer-controlled liquid crystal display,” Color Res. Appl., vol. 29, no. 5, pp. 365–373, Oct. 2004. [53] “Matlab, The Language of Technical Computing, Language Reference Manual,” MathWorks, Natick, MA, 2010, 9th ed.
Ingmar Lissner received the Engineering degree in computer science and engineering from the Hamburg University of Technology, Hamburg, Germany, in 2009. He is currently working toward the Ph.D. degree with the Institute of Printing Science and Technology, Technische Universität Darmstadt, Darmstadt, Germany. He is also a Research Assistant with the Institute of Printing Science and Technology, Technische Universität Darmstadt. His research interests include color perception, uniform color spaces, and image-difference measures for color images.
Philipp Urban received the M.S. degree in mathematics from the University of Hamburg, Hamburg, Germany, and the Ph.D. degree from the Hamburg University of Technology, Hamburg. From 2006 to 2008, he was a Visiting Scientist with the Munsell Color Science Laboratory, Center for Imaging Science, Rochester Institute of Technology, Rochester, NY. Since 2009, he has been the Head of the Color Research Group, Institute of Printing Science and Technology, Technische Universität Darmstadt, Darmstadt, Germany. His research interests include spectral-based acquisition, processing, and reproduction of color images, considering the limited metameric, spectral gamut, and low dynamic range of output devices.