The histogram is of essential importance for characterizing the global appearance of a given image, such as its brightness and contrast. Specifically, the histogram h(k) is the probability of an arbitrary pixel taking the gray level k, which can be approximated as:
\[ h(k) = \frac{n_k}{MN}, \qquad k = 0, 1, \ldots, L-1 \tag{15} \]

\[ H(k) = \sum_{i=0}^{k} h(i), \qquad k = 0, 1, \ldots, L-1 \tag{16} \]

where n_k is the number of pixels taking the gray level k, and H(k) is the cumulative histogram.
Here is the code for finding the histogram of a given image img of L = 256 gray levels (an 8-bit image) and of size M × N:
for (k=0; k < glevel; k++)          /* glevel = 256; h, H are floating-point arrays */
    H[k]=h[k]=0;
for (i=0; i<M; i++)
    for (j=0; j<N; j++) {
        k=img[i][j];
        h[k]=h[k]+1;                /* count pixels of gray level k */
    }
H[0]=h[0]=h[0]/M/N;                 /* normalize counts to densities */
for (k=1; k < glevel; k++) {
    h[k]=h[k]/M/N;                  /* h[k]: density function */
    H[k]=H[k-1]+h[k];               /* H[k]: cumulative histogram */
}
Note that, as a density function, the histogram satisfies:
\[ \sum_{k=0}^{L-1} h(k) = H(L-1) = 1 \tag{17} \]
For a gray-level image to be properly displayed on screen, its pixel values have to be within a proper range. For an 8-bit digital image there are 256 gray levels (from 0 to 255). However, after applying certain processing operations to the input image, the gray levels of the resulting image are no longer necessarily within the proper range for display. In this case the image needs to be normalized or rescaled:
\[ \bar{f}(i,j) = \frac{255\,\bigl[\,f(i,j) - f_{\min}\,\bigr]}{f_{\max} - f_{\min}} \tag{18} \]
min=LARGE; max=-min;
for (i=0; i<M; i++)
    for (j=0; j<N; j++) {
        if (img[i][j] < min) min=img[i][j];   /* find lowest pixel value  */
        if (img[i][j] > max) max=img[i][j];   /* find highest pixel value */
    }
scale=255.0/(max-min);
for (i=0; i<M; i++)
    for (j=0; j<N; j++)
        img[i][j]=scale*(img[i][j]-min);      /* map [min, max] to [0, 255] */

where LARGE is some large number (e.g., FLT_MAX in <float.h>, the largest floating-point number representable in the computer) known to be greater than the highest pixel value.