Aims:  
   

Document image binarization is usually performed in the preprocessing stage of various document image processing applications such as optical character recognition (OCR) and document image retrieval. It converts a gray-scale document image into a binary document image and thereby facilitates ensuing tasks such as document skew estimation and document layout analysis. As more and more text documents are scanned, fast and accurate document image binarization is becoming increasingly important.

Though document image binarization has been studied for many years, the thresholding of degraded document images remains an unsolved problem. This can be explained by the fact that modeling the document foreground and background is very difficult due to various types of document degradation such as uneven illumination, image contrast variation, bleeding-through, and smear. We aim to develop robust and efficient document image binarization techniques that produce good results for badly degraded document images.


Figure 1. Two badly degraded document images

 
 
  Document Image Binarization Using Background Estimation[1]:  
   

This algorithm achieved the top performance among the 43 algorithms submitted to DIBCO 2009. The algorithm makes use of the document background and the text stroke edge information. In particular, it first estimates a document background surface through an iterative polynomial smoothing procedure. The text stroke edges are then detected by combining the local image variation and the estimated document background surface. After that, the document text is segmented based on local thresholds estimated from the detected stroke edge pixels, as sketched below. Finally, a series of post-processing operations is performed to further improve the binarization performance.
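As an illustration of the segmentation step, here is a minimal Python sketch that estimates a local threshold from detected stroke-edge pixels; the window size, the minimum edge-pixel count, and the mean-based threshold are illustrative assumptions, not the exact settings of the paper.

    import numpy as np

    def segment_from_stroke_edges(gray, edge_mask, win=15, n_min=10):
        """Classify each pixel using a threshold estimated from the
        stroke-edge pixels inside a local window (illustrative parameters)."""
        h, w = gray.shape
        half = win // 2
        binary = np.ones((h, w), dtype=np.uint8)      # 1 = background
        for y in range(h):
            for x in range(w):
                y0, y1 = max(0, y - half), min(h, y + half + 1)
                x0, x1 = max(0, x - half), min(w, x + half + 1)
                window = gray[y0:y1, x0:x1]
                edge_vals = window[edge_mask[y0:y1, x0:x1] > 0]
                # Label a pixel as text only if enough stroke-edge pixels fall
                # in its window and it is darker than their mean intensity.
                if edge_vals.size >= n_min and gray[y, x] <= edge_vals.mean():
                    binary[y, x] = 0                  # 0 = text
        return binary

Here edge_mask stands for the stroke-edge map obtained in the previous step; any binary edge map of the same size can be plugged in for experimentation.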

One characteristic of our proposed method is that it first estimates a document background surface through a one-dimensional iterative polynomial smoothing procedure. In particular, the estimated document background surface helps to compensate for certain types of document degradation, such as uneven illumination, that frequently exist in document images but that global thresholding cannot handle properly because of the lack of a bimodal histogram pattern. At the same time, the proposed method makes use of the text stroke edges to estimate the local threshold. The use of the text stroke edges overcomes the limitations of many existing adaptive thresholding methods, such as window-based methods that rely heavily on the window size, as well as many other complex document thresholding methods that estimate the local threshold by combining different types of image information.
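The following is a minimal sketch of one plausible reading of the row-wise iterative polynomial smoothing: darker pixels (likely text strokes) are repeatedly lifted to the current fit so that the polynomial converges towards the bright background. The polynomial degree and the iteration count are assumptions chosen for illustration.

    import numpy as np

    def smooth_row(row, degree=3, n_iter=10):
        """Iteratively fit a polynomial to one image row so that it follows
        the bright background rather than the dark text strokes."""
        x = np.arange(row.size)
        values = row.astype(float).copy()
        for _ in range(n_iter):
            fitted = np.polyval(np.polyfit(x, values, degree), x)
            # Pixels darker than the current fit are treated as text and
            # lifted to the fitted value before the next round of fitting.
            values = np.maximum(values, fitted)
        return np.polyval(np.polyfit(x, values, degree), x)

    def estimate_background(gray):
        """Stack the row-wise fits into a full background surface."""
        return np.vstack([smooth_row(r) for r in gray])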

The paper has been submitted to IJDAR.


Figure 2. Document background estimation through iterative polynomial smoothing. (a) The intensity of one image row (blue graph) and the fitted initial smoothing polynomial (black bold graph); (b) the final smoothing polynomial (black bold graph) after multiple rounds of smoothing of the image row.

 
 
  Document Image Binarization Using Local Maximum and Minimum[2]:  
   

The technique makes use of the image contrast that is defined by the local image maximum and minimum. Compared with the image gradient, the image contrast computed from the local maximum and minimum has the nice property of being more tolerant to uneven illumination and other types of document degradation such as smear. The technique has been tested on the dataset used in the recent Document Image Binarization Contest (DIBCO 2009). Experiments show its superior performance.

Compared with the image gradient, such image contrast is more capable of detecting the high-contrast image pixels (lying around the text stroke boundary) in historical documents that often suffer from different types of document degradation. Compared with Lu & Tan's method used in the DIBCO contest, this method handles document images with complex background variation better. Given a historical document image, the technique first determines a contrast image based on the local maximum and minimum. The high-contrast image pixels around the text stroke boundary are then detected through global thresholding of the contrast image. Lastly, the historical document image is binarized based on local thresholds that are estimated from the detected high-contrast image pixels. Compared with previous contrast-based methods, this method uses the image contrast to identify the text stroke boundary, which produces more accurate binarization results.
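The contrast image and the detection of high-contrast pixels can be sketched as follows; the normalised (max - min) / (max + min) form, the 3x3 window, and the use of Otsu's method for the global thresholding step are assumptions made for illustration.

    import numpy as np
    from scipy.ndimage import maximum_filter, minimum_filter
    from skimage.filters import threshold_otsu

    def contrast_image(gray, win=3, eps=1e-8):
        """Image contrast built from the local maximum and minimum."""
        f = gray.astype(float)
        fmax = maximum_filter(f, size=win)
        fmin = minimum_filter(f, size=win)
        # Normalising by (max + min) suppresses the effect of uneven
        # illumination, unlike the plain difference used by gradients.
        return (fmax - fmin) / (fmax + fmin + eps)

    def high_contrast_pixels(gray):
        """Detect pixels around the text stroke boundary by globally
        thresholding the contrast image."""
        c = contrast_image(gray)
        return c > threshold_otsu(c)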

The paper has appeared in DAS 2010.


Figure 3. (a) The traditional image gradient obtained using Canny's edge detector; (b) the image contrast obtained using the local maximum and minimum; (c) one column of the image gradient in (a) (shown as a vertical white line); (d) the same column of the contrast image in (b).

 
 
  Binarization Results on the DIBCO 2009 dataset:  
   

Below are some binarization results on the test images of the DIBCO 2009 dataset. The full test images and ground-truth images of the DIBCO 2009 dataset can be downloaded here. All binarization results of our background estimation method and our local maximum and minimum method can be downloaded here.


Figure 4. (a) Input images; (b) binary images produced by the Background Estimation method; (c) binary images produced by the Local Maximum and Minimum method.

Table 1 compares the performance of the two methods on the DIBCO 2009 dataset using four metrics: F-Measure, PSNR, NRM, and MPM (a detailed description of the four metrics can be found here). A small sketch of how the F-Measure and PSNR are computed is given after the table.

Table 1. Evaluation results of the Background Estimation and Local Maximum Minimum methods

Method                  F-Measure(%)   PSNR    NRM(*10^-2)   MPM(*10^-3)
Background Estimation   88.53          19.42   5.11          0.32
Local Maximum Minimum   89.93          19.94   6.69          0.30
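For reference, here is a minimal sketch of how the F-Measure and PSNR can be computed for a binary result against its ground truth, assuming the usual DIBCO conventions (text pixels = 0, peak value C = 1); NRM and MPM are omitted.

    import numpy as np

    def f_measure(result, gt):
        """F-Measure (%) between a binary result and the ground truth."""
        res_text, gt_text = (result == 0), (gt == 0)
        tp = np.logical_and(res_text, gt_text).sum()
        precision = tp / max(res_text.sum(), 1)
        recall = tp / max(gt_text.sum(), 1)
        return 200.0 * precision * recall / max(precision + recall, 1e-8)

    def psnr(result, gt):
        """PSNR with binary images mapped to {0, 1}; the MSE reduces to the
        fraction of mismatched pixels."""
        mse = (result != gt).mean()
        return 10.0 * np.log10(1.0 / max(mse, 1e-12))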


 
 
  Publications:  
   

[1] S. Lu, B. Su, and C. L. Tan, "Document Image Binarization Using Background Estimation," submitted to IJDAR.

[2] B. Su, S. Lu, and C. L. Tan, "Binarization of Historical Document Images Using the Local Maximum and Minimum," International Workshop on Document Analysis Systems (DAS 2010), 9-11 June 2010, Boston, MA, USA.