
Exemplar-based inpainting using local binary patterns

V.V. Voronin^a, V.I. Marchuk^a, N.V. Gapon^a, R.A. Sizyakin^a, A.I. Sherstobitov^a and K.O. Egiazarian^b

^a Dept. of Radio-Electronics Systems, Don State Technical University, Shakhty, Russian Federation
^b Dept. of Signal Processing, Tampere University of Technology, Tampere, Finland

ABSTRACT

This paper focuses on a novel image reconstruction method based on a modified exemplar-based technique. The basic idea
is to find an example (patch) in the image using local binary patterns and to replace the missing ('lost') data with it.
We propose to use multiple criteria for the patch similarity search, since in practice existing exemplar-based methods
often produce unsatisfactory results. The criterion for searching the best match combines several terms, including the
Euclidean metric for pixel brightness and the Chi-squared histogram matching distance for local binary patterns. The
combined use of textural-geometric characteristics together with color information provides a more informative description
of the patches. The texture synthesis method proposed by Efros and Freeman is used for patch restoration in the proposed
method; it optimizes the overlap region between patches using a minimum error boundary cut. Several examples
considered in this paper show the effectiveness of the proposed approach for the removal of large objects as well as the
recovery of small regions on several test images.

Keywords: image inpainting, interpolation, exemplar-based method, local binary pattern, texture synthesis.

1. INTRODUCTION
Image inpainting and, more generally, image reconstruction are very important problems in image processing. The ultimate
goal of image inpainting is to restore a missing (damaged) region of "empty" pixels in a visually plausible manner
using information from outside the damaged domain. Automatic restoration of damaged or missing pixels is an image
reconstruction problem arising in many practical applications, such as retouching of digital photographs, image
restoration, image coding, computer vision, etc. This technique can be used to restore old photographs or damaged films.
It can also remove superimposed text such as dates and subtitles, or even unwanted objects, from the image.
Most image reconstruction methods can be classified into the following groups: geometry-based [1-8], statistics-based [9-
14], sparsity-based [15-21], edge-based [22-27] and exemplar-based [28-37] methods.
The geometry-based (also called diffusion-based) methods compute an optimal solution based on partial differential
equations (PDEs). In these methods, missing pixels are restored by smooth propagation of information from the
neighborhood of the holes via a diffusion process. The drawback of these methods is that they often blur sharp
transitions and image contours, and they need a priori information for parameter selection. These methods are
suitable for removing scratches and small defects in the image structure, but often fail to restore texture
and curved contours.
Statistics-based methods use Markov random fields, probabilistic techniques and texture synthesis. These algorithms are
more suitable for stochastic texture reconstruction and often fail to recover structure from boundary data.
Sparsity-based methods use frequency-domain processing and a sparse representation model. It should be noted that
they often tend to blur texture and structure when recovering large areas of missing pixels. These iterative methods are
also computationally expensive.
Recently, various image inpainting methods based on combined edge propagation and texture reconstruction have
been developed. They show high-quality results and are suitable for structure and texture reconstruction, but they
require significant computational time to inpaint large missing regions in images.
The exemplar-based methods use a non-parametric sampling model and texture synthesis. Criminisi et al. [28] have
proposed a patch-based method based on searching for similar patches and copying them from the true image. These methods
are limited to inpainting linear structures and often fail to recover curved boundary edges. Often an image does not contain enough


patches to copy from, because the patch size is large or the mask is placed on a singular location of the image where
similar patches cannot be found. The problem of choosing the distance and the space for patch similarity is common to
all exemplar-based inpainting methods. We tackle this problem by optimizing the search in the patch texture space.
Summarizing, the main drawbacks of the known methods come from the fact that most of them are unable to recover
curved edges and are applicable only to the removal of scratches and small defects. It should also be noted that these
methods often blur the image when recovering large areas of missing pixels. Most of these methods are computationally
very demanding and unsuitable for implementation on modern mobile platforms.
In this paper we introduce a novel algorithm for automatic image inpainting based on a modified exemplar-based
technique. The key idea is to find examples (patches) in the image using a multi-criteria search procedure with
several terms, including the Euclidean metric for pixel brightness and the Chi-squared histogram matching distance for
local binary patterns. For patch restoration, the texture synthesis method proposed by Efros and Freeman [13] is used
instead of a copy-and-paste procedure. This allows optimizing the overlap region between patches using a minimum error
boundary cut.
The rest of the paper is organized as follows. Section 2 describes the proposed method, followed by a description
of the basic idea of the proposed approach for finding patches in the true image using color and texture descriptors.
Section 3 presents the key steps of the proposed algorithm. Experimental results are given in Section 4, followed by the
Conclusion.

2. THE PROPOSED METHOD


2.1. An image model
A simplified mathematical model of the original image can be represented as follows:

$$Y_{i,j} = (1 - M_{i,j}) \cdot S_{i,j} + M_{i,j} \cdot R_{i,j}, \quad i = \overline{1,I},\ j = \overline{1,J}, \qquad (1)$$

where $S_{i,j}$ are the 'true' image pixels; $M = \{M_{i,j}\}$ is a binary mask of distorted pixel values (1 corresponds to the missing
pixels, 0 to the true pixels); $R_{i,j}$ are the missing pixels; $I$ is the number of rows; and $J$ is the number of columns.

In this paper we use the following notation: $|A|$ is the number of elements of a finite set $A$; $\mathbb{Z}$ is the set of integers; $\mathbb{R}$ is
the set of real numbers; $\mathbb{R}^2$ is the set of points of the coordinate plane; $x = (i,j) \in \mathbb{Z}^2$ denotes the points of $\mathbb{R}^2$ with integer
coordinates. An image is a set on the plane $\mathbb{R}^2$ whose pixels $Y(x)$, $x \in \mathbb{Z}^2$, constitute the image data.
Formally, we define a function $S(x)$ on the domain $\mathbb{Z}^2$ which takes only integer values in the interval $[0;255]^{dim}$: when
the image is represented by gray-value intensity $dim = 1$, and for color images $dim = 3$. The value $S(x)$ is an estimate
of the brightness of the pixel $Y(x)$ in the image.
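As a minimal illustration of the image model (1), the following Python sketch (not part of the original paper) composes an observed image from a true image, a binary mask and arbitrary values in the missing region; the array names and sizes are assumptions.

```python
import numpy as np

def compose_observed_image(S, M, R):
    """Image model (1): Y = (1 - M) * S + M * R.

    S -- 'true' image (I x J array); M -- binary mask (1 = missing pixel);
    R -- values occupying the missing pixels (same shape as S)."""
    return (1.0 - M) * S.astype(np.float64) + M * R

# Example: a 600 x 600 gray-level image with a 30 x 30 block of missing pixels
S = np.random.randint(0, 256, (600, 600)).astype(float)
M = np.zeros((600, 600))
M[100:130, 200:230] = 1            # mask of distorted values
R = np.zeros_like(S)               # missing pixels carry no information
Y = compose_observed_image(S, M, R)
```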

Figure 1 shows the image model, where the region $Y$ is schematically presented as two sub-regions representing two types
of texture, $\delta S$ is the boundary of the recovery region, and $R$ denotes the missing pixels.

Figure 1. The image model.

SPIE-IS&T/ Vol. 9019 901907-2

Downloaded From: http://proceedings.spiedigitallibrary.org/ on 05/15/2015 Terms of Use: http://spiedl.org/terms


2.2. The proposed method
The proposed algorithm is a modification of the exemplar-based image inpainting algorithm proposed by Criminisi et al.
[28]. The exemplar-based method (EBM) uses a priority function to select the best patch and copies it into the fill-in
region. Such an approach propagates edges and structure first.
The main drawbacks of the EBM include:
- visible boundaries between similar patches on the reconstructed image;
- incorrect restoration in the absence of similar blocks;
- a dependence of the reconstruction error on the block size.
In [31] we have proposed an algorithm which overcomes some of these drawbacks. This modification of the exemplar-based
method allows a sub-optimal choice of an image-adaptive shape and size of the block used to find similar patches, the
number of which is further increased by rotating these blocks.
One of the major problems of the original inpainting method is the search for the patch with the maximum similarity
to a selected patch using the mean squared error metric. Figure 2 shows an example where the patch $\Psi_{q(1)}$ is closer to the
patch $\Psi_p$ (in the mean squared sense) than $\Psi_{q(2)}$. As a result, the algorithm produces a visually poor result (the metric
is defined only over the true pixels of $\Psi_p$). Thus, a best-match criterion such as the Euclidean metric for
pixel brightness may lead to incorrect reconstruction for some images, since the method ignores the textural
characteristics of the patches.

Figure 2. An incorrect restoration.

The purpose of this work is to modify the exemplar-based method in order to overcome the above-mentioned drawbacks of
the EBM.

The pixels belonging to the boundary of the recovery region will be denoted by $\delta S_{i,j}$, where $i = \overline{1,N}$, $j = \overline{1,M}$.

At the first step of the algorithm, for each boundary pixel $\delta S$ we choose a square block in order to find similar patches.

At the second step, a priority value $P(\delta S)$ is computed for each pixel of the boundary; it consists of two
factors (see Figure 3):

$$P(\delta S) = C(\delta S) \cdot D(\delta S), \qquad (2)$$

$$C(\delta S) = \frac{\sum_{l \in \Psi_{\delta S}} C(l)}{|\Psi_{\delta S}|}, \qquad D(\delta S) = \frac{\left| \nabla I_{\delta S}^{\perp} \cdot n_{\delta S} \right|}{\alpha}, \qquad (3)$$

SPIE-IS&T/ Vol. 9019 901907-3

Downloaded From: http://proceedings.spiedigitallibrary.org/ on 05/15/2015 Terms of Use: http://spiedl.org/terms


where Si , j is the current pixel on the boundary of true pixels; С (S ) is the confidence term; D(S ) is the data term;
S is the adaptive patch of pixels centered at the pixel Si , j ; S is the number of pixels in the adaptive patch S ;
I S is a vector orthogonal to the gradient at the point Si , j ; nS is a vector orthogonal to the boundary S at the point
Si , j ; and  is a normalization factor (   255 for a typical grey-level image)).

The value of the confidence term С for the pixels of the true region S is 1 and for the missing region  is 0.

S
S
p S
p

IS nS
R
Figure 3. The orthogonal vectors

Calculating the priority using expression (2) gives larger weights to pixels that lie on brightness differences (the
boundaries), so that they are restored first. The confidence term assigns lower weights to restored pixels as the
distance from the true pixels of the region $S$ increases.
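As an illustrative sketch only (not the authors' code), the priority terms of Eqs. (2)-(3) could be computed at a boundary pixel roughly as follows; the function name, the finite-difference gradient estimate and the way the boundary normal is obtained from the mask are assumptions.

```python
import numpy as np

def priority_terms(confidence, gray, mask, p, patch_half, alpha=255.0):
    """Sketch of the priority P = C * D of Eqs. (2)-(3) at a boundary pixel p.

    confidence -- current confidence map C (1 on true pixels, 0 on missing ones)
    gray       -- gray-level image used to estimate the isophote direction
    mask       -- binary mask (1 = missing pixel)
    p          -- (i, j) coordinates of the boundary pixel
    """
    i, j = p
    sl = (slice(i - patch_half, i + patch_half + 1),
          slice(j - patch_half, j + patch_half + 1))
    # Confidence term: mean confidence over the patch centred at p
    C = confidence[sl].sum() / confidence[sl].size

    # Data term: |isophote . normal| / alpha, using simple finite differences
    gy, gx = np.gradient(gray.astype(float))
    isophote = np.array([-gy[i, j], gx[i, j]])   # orthogonal to the gradient
    ny, nx = np.gradient(mask.astype(float))
    normal = np.array([ny[i, j], nx[i, j]])      # orthogonal to the fill front
    norm = np.linalg.norm(normal)
    if norm > 0:
        normal = normal / norm
    D = abs(isophote @ normal) / alpha
    return C * D
```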
Let us define the pixel $p = (i, j)$ with the maximal priority $\max(P(\delta S_{i,j}))$ on the border $\delta S$ and the patch $\Psi_p$ with
a color intensity change (Figure 3). On the true image $S$ we find the patch $\Psi_q$ for which the Euclidean metric is minimal:

$$D_E(\Psi_p, \Psi_q) = \sum (\Psi_p - \Psi_q)^2 \rightarrow \min. \qquad (4)$$

We propose to modify the exemplar-based method to search for examples (i.e. patches) in the image using multiple search
criteria, of which the Euclidean metric $D_E(\Psi_p, \Psi_q)$ is the first.

At the next step, we utilize local binary patterns (LBP) [39] as a texture descriptor for patches in the image. LBP gives
a subset of uniform patterns which combine texture and structure properties and is invariant to rotation and to changes of
gray scale.
The LBP operator is computed by thresholding each pixel around the central pixel of a local 3x3 pattern, transforming it
into an 8-bit binary code (see Figure 4).

Figure 4. The LBP operator: the 3x3 neighborhood $g_1, \ldots, g_8$ is thresholded against the central pixel $g_0$ to produce a binary code.
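As a hedged illustration (not the authors' implementation), the basic 3x3 LBP operator of Figure 4 can be sketched in Python as follows:

```python
import numpy as np

def lbp_3x3(gray):
    """Basic LBP operator: threshold the 8 neighbours of each pixel against the
    centre g0 and pack the results into an 8-bit code (border pixels are skipped)."""
    gray = gray.astype(np.int32)
    h, w = gray.shape
    codes = np.zeros((h, w), dtype=np.uint8)
    # Offsets of the 8 neighbours, enumerated clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if gray[i + di, j + dj] >= gray[i, j]:
                    code |= 1 << bit
            codes[i, j] = code
    return codes
```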

The dimensionality of the LBP distribution can be effectively decreased by reducing the number of bins. Ojala
et al. showed that most of the local binary patterns in natural images are uniform. The modified version of LBP proposed



in [39] uses a uniformity measure, defined as the number of spatial transitions between 1 and 0 in the pattern. In many cases,
uniform binary patterns correspond to primitive image features such as edges, corners and spots.
The modified uniform LBP (Figure 5) can be written as follows:

$$LBP_{P,K} = \begin{cases} \sum_{p=1}^{P} f(g_p - g_0), & \text{if } U \le U_T \\ P + 1, & \text{otherwise,} \end{cases} \qquad f(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0, \end{cases} \qquad (5)$$

where $P$ is the number of neighbors; $K$ is the radius of the neighborhood; $U$ is the number of spatial transitions between 1 and 0
in the pattern; and $U_T$ is the uniformity threshold.
Among all 256 possible binary patterns formed by the LBP operator, only the 9 uniform patterns (depicted in Figure 5) are used
for analysis. The histogram of uniform patterns in a local region therefore provides a powerful descriptor for analyzing
the local regions of the image. The proposed approach can be used for multi-scale analysis of patches based on LBP
with different radii. A histogram of brightness is not suitable in this case, since different images (see Figure 6) may
have similar histograms.

Figure 5. The uniform LBP patterns (bins 0-8).

Figure 6. The histogram of brightness
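A possible realization of the uniform-LBP histogram of Eq. (5) for P = 8, K = 1 is sketched below; the bin layout (9 uniform bins plus one bin for non-uniform patterns) follows the description above, while the function name and the threshold U_T = 2 are assumptions.

```python
import numpy as np

def uniform_lbp_hist(gray, U_T=2):
    """Normalized histogram of the modified uniform LBP of Eq. (5), P = 8, K = 1.

    Patterns with at most U_T transitions between 0 and 1 are mapped to the number
    of '1' bits (bins 0..8); all other patterns are collected in bin P + 1 = 9."""
    gray = gray.astype(np.int32)
    h, w = gray.shape
    hist = np.zeros(10, dtype=np.float64)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # circular neighbour order
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            bits = [1 if gray[i + di, j + dj] >= gray[i, j] else 0
                    for di, dj in offsets]
            U = sum(bits[p] != bits[(p + 1) % 8] for p in range(8))
            label = sum(bits) if U <= U_T else 9
            hist[label] += 1
    return hist / hist.sum()
```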

The histogram is formed for every local area $\Psi_p$, $\Psi_q$ and used as a feature descriptor. To determine the correspondence
between the histograms, several distance measures have been proposed in [40]:

- the histogram intersection (assuming normalized histograms):

$$HistInt(h_p, h_q) = 1 - \sum_{m=1}^{K} \min\left(h_p(m), h_q(m)\right), \qquad (6)$$

- the Bhattacharyya measure:

$$Bh(h_p, h_q) = \sum_{m=1}^{K} \sqrt{h_p(m) \cdot h_q(m)}, \qquad (7)$$

- the Chi-squared histogram matching distance:

$$\chi^2(h_p, h_q) = \frac{1}{2} \sum_{m=1}^{K} \frac{\left(h_p(m) - h_q(m)\right)^2}{h_p(m) + h_q(m)}, \qquad (8)$$

where $h_p$ and $h_q$ are the LBP histograms of the patches $\Psi_p$ and $\Psi_q$, respectively.

In this paper we have chosen the Chi-squared distance for histogram matching, since it works better when nearby
bins represent similar values.
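A minimal sketch of the chosen Chi-squared histogram matching distance of Eq. (8); the small epsilon added to avoid division by zero in empty bins is an assumption, not part of the original formula:

```python
import numpy as np

def chi_squared_distance(h_p, h_q, eps=1e-12):
    """Chi-squared histogram matching distance, Eq. (8), for normalized histograms."""
    h_p = np.asarray(h_p, dtype=np.float64)
    h_q = np.asarray(h_q, dtype=np.float64)
    return 0.5 * np.sum((h_p - h_q) ** 2 / (h_p + h_q + eps))
```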
For the multiple criteria, we use the Euclidean metric for pixel brightness and the Chi-squared histogram matching
distance for local binary patterns in the form:

$$F = (1 - \beta) \cdot D_E(\Psi_p, \Psi_q) + \beta \cdot \chi^2(h_p, h_q) \rightarrow \min, \qquad (9)$$

where $\beta$ is a weight coefficient ($\beta = 0.25$ for a typical image).
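The combined criterion of Eq. (9) could then be evaluated for each candidate patch as in the sketch below; it reuses the chi_squared_distance and uniform_lbp_hist sketches above, and the scaling of the Euclidean term to a range comparable with the histogram distance is an assumption.

```python
import numpy as np

def combined_criterion(patch_p, patch_q, hist_p, hist_q, valid, beta=0.25):
    """Multi-criteria score F of Eq. (9) for one candidate patch.

    patch_p, patch_q -- target and candidate patches (same shape)
    hist_p, hist_q   -- their normalized uniform-LBP histograms
    valid            -- boolean mask of the known (true) pixels of patch_p
    beta             -- weight between the brightness and texture terms
    """
    diff = (patch_p.astype(float) - patch_q.astype(float))[valid]
    d_e = np.sum(diff ** 2) / (valid.sum() * 255.0 ** 2)   # assumed normalization
    chi2 = chi_squared_distance(hist_p, hist_q)
    return (1.0 - beta) * d_e + beta * chi2

# The best match Psi_q minimizes F over all candidate patches in the true region.
```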

At the third step of the algorithm, we find the patch $\Psi_q$ in the true image $S$ for which the multiple criteria $F$ is minimal
(see Figure 7).

Figure 7. Searching for similar patches.

To reduce the visibility of boundaries between similar patches on the reconstructed image, we use the texture synthesis
algorithm proposed by Efros and Freeman instead of the simple copy-and-paste procedure of the EBM. It allows optimizing
the overlap region between patches using a minimum error boundary cut (Figure 8) [13].

Figure 8. Minimum error boundary cut versus copy-and-paste.
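As a hedged sketch in the spirit of Efros and Freeman's image quilting (not the authors' implementation), a vertical minimum error boundary cut through the overlap of two patches can be found by dynamic programming:

```python
import numpy as np

def min_error_boundary_cut(overlap_a, overlap_b):
    """For each row of the overlap region, return the column of the minimal-error
    cut; pixels left of the cut are taken from patch A, pixels to the right from B."""
    err = (overlap_a.astype(float) - overlap_b.astype(float)) ** 2
    h, w = err.shape
    cost = err.copy()
    # Accumulate the minimal cumulative error from top to bottom
    for i in range(1, h):
        for j in range(w):
            lo, hi = max(j - 1, 0), min(j + 2, w)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # Backtrack the cheapest path from the bottom row upwards
    cut = np.zeros(h, dtype=int)
    cut[-1] = int(np.argmin(cost[-1]))
    for i in range(h - 2, -1, -1):
        j = cut[i + 1]
        lo, hi = max(j - 1, 0), min(j + 2, w)
        cut[i] = lo + int(np.argmin(cost[i, lo:hi]))
    return cut
```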

The confidence value of the restored pixels is set to the current value $C(p)$. After this, the procedure of priority
calculation and search for similar patches with subsequent replacement is repeated.

3. INPAINTING ALGORITHM
We summarize the algorithm in the following scheme (Figure 9).



Input image and mask → calculating the priority → defining a pixel with maximum priority → defining a restored patch;
calculating the LBP → forming a histogram; calculating the Euclidean metric for the true pixels and the restored patch →
calculating the Chi-squared histogram matching → searching for a similar patch using the multiple criteria → patch
restoration using the minimum error boundary cut (this block is repeated for all restored pixels) → the restored image.

Figure 9. The proposed inpainting algorithm.

Below we describe all the building blocks of the proposed algorithm.


1. Calculating the priority
A priority value is computed for each pixel of the boundary; it consists of the confidence term and the data term.
Calculating the priority using expression (2) gives larger weights to pixels that lie on brightness differences (the
boundaries), so that they are restored first. The confidence term assigns lower weights to restored pixels as the
distance from the true pixels of the region increases.
2. Defining a pixel with a maximum priority
Define the pixel with the maximum priority on the border of the missing region.
3. Defining a restored patch
Define the square patch $\Psi_p$ centered at the pixel with the maximum priority.

4. Calculating Euclidean metrics for true pixels and restored patch



On the true image $S$ we find a patch $\Psi_q$ for which the Euclidean metric $\sum (\Psi_p - \Psi_q)^2 \rightarrow \min$ is minimal.

5. Calculating the LBP
We utilize the local binary patterns (LBP) as a texture descriptor for patches in the image. The basic LBP operator is
computed by thresholding each pixel around the central pixel of a local 3x3 pattern.
6. Forming a histogram
Among all possible binary patterns formed by the LBP operator, only the 9 uniform patterns are used for analysis. The
histogram of the uniform patterns in a local region therefore forms a powerful descriptor used to analyze the local
regions of the image. In our scheme the histogram is formed for every local area $\Psi_p$, $\Psi_q$ and used as a feature descriptor.

7. Calculating the Chi-squared histogram matching
To determine the correspondence between the histograms, we calculate the Chi-squared histogram matching distance between
the LBP histograms of the true pixels and the restored patch.
8. Searching for a similar patch using multiple criteria
A multi-criteria search is used to find a similar patch. This allows finding the best match using several
normalized terms, namely the Euclidean metric for pixel brightness and the Chi-squared histogram matching distance for
local binary patterns.
9. Patch restoration using minimum error boundary cut
For patch restoration we use the texture synthesis algorithm, which optimizes the overlap region between patches using
the minimum error boundary cut. The confidence value of the restored pixels is set to the current value $C(p)$. After this,
the procedure of priority calculation and search for similar patches with subsequent replacement is repeated.

4. EXPERIMENTAL RESULTS
A proper selection of the coefficient $\beta$ in the multiple criteria is a very important step in the proposed method, since it
determines its effectiveness. This weighting factor can vary from 0 to 1. If $\beta = 0$, only the Euclidean metric is used as
the proximity measure for searching a similar patch; if $\beta = 1$, only the Chi-squared histogram matching distance for local
binary patterns is used. It is of interest to study the priority of one criterion over the other on test images in order
to select the best value of the weight coefficient $\beta$. For this purpose, an experiment was conducted for the test images
from two different databases (textures and real images). The image resolution is 600 by 600 pixels. The mask of missing
pixels is modeled by applying a uniform grid of blocks of size 30 by 30 pixels. The quality of the reconstructed image is
estimated by the RMSE:

$$RMSE = \sqrt{\frac{1}{IJ}\sum_{i=1}^{I}\sum_{j=1}^{J}\left(\hat{S}_{i,j} - S_{i,j}\right)^{2}}, \quad i = \overline{1,I},\ j = \overline{1,J}.$$

Figure 10 shows the average dependence RMSE($\beta$) for each group of test images.
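For reference, the RMSE quality measure can be computed as in the following short sketch (the array names are hypothetical):

```python
import numpy as np

def rmse(restored, original):
    """Root mean squared error between the restored and the original image."""
    diff = restored.astype(np.float64) - original.astype(np.float64)
    return np.sqrt(np.mean(diff ** 2))
```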

Analysis of the results presented in Figure 10 shows that the error for textures is lower than the error for real images,
because texture images have self-similar local regions which can be effectively reconstructed by exemplar-based
methods. The RMSE initially decreases as $\beta$ grows from 0 to 0.3 and reaches a local minimum, after which it increases as
$\beta$ approaches 1 and reaches its maximum. These experimental results demonstrate that the modified exemplar-based method
reduces the image reconstruction error on average by 18% when the weight coefficient is set to $\beta = 0.25$.



(Plot of RMSE versus $\beta$ for $\beta$ from 0 to 1, with separate curves for the texture images and the real images.)

Figure 10. Dependence of RMSE($\beta$).

Figure 11 gives examples of image restoration from the two databases (a,e - original images, b,f - images with missing
pixels, c,g - images reconstructed by the EBM, d,h - images reconstructed by the proposed method). The results show
that the proposed method restores the structure and texture regions more correctly.

Figure 11. Restoration of the test images (a,e - original images, b,f - images with missing pixels, c,g - images reconstructed by the
EBM, d,h - images reconstructed by the proposed method).

Figure 12 presents the results of object removal from an image (a,e - original images, b,f - images with a mask to remove
the object, c,g - images reconstructed by the EBM, d,h - images reconstructed by the proposed method).
The boundary pixel values of the objects are correctly restored by the proposed method. It is also worth
noting that the method does not smear the texture and structure during restoration of large areas of missing pixels. In
contrast, the EBM produces visible distortions of the background area.



Figure 12. Object removal on the test images (a,e - original images, b,f - images with a mask to remove the object, c,g - images
reconstructed by the EBM, d,h - the images reconstructed by the proposed method).

CONCLUSION
This paper presented an image inpainting algorithm based on a modified exemplar-based technique. A multi-criteria search
for a similar patch is proposed which combines two terms: the Euclidean metric for pixel brightness and the Chi-squared
histogram matching distance for local binary patterns. Using textural-geometric characteristics along with the
color information results in a better description of the patches. For patch restoration, the texture synthesis method
proposed by Efros and Freeman has been used in order to optimize the overlap region between patches using the
minimum error boundary cut. Several examples presented in this paper demonstrate the effectiveness of the algorithm in
removing selected objects from images. In the reconstruction of missing small and large objects on the test images, the
proposed technique produces visually preferable results compared to the exemplar-based method.

REFERENCES

[1] Oliveira, M.M., Bowen, B., McKenna, R., and Chang, Y.-S., “Fast digital inpainting,” in Proc. Int. Conf. Vis.,
Imaging Image Process., Sep., 261–266 (2001).
[2] Masnou, S. and Morel, J.-M., “Level lines based disocclusion,” in Proc. Int. Conf. Image Process., 259–263
(1998).
[3] Ballester, C., Bertalmio, M., Caselles, V., Sapiro, G., and Verdera, J., “Filling-in by joint interpolation of vector
fields and grey levels,” IEEE Trans. Image Process., vol. 10, no. 8, 1200–1211 (2001).
[4] Chan, T.F. and Shen, J., “Mathematical models for local non-texture inpainting,” SIAM J. Appl. Math., vol. 62,
no. 3, 1019–1043 (2001).
[5] Rudin, L., Osher, S., and Fatemi, E., “Nonlinear total variation based noise removal algorithms,” Physica D,
vol. 60, 259–268 (1998).
[6] Shen, J., “Inpainting and the fundamental problem of image processing,” SIAM News, vol. 36, no. 5, 1–4 (2003).
[7] Bertalmio, M., Sapiro, G., Caselles, V., and Ballester, C., “Image inpainting,” in Proc. Comput. Graph.
(SIGGRAPH2000), 417–424 (2000).



[8] Bertalmio, M., Bertozzi, A.L., and Sapiro, G., “Navier–Stokes, fluid dynamics, and image and video
inpainting,” Proc. IEEE Conf.Comput. Vis. Pattern Recognition, 355-362 (2001).
[9] Paget, Rupert and Longstaff, I. D., “Texture Synthesis via a Noncausal Nonparametric Multiscale Markov
Random Field,” IEEE Transactions on Image Processing, vol. 7, no. 6 (1998).
[10] Shen, Huanfeng, Ai, Tinghua, Li, Pingxiang, “A MAP-Based Algorithm for Destriping and Inpainting of
Remotely Sensed Images,” Geoscience and Remote Sensing, IEEE Transactions on, Volume:47, Issue: 5, 1492
– 1502 (2009).
[11] Grim, Jiri, Somol, Petr, Pudil, Pavel, Miková, Irena, Malec, Miroslav, “Texture oriented image inpainting
based on local statistical model,” Proceedings of the 10th IASTED International Conference Signal and Image
Processing (SIP 2008), 18-20 (2008).
[12] Chan, T. F. and Shen, J., “Mathematical Models for Local Non Texture Inpaintings,” SIAM Journal on Applied
Mathematics 62(3):1019–1043 (2002).
[13] Efros, Alexei A., Freeman, William T., “Image quilting for texture synthesis and transfer,” Proceeding
SIGGRAPH '01 Proceedings of the 28th annual conference on Computer graphics and interactive techniques,
341-346 (2001).
[14] Wei, Li-Yi, Levoy, Marc, “Fast texture synthesis using tree-structured vector quantization,” Proceeding
SIGGRAPH '00 Proceedings of the 27th annual conference on Computer graphics and interactive techniques,
479-488 (2000).
[15] Hirani, A. and Totsuka, T., “Combining Frequency and spatial domain information for fast interactive image
noise removal,” Computer Graphics, SIGGRAPH96, 269-276 (1996).
[16] Starck, J.L., Elad, M., and Donoho, D.L., "Image decomposition via the combination of sparse representations
and a variational approach", the IEEE Trans. On Image Processing, Vol. 14, No. 10, 1570-1582, (2005).
[17] Starck, J.-L., Elad, M., and Donoho, D.L., "Redundant Multiscale Transforms and their Application for
Morphological Component Analysis", the Journal of Advances in Imaging and Electron Physics, Vol. 132, 287-
348 (2004).
[18] Fadili, M.J., Starck, J.-L., Murtagh, F., “Inpainting and Zooming using Sparse Representations,” The Computer
Journal, 52 (1): 64-79 (2009).
[19] Xin, Li, “Image Recovery via Hybrid Sparse Representations: a Deterministic Annealing Approach,” Selected
Topics in Signal Processing, IEEE Journal of , (Volume:5 , Issue: 5 ), 953 – 962 (2011).
[20] Koh, Min-Sung, Rodriguez-Marek, E., “Turbo inpainting: Iterative K-SVD with a new dictionary,” Multimedia
Signal Processing, 2009. MMSP '09. IEEE International Workshop on, 1 – 6 (2009).
[21] Guleryuz, O.G. , "Nonlinear approximation based image recovery using adaptive sparse reconstructions and
iterated denoising," Part I: theory, IEEE Transactions on Image Processing 15(3), (2006).
[22] Frederic, Cao, Yann, Gousseau, Masnou, Simon, and Patrick, Perez, “Geometrically Guided Exemplar-Based
Inpainting,” SIAM J. Imaging Sciences, vol. 4, No. 4, 1143–1179 (2011).
[23] Dong, Liu, Xiaoyan, Sun, Feng, Wu, “Intra Prediction via Edge-Based Inpainting,” Proceeding DCC
'08 Proceedings of the Data Compression Conference, 282-291, (2008).
[24] Yan, Chen, Qing, Luan, Li, Houqiang, Au, O., “Sketch-Guided Texture-Based Image Inpainting,” Image
Processing, 2006 IEEE International Conference on, 1997 – 2000 (2006).
[25] Jason C. Hung, Chun-Hong Hwang, Yi-Chun Liao, Nick C. Tang, Ta-Jen Chen, “Exemplar-based Image
Inpainting base on Structure Construction,” Journal of Software, vol. 3, no. 8, 53-64 (2008).
[26] Saito, Takahiro, Ishii, Yuki, Nakagawa, Yousuke, and Komatsu, Takashi, “Adaptable image interpolation with
skeleton - texture separation,” Image Processing, 2006 IEEE International Conference on, 681 – 684 (2006).
[27] Voronin, V.V., Frantc, V.A., Marchuk, V.I., Egiazarian, K.O., “Fast texture and structure image reconstruction
using the perceptual hash,” Image Processing: Algorithms and Systems XI; and Parallel Processing for Imaging
Applications II, edited by Karen O. Egiazarian, Sos S. Agaian, Atanas P. Gotchev, Proceedings of SPIE Vol.
8655 (SPIE, Bellingham, WA 2013) 86550V.
[28] Criminisi, A., Perez, P., and Toyama, K., “Region filling and object removal by exemplar-based image
inpainting,” IEEE Transactions on Image Processing 13, 1200–1212 (2004).
[29] Yongbo, Qinm, Feng, Wang, “A curvature constraint Exemplar-based image inpainting,” Image Analysis and
Signal Processing (IASP), 2010 International Conference on, 263 – 267 (2010).
[30] Meur, Le, Gautier O.J., and Guillemot, C., “Examplar-based inpainting based on local geometry,” in ICIP,
3462–3465 (2011).



[31] Voronin, V. V., Marchuk, V. I., and Egiazarian, K. O., “Images reconstruction using modified exemplar based
method,” in SPIE Electronic Imaging, Vol. 7870 (2011).
[32] Wong, A. and Orchard, J., “A nonlocal-means approach to exemplar-based inpainting,” in ICIP, pp. 2600–2603
(2008).
[33] Aujol, J.-F., Ladjal, S., and Masnou, S., “Exemplar-based inpainting from a variational point of view,” SIAM J.
Math. Anal., 42(3):1246–85 (2010).
[34] Ružić, T., and Pižurica, A., "Texture and color descriptors as a tool for context-aware patch-based image
inpainting," in SPIE Electronic Imaging, Vol. 8295 (2012).
[35] Yunqiang, Liu and Vicent, Caselles, “Exemplar-Based Image Inpainting Using Multiscale Graph Cuts,” IEEE
Transactions on Image Processing, vol. 22, no. 5, 1699-1711 (2013).
[36] Arias, P., Facciolo, G., Caselles, V., and Sapiro, G., “A variational framework for exemplar-based image
inpainting,” International Journal of Computer Vision, 319–347 (2011).
[37] Barnes, C., Shechtman, E., Finkelstein, A., and Goldman, D.B., “Patch Match: a randomized correspondence
algorithm for structural image editing,” In Proc. of SIGGRAPH, New York, NY, USA, pp. 1–11 (2009).
[38] Varma, M., Zisserman, A., “A statistical approach to material classification using image patch exemplars,”
IEEE Transactions on Pattern Analysis and Machine Intelligence (2009)
[39] Ojala, T., Pietikäinen, M., Mäenpää, T., “Multiresolution gray scale and rotation invariant texture analysis with
Local Binary Patterns,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, 971–
987 (2002).
[40] Rubner, Y., Puzicha, J., Tomasi, C. and Buhmann J.M., “Empirical Evaluation of Dissimilarity Measures for
Color and Texture,” Computer Vision and Image Understanding 84, 25–43 (2001).
