Template Matching {#tutorial_template_matching}
=================
@tableofcontents
@prev_tutorial{tutorial_back_projection}
@next_tutorial{tutorial_find_contours}
| | |
| -: | :- |
| Original author | Ana Huamán |
| Compatibility | OpenCV >= 3.0 |
Goal
----
In this tutorial you will learn how to:
- Use the OpenCV function **matchTemplate()** to search for matches between an image patch and
an input image
- Use the OpenCV function **minMaxLoc()** to find the maximum and minimum values (as well as
their positions) in a given array.
Theory
------
### What is template matching?
Template matching is a technique for finding areas of an image that match (are similar) to a
template image (patch).
While the patch must be a rectangle, it may be that not all of the
rectangle is relevant. In such a case, a mask can be used to isolate the portion of the patch
that should be used to find the match.
### How does it work?
- We need two primary components:
-# **Source image (I):** The image in which we expect to find a match to the template image
-# **Template image (T):** The patch image which will be compared to the source image
Our goal is to detect the highest matching area:

- To identify the matching area, we have to *compare* the template image against the source image
by sliding it:

- By **sliding**, we mean moving the patch one pixel at a time (left to right, top to bottom). At
each location, a metric is calculated that represents how "good" or "bad" the match at that
location is (i.e. how similar the patch is to that particular area of the source image).
- For each location of **T** over **I**, you *store* the metric in the *result matrix* **R**.
Each location \f$(x,y)\f$ in **R** contains the match metric:

The image above is the result **R** of sliding the patch with the metric **TM_CCORR_NORMED**.
The brightest locations indicate the best matches. As you can see, the location marked by the
red circle is probably the one with the highest value, so that location (the rectangle formed by
that point as a corner, with width and height equal to the patch image) is considered the match.
- In practice, we locate the highest value (or lowest, depending on the matching method) in
the *R* matrix using the function **minMaxLoc()**. A minimal sketch of these two calls is shown
after this list.
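A rough, self-contained C++ sketch of these two steps follows. The file names `image.png` and
`template.png` are placeholders, and **TM_CCORR_NORMED** is just one possible choice of metric:
@code{.cpp}
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>
#include <iostream>

int main()
{
    // Placeholder file names: substitute your own source image and patch.
    cv::Mat img   = cv::imread("image.png",    cv::IMREAD_COLOR);
    cv::Mat templ = cv::imread("template.png", cv::IMREAD_COLOR);
    if (img.empty() || templ.empty())
        return 1;

    // R has one entry per possible template position:
    // size (img.cols - templ.cols + 1) x (img.rows - templ.rows + 1).
    cv::Mat result;
    cv::matchTemplate(img, templ, result, cv::TM_CCORR_NORMED);

    // For TM_CCORR_NORMED the best match is the global maximum of R.
    double minVal, maxVal;
    cv::Point minLoc, maxLoc;
    cv::minMaxLoc(result, &minVal, &maxVal, &minLoc, &maxLoc);

    std::cout << "Best match at " << maxLoc << " (score " << maxVal << ")" << std::endl;
    return 0;
}
@endcode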
### How does the mask work?
- If masking is needed for the match, three components are required:
-# **Source image (I):** The image in which we expect to find a match to the template image
-# **Template image (T):** The patch image which will be compared to the source image
-# **Mask image (M):** The mask, a grayscale image that masks the template
- Only two matching methods currently accept a mask: TM_SQDIFF and TM_CCORR_NORMED (see
below for an explanation of all the matching methods available in OpenCV).
- The mask must have the same dimensions as the template.
- The mask should have a CV_8U or CV_32F depth and the same number of channels
as the template image. In the CV_8U case, the mask values are treated as binary,
i.e. zero and non-zero. In the CV_32F case, the values should fall into the [0..1]
range and the template pixels will be multiplied by the corresponding mask pixel
values. Since the input images in the sample have the CV_8UC3 type, the mask
is also read as a color image. A hedged sketch of passing a mask to **matchTemplate()**
is shown after this list.
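A sketch of the masked call; the file names are placeholders, and the only change with respect to
the unmasked case is the extra argument:
@code{.cpp}
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>

int main()
{
    // Placeholder file names; the mask must match the template in size and channels.
    cv::Mat img   = cv::imread("image.png",    cv::IMREAD_COLOR);
    cv::Mat templ = cv::imread("template.png", cv::IMREAD_COLOR);
    cv::Mat mask  = cv::imread("mask.png",     cv::IMREAD_COLOR);
    if (img.empty() || templ.empty() || mask.empty())
        return 1;

    cv::Mat result;
    // The mask is honoured only by the methods that support it (TM_SQDIFF, TM_CCORR_NORMED).
    cv::matchTemplate(img, templ, result, cv::TM_CCORR_NORMED, mask);
    return 0;
}
@endcode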

### What are the matching methods available in OpenCV?
Good question. OpenCV implements template matching in the function **matchTemplate()**. Six
methods are available (a naive sketch of the first one follows the list):
-# **method=TM_SQDIFF**
\f[R(x,y)= \sum _{x',y'} (T(x',y')-I(x+x',y+y'))^2\f]
-# **method=TM_SQDIFF_NORMED**
\f[R(x,y)= \frac{\sum_{x',y'} (T(x',y')-I(x+x',y+y'))^2}{\sqrt{\sum_{x',y'}T(x',y')^2 \cdot \sum_{x',y'} I(x+x',y+y')^2}}\f]
-# **method=TM_CCORR**
\f[R(x,y)= \sum _{x',y'} (T(x',y') \cdot I(x+x',y+y'))\f]
-# **method=TM_CCORR_NORMED**
\f[R(x,y)= \frac{\sum_{x',y'} (T(x',y') \cdot I(x+x',y+y'))}{\sqrt{\sum_{x',y'}T(x',y')^2 \cdot \sum_{x',y'} I(x+x',y+y')^2}}\f]
-# **method=TM_CCOEFF**
\f[R(x,y)= \sum _{x',y'} (T'(x',y') \cdot I'(x+x',y+y'))\f]
where
\f[\begin{array}{l} T'(x',y')=T(x',y') - 1/(w \cdot h) \cdot \sum _{x'',y''} T(x'',y'') \\ I'(x+x',y+y')=I(x+x',y+y') - 1/(w \cdot h) \cdot \sum _{x'',y''} I(x+x'',y+y'') \end{array}\f]
-# **method=TM_CCOEFF_NORMED**
\f[R(x,y)= \frac{ \sum_{x',y'} (T'(x',y') \cdot I'(x+x',y+y')) }{ \sqrt{\sum_{x',y'}T'(x',y')^2 \cdot \sum_{x',y'} I'(x+x',y+y')^2} }\f]
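To connect the notation with code, below is a deliberately naive (and slow) re-implementation of
the **TM_SQDIFF** formula for single-channel float images. It is only meant to mirror the
equation; in practice use **matchTemplate()**, which is heavily optimized.
@code{.cpp}
#include <opencv2/core.hpp>

// Naive TM_SQDIFF for single-channel CV_32F images:
// R(x,y) = sum over (x',y') of (T(x',y') - I(x+x', y+y'))^2
cv::Mat naiveSqDiff(const cv::Mat& I, const cv::Mat& T)
{
    CV_Assert(I.type() == CV_32F && T.type() == CV_32F);
    cv::Mat R(I.rows - T.rows + 1, I.cols - T.cols + 1, CV_32F);
    for (int y = 0; y < R.rows; ++y)
    {
        for (int x = 0; x < R.cols; ++x)
        {
            float sum = 0.f;
            for (int yp = 0; yp < T.rows; ++yp)
                for (int xp = 0; xp < T.cols; ++xp)
                {
                    float d = T.at<float>(yp, xp) - I.at<float>(y + yp, x + xp);
                    sum += d * d;
                }
            R.at<float>(y, x) = sum; // lower values mean better matches for TM_SQDIFF
        }
    }
    return R;
}
@endcode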
Code
----
- **What does this program do?**
    - Loads an input image, an image patch (*template*), and optionally a mask
    - Performs the template matching procedure using the OpenCV function **matchTemplate()**
      with any of the 6 matching methods described before. The user can choose the method by
      selecting it with the Trackbar. If a mask is supplied, it will only be used for
      the methods that support masking
    - Normalizes the output of the matching procedure
    - Locates the position with the highest matching probability
    - Draws a rectangle around the area corresponding to the highest match
@add_toggle_cpp
- **Downloadable code**: Click
[here](https://github.com/opencv/opencv/tree/4.x/samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp)
- **Code at a glance:**
@include samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp
@end_toggle
@add_toggle_java
- **Downloadable code**: Click
[here](https://github.com/opencv/opencv/tree/4.x/samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java)
- **Code at a glance:**
@include samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java
@end_toggle
@add_toggle_python
- **Downloadable code**: Click
[here](https://github.com/opencv/opencv/tree/4.x/samples/python/tutorial_code/imgProc/match_template/match_template.py)
- **Code at a glance:**
@include samples/python/tutorial_code/imgProc/match_template/match_template.py
@end_toggle
Explanation
-----------
- Declare some global variables, such as the image, template and result matrices, as well as the
match method and the window names:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp declare
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java declare
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py global_variables
@end_toggle
- Load the source image, the template, and optionally (if supported by the matching method) a mask:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp load_image
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java load_image
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py load_image
@end_toggle
- Create the Trackbar to select the matching method to be used. When a change is detected,
the callback function is called (a hedged sketch of this wiring follows the snippets below).
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp create_trackbar
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java create_trackbar
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py create_trackbar
@end_toggle
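In sketch form, the wiring looks roughly like this; the window name, label text and function
names are illustrative, not quoted from the sample:
@code{.cpp}
#include <opencv2/highgui.hpp>

int match_method = 0; // written directly by the trackbar

// Callback invoked on every slider change; the actual matching is omitted in this sketch.
void MatchingMethod(int, void*)
{
    // re-run matchTemplate() with the new value of match_method here
}

void setupTrackbar()
{
    // Assumes a window named "Source Image" was created earlier with namedWindow().
    // Slider positions 0..5 map to the six matching methods.
    const char* label = "Method: \n 0: SQDIFF \n 1: SQDIFF NORMED \n 2: CCORR \n"
                        " 3: CCORR NORMED \n 4: CCOEFF \n 5: CCOEFF NORMED";
    cv::createTrackbar(label, "Source Image", &match_method, 5, MatchingMethod);
}
@endcode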
- Let's check out the callback function. First, it makes a copy of the source image:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp copy_source
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java copy_source
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py copy_source
@end_toggle
- Perform the template matching operation. The arguments are naturally the input image **I**,
the template **T**, the result **R** and the match_method (given by the Trackbar),
and optionally the mask image **M**.
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp match_template
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java match_template
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py match_template
@end_toggle
- We normalize the results (a short sketch of the rescaling follows the snippets below):
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp normalize
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java normalize
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py normalize
@end_toggle
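Assuming `result` holds the output of **matchTemplate()**, the rescaling amounts to a single call
along these lines; it maps the scores linearly into [0, 1] so **R** can be displayed as an image,
while the location of the extremum is unchanged:
@code{.cpp}
// Linear rescale into [0, 1]; the relative ordering of the scores is preserved.
cv::normalize(result, result, 0, 1, cv::NORM_MINMAX, -1, cv::Mat());
@endcode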
- We localize the minimum and maximum values in the result matrix **R** by using **minMaxLoc()**.
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp best_match
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java best_match
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py best_match
@end_toggle
- For the first two methods ( TM_SQDIFF and TM_SQDIFF_NORMED ) the best matches are the lowest
values. For all the others, higher values represent better matches. So, we save the
corresponding location in the **matchLoc** variable, as sketched after the snippets below:
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp match_loc
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java match_loc
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py match_loc
@end_toggle
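In sketch form, assuming `match_method`, `minLoc` and `maxLoc` from the previous steps:
@code{.cpp}
// Difference-based methods: the smallest value marks the best match.
// Correlation-based methods: the largest value marks the best match.
cv::Point matchLoc;
if (match_method == cv::TM_SQDIFF || match_method == cv::TM_SQDIFF_NORMED)
    matchLoc = minLoc;
else
    matchLoc = maxLoc;
@endcode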
- Display the source image and the result matrix, and draw a rectangle around the highest
matching area (a sketch of this step follows the snippets below):
@add_toggle_cpp
@snippet samples/cpp/tutorial_code/Histograms_Matching/MatchTemplate_Demo.cpp imshow
@end_toggle
@add_toggle_java
@snippet samples/java/tutorial_code/ImgProc/tutorial_template_matching/MatchTemplateDemo.java imshow
@end_toggle
@add_toggle_python
@snippet samples/python/tutorial_code/imgProc/match_template/match_template.py imshow
@end_toggle
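A hedged sketch of this final step; `img_display` (a copy of the source), `templ`, `result` and
`matchLoc` are assumed from the previous steps, and the window names are illustrative:
@code{.cpp}
// The matched region has its top-left corner at matchLoc and the size of the template.
cv::rectangle(img_display, matchLoc,
              cv::Point(matchLoc.x + templ.cols, matchLoc.y + templ.rows),
              cv::Scalar::all(0), 2, 8, 0);
cv::imshow("Source Image", img_display);
cv::imshow("Result window", result);
@endcode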
Results
-------
-# Testing our program with an input image such as:

and a template image:

-# Generate the following result matrices (the first row shows the standard methods SQDIFF, CCORR and
CCOEFF; the second row shows the same methods in their normalized versions). In the first column, the
darkest location is the best match; for the other two columns, the brighter a location, the better
the match.






-# The right match is shown below (black rectangle around the face of the guy on the right). Notice
that CCORR and CCOEFF gave erroneous best matches; however, their normalized versions got it
right. This may be because we only consider the single "highest match" and ignore the
other possible high matches.
