The 2012 IEEE Change Detection Workshop (CDW-2012) has now concluded. We would like to thank everyone who contributed and participated. The workshop opening talk, together with the challenge results and findings, can be found here: CDW-2012 overview. Please note that although the 2012 edition of the workshop has concluded, the dataset and benchmarking effort remain active. Please continue to upload your latest results and participate.

Overview and call for participation

This challenge aimed to initiate a rigorous and comprehensive academic benchmarking effort for testing and ranking existing and new change and motion detection algorithms, much like the Middlebury dataset does for optical flow and stereo vision. Thirteen algorithms were submitted to the challenge by the due date of April 15, 2012. Each algorithm was tested on the new dataset, which consists of 31 real-world videos (including thermal sequences) totalling over 80,000 frames and spanning 6 categories selected to cover diverse motion and change detection challenges. The dataset is representative of the indoor and outdoor visual data captured today in surveillance and smart-environment scenarios, and it includes a comprehensive set of carefully human-annotated ground-truth change/motion areas that enable a precise quantitative comparison and ranking of the algorithms. Source code was provided to compute all performance metrics reported below.

Researchers from academia and industry were invited to test their algorithms on this new dataset and submit a paper. Based on algorithm performance and novelty, and to ensure diversity of methods and coverage of video categories, 6 teams were invited to present their results at the workshop and to submit a paper to the CVPR proceedings. The accepted papers are listed in the program below.

Rules of the Challenge

  • The 2012 DATASET contained 6 video categories, with 4 to 6 video sequences in each category. Note that the database may have been expanded since CDW-2012. Results could be reported for one, several, or all video categories, but within any given category, results had to be reported for every sequence in that category. All else being equal, submissions with more complete results received higher consideration for acceptance.
  • A single set of tuning parameters had to be used for all videos (see the sketch after this list).
  • Matlab or Python programs were made available (UTILITIES) to compute the algorithm performance metrics.
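
To make the first two rules concrete, here is a minimal Python sketch of running one detector, with one fixed parameter set, over every sequence of a category. The folder layout, the parameter names, and the use of OpenCV's MOG2 subtractor as a stand-in detector are all illustrative assumptions; this is not the official UTILITIES code.

```python
import os
import cv2  # OpenCV, assumed available

# One hypothetical, fixed parameter set, reused unchanged for every video.
PARAMS = {"learning_rate": 0.01, "threshold": 30.0}

def process_category(dataset_root, category, params=PARAMS):
    """Run one detector with one parameter set on all sequences of a category.

    Assumes an illustrative layout: <dataset_root>/<category>/<sequence>/input/*.jpg,
    with binary masks written to a results/ folder next to each input folder.
    """
    category_dir = os.path.join(dataset_root, category)
    for sequence in sorted(os.listdir(category_dir)):
        in_dir = os.path.join(category_dir, sequence, "input")
        out_dir = os.path.join(category_dir, sequence, "results")
        os.makedirs(out_dir, exist_ok=True)

        # Any background-subtraction method could go here; MOG2 is just a stand-in.
        subtractor = cv2.createBackgroundSubtractorMOG2(varThreshold=params["threshold"])

        for name in sorted(os.listdir(in_dir)):
            frame = cv2.imread(os.path.join(in_dir, name))
            mask = subtractor.apply(frame, learningRate=params["learning_rate"])
            cv2.imwrite(os.path.join(out_dir, name), mask)
```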

Workshop held in conjunction with CVPR-2012
June 16, 2012
Rhode Island Convention Center, Providence, RI

Program

9:00 - 9:25 Opening talk, description of the dataset
9:25 - 9:50 M. Hofmann, P. Tiefenbacher, G. Rigoll "Background Segmentation with Feedback: The Pixel-Based Adaptive Segmenter"
9:50 - 10:15 A. Morde, X. Ma, S. Guler "Learning a background model for change detection"
10:15 - 10:45 Break
10:45 - 11:10 L. Maddalena, A. Petrosino "The SOBS algorithm: what are the limits?"
11:10 - 11:35 A. Schick, M. Bäuml, R. Stiefelhagen "Improving Foreground Segmentations with Probabilistic Superpixel Markov Random Fields"
11:35 - 12:00 M. Van Droogenbroeck, O. Paquot "Background Subtraction: Experiments and Improvements for ViBe"
12:00 - 1:00 Lunch
1:00 - 1:25 Y. Nonaka, A. Shimada, H. Nagahara, R. Taniguchi "Evaluation Report of Integrated Background Modeling Based on Spatio-temporal Features"
1:25 - 1:50 Invited Speaker: Chris Stauffer, BAE Systems
1:50 - 2:25 Invited Speaker: Hongcheng Wang, United Technologies Research Center
2:25 - 3:10 Panel discussion

Organizers

Results (June 16, 2012)

Results, all categories combined.

Method   Average ranking across categories   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision
SOBS [1] 8.83 8.86 0.7882 0.9818 0.0182 0.2118 2.5642 0.7159 0.7179
GMM | KaewTraKulPong [2] 10.50 10.00 0.5072 0.9947 0.0053 0.4928 3.1051 0.5904 0.8228
ViBe [3] 10.17 11.71 0.6821 0.9830 0.0170 0.3179 3.1178 0.6683 0.7357
KDE - ElGammal [4] 10.00 12.14 0.7442 0.9757 0.0243 0.2558 3.4602 0.6719 0.6843
GMM | Stauffer & Grimson [20] 12.33 10.14 0.7108 0.9860 0.0140 0.2892 3.1037 0.6624 0.7012
GMM | Zivkovic [5] 14.50 11.43 0.6964 0.9845 0.0155 0.3036 3.1504 0.6596 0.7079
Mahalanobis distance [6] 16.17 14.14 0.7607 0.9599 0.0401 0.2393 4.6631 0.6259 0.6040
Euclidean distance [7] 17.67 14.86 0.7048 0.9692 0.0308 0.2952 4.3465 0.6111 0.6223
Local-Self similarity [8] 15.17 13.86 0.9354 0.8512 0.1488 0.0646 14.2954 0.5016 0.4139
KDE - Integrated Spatio-temporal Features [9] 9.33 9.00 0.6507 0.9932 0.0068 0.3493 2.8905 0.6418 0.7663
PSP-MRF [10] 5.50 5.86 0.8037 0.9830 0.0170 0.1963 2.3937 0.7372 0.7512
ViBe+ [11] 4.83 5.29 0.6907 0.9928 0.0072 0.3093 2.1824 0.7224 0.8318
KDE - Spatio-temporal change detection [12] 11.17 9.86 0.6576 0.9910 0.0090 0.3424 3.0022 0.6437 0.7341
GMM | RECTGAUSS-Tex [13] 14.83 14.00 0.5156 0.9862 0.0138 0.4844 3.6842 0.5221 0.7190
KNN [21] 8.50 8.00 0.6707 0.9907 0.0093 0.3293 2.7954 0.6785 0.7882
PBAS [16] 3.00 3.86 0.7840 0.9898 0.0102 0.2160 1.7693 0.7532 0.8160
Chebyshev prob. with Static Object detection [17] 7.00 6.29 0.7133 0.9888 0.0112 0.2867 2.3856 0.7001 0.7856
SC-SOBS [18] 6.67 6.71 0.8017 0.9831 0.0169 0.1983 2.4081 0.7283 0.7315
Bayesian Background [19] 11.83 14.00 0.6018 0.9826 0.0174 0.3982 3.3879 0.6272 0.7435

Results for the baseline category.

Method   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision
SOBS [1] 4.29 0.9193 0.9980 0.0020 0.0807 0.4332 0.9251 0.9313
GMM | KaewTraKulPong [2] 11.43 0.5863 0.9987 0.0013 0.4137 1.9381 0.7119 0.9532
ViBe [3] 7.43 0.8204 0.9980 0.0020 0.1796 0.8869 0.8700 0.9288
KDE - ElGammal [4] 7.00 0.8969 0.9977 0.0023 0.1031 0.5499 0.9092 0.9223
GMM | Stauffer & Grimson [20] 14.86 0.8180 0.9948 0.0052 0.1820 1.5325 0.8245 0.8461
GMM | Zivkovic [5] 12.57 0.8085 0.9972 0.0028 0.1915 1.3298 0.8382 0.8993
Mahalanobis distance [6] 9.57 0.8872 0.9963 0.0037 0.1128 0.7290 0.8954 0.9071
Euclidean distance [7] 10.86 0.8385 0.9955 0.0045 0.1615 1.0260 0.8720 0.9114
Local-Self similarity [8] 12.00 0.9732 0.9865 0.0135 0.0268 1.3352 0.8494 0.7564
KDE - Integrated Spatio-temporal Features [9] 16.57 0.7472 0.9954 0.0046 0.2528 1.8058 0.7392 0.7998
PSP-MRF [10] 5.00 0.9319 0.9978 0.0022 0.0681 0.4127 0.9289 0.9261
ViBe+ [11] 8.86 0.8283 0.9974 0.0026 0.1717 0.9631 0.8715 0.9262
KDE - Spatio-temporal change detection [12] 16.86 0.7551 0.9940 0.0060 0.2449 1.9154 0.7554 0.7833
GMM | RECTGAUSS-Tex [13] 13.00 0.6669 0.9979 0.0021 0.3331 1.5342 0.7500 0.9175
KNN [21] 10.57 0.7934 0.9979 0.0021 0.2066 1.2840 0.8411 0.9245
PBAS [16] 7.57 0.9594 0.9970 0.0030 0.0406 0.4858 0.9242 0.8941
Chebyshev prob. with Static Object detection [17] 10.29 0.8266 0.9970 0.0030 0.1734 0.8304 0.8646 0.9143
SC-SOBS [18] 2.43 0.9327 0.9980 0.0020 0.0673 0.3747 0.9333 0.9341
Bayesian Background [19] 8.86 0.7327 0.9984 0.0016 0.2673 0.9037 0.8271 0.9620

Results for the dynamic background category.

Method   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision
SOBS [1] 11.14 0.8798 0.9843 0.0157 0.1202 1.6367 0.6439 0.5856
GMM | KaewTraKulPong [2] 8.00 0.6303 0.9983 0.0017 0.3697 0.5405 0.6697 0.7700
ViBe [3] 14.00 0.7222 0.9896 0.0104 0.2778 1.2796 0.5652 0.5346
KDE - ElGammal [4] 14.00 0.8012 0.9856 0.0144 0.1988 1.6393 0.5961 0.5732
GMM | Stauffer & Grimson [20] 10.14 0.8344 0.9896 0.0104 0.1656 1.2083 0.6330 0.5989
GMM | Zivkovic [5] 10.86 0.8019 0.9903 0.0097 0.1981 1.1725 0.6328 0.6213
Mahalanobis distance [6] 16.00 0.8132 0.9698 0.0302 0.1868 3.1407 0.5261 0.4517
Euclidean distance [7] 17.00 0.7757 0.9714 0.0286 0.2243 3.0095 0.5081 0.4487
Local-Self similarity [8] 15.29 0.8983 0.7694 0.2306 0.1017 22.7868 0.0949 0.0518
KDE - Integrated Spatio-temporal Features [9] 9.57 0.8401 0.9908 0.0092 0.1599 1.1501 0.6016 0.5413
PSP-MRF [10] 7.86 0.8955 0.9859 0.0141 0.1045 1.4514 0.6960 0.6576
ViBe+ [11] 6.86 0.7616 0.9980 0.0020 0.2384 0.3838 0.7197 0.7291
KDE - Spatio-temporal change detection [12] 7.29 0.8935 0.9908 0.0092 0.1065 1.0142 0.6574 0.5888
GMM | RECTGAUSS-Tex [13] 17.00 0.4776 0.9838 0.0162 0.5224 1.9735 0.4296 0.6478
Chebyshev probability approach [14] 4.14 0.8182 0.9982 0.0018 0.1818 0.3436 0.7656 0.7633
Color Histogram Backprojection [15] 19.43 0.6307 0.8906 0.1094 0.3693 11.0493 0.2675 0.1980
PBAS [16] 6.71 0.6955 0.9989 0.0011 0.3045 0.5394 0.6829 0.8326
Chebyshev prob. with Static Object detection [17] 5.00 0.8182 0.9976 0.0024 0.1818 0.4086 0.7520 0.7339
SC-SOBS [18] 10.86 0.8918 0.9836 0.0164 0.1082 1.6899 0.6686 0.6283
KNN [21] 7.29 0.8047 0.9937 0.0063 0.1953 0.8059 0.6865 0.6931
Bayesian Background [19] 12.57 0.5962 0.9917 0.0083 0.4038 1.2427 0.5369 0.6898

Results for the camera jitter category.

Method   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision
SOBS [1] 6.71 0.8007 0.9787 0.0213 0.1993 2.7479 0.7086 0.6399
GMM | KaewTraKulPong [2] 10.00 0.5074 0.9888 0.0112 0.4926 3.0233 0.5761 0.6897
ViBe [3] 13.43 0.7112 0.9694 0.0306 0.2888 4.0150 0.5995 0.5289
KDE - ElGammal [4] 13.14 0.7375 0.9562 0.0438 0.2625 5.1349 0.5720 0.4862
GMM | Stauffer & Grimson [20] 12.86 0.7334 0.9666 0.0334 0.2666 4.2269 0.5969 0.5126
GMM | Zivkovic [5] 15.57 0.6900 0.9665 0.0335 0.3100 4.4057 0.5670 0.4872
Mahalanobis distance [6] 15.71 0.7356 0.9431 0.0569 0.2644 6.4390 0.4960 0.3813
Euclidean distance [7] 17.29 0.7115 0.9456 0.0544 0.2885 6.2957 0.4874 0.3753
Local-Self similarity [8] 14.57 0.9764 0.6158 0.3842 0.0236 36.9570 0.2074 0.1202
KDE - Integrated Spatio-temporal Features [9] 6.43 0.7316 0.9857 0.0143 0.2684 2.4238 0.7110 0.6993
PSP-MRF [10] 3.43 0.8211 0.9825 0.0175 0.1789 2.2781 0.7502 0.7009
ViBe+ [11] 4.43 0.7293 0.9908 0.0092 0.2707 1.8473 0.7538 0.8064
KDE - Spatio-temporal change detection [12] 6.29 0.7562 0.9816 0.0184 0.2438 2.7450 0.7122 0.6793
GMM | RECTGAUSS-Tex [13] 13.43 0.7649 0.9497 0.0503 0.2351 5.6663 0.5370 0.4179
KNN [21] 8.71 0.7351 0.9778 0.0222 0.2649 3.1104 0.6894 0.7018
Color Histogram Backprojection [15] 13.86 0.4688 0.9821 0.0179 0.5312 3.7175 0.4822 0.5296
PBAS [16] 5.00 0.7373 0.9838 0.0162 0.2627 2.4882 0.7220 0.7586
Chebyshev prob. with Static Object detection [17] 11.86 0.7223 0.9725 0.0275 0.2777 3.6203 0.6416 0.5960
SC-SOBS [18] 7.43 0.8113 0.9768 0.0232 0.1887 2.8794 0.7051 0.6286
Bayesian Background [19] 9.86 0.5441 0.9886 0.0114 0.4559 2.8807 0.5988 0.6678

Results for the intermittent object motion category.

Method   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision
SOBS [1] 9.43 0.7057 0.9507 0.0493 0.2943 6.1324 0.5628 0.5531
GMM | KaewTraKulPong [2] 11.14 0.3476 0.9892 0.0108 0.6524 5.9854 0.3903 0.6953
ViBe [3] 11.29 0.5122 0.9527 0.0473 0.4878 7.7432 0.5074 0.6515
KDE - ElGammal [4] 14.71 0.5035 0.9309 0.0691 0.4965 10.0695 0.4088 0.4609
GMM | Stauffer & Grimson [20] 7.29 0.5142 0.9835 0.0165 0.4858 5.1955 0.5207 0.6688
GMM | Zivkovic [5] 8.86 0.5467 0.9712 0.0288 0.4533 5.4986 0.5325 0.6458
Mahalanobis distance [6] 13.00 0.7165 0.8886 0.1114 0.2835 11.5341 0.4968 0.4535
Euclidean distance [7] 12.57 0.5919 0.9336 0.0664 0.4081 8.9975 0.4892 0.4995
Local-Self similarity [8] 12.00 0.9027 0.8222 0.1778 0.0973 15.8827 0.5329 0.4445
KDE - Integrated Spatio-temporal Features [9] 6.00 0.4512 0.9964 0.0036 0.5488 4.4191 0.5454 0.8166
PSP-MRF [10] 8.71 0.7010 0.9530 0.0470 0.2990 6.0594 0.5645 0.5727
ViBe+ [11] 8.57 0.4729 0.9820 0.0180 0.5271 5.4282 0.5093 0.7513
KDE - Spatio-temporal change detection [12] 8.00 0.4372 0.9923 0.0077 0.5628 4.6997 0.5039 0.7212
GMM | RECTGAUSS-Tex [13] 11.14 0.2190 0.9977 0.0023 0.7810 5.2547 0.3146 0.5850
KNN [21] 8.43 0.4617 0.9865 0.0135 0.5383 5.1370 0.5026 0.7121
PBAS [16] 5.57 0.6700 0.9751 0.0249 0.3300 4.2871 0.5745 0.7045
Chebyshev prob. with Static Object detection [17] 11.86 0.3570 0.9807 0.0193 0.6430 6.4700 0.3863 0.7688
SC-SOBS [18] 6.29 0.7237 0.9613 0.0387 0.2763 5.2207 0.5918 0.5896
Bayesian Background [19] 15.14 0.4813 0.9304 0.0696 0.5187 9.9632 0.4081 0.4747

Results for the shadow category.

Method   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision   Average FPR-S
SOBS [1] 12.14 0.8350 0.9836 0.0164 0.1650 2.3366 0.7716 0.7219 0.5689
GMM | KaewTraKulPong [2] 10.14 0.6323 0.9936 0.0064 0.3677 2.3015 0.7176 0.8577 0.4069
ViBe [3] 6.29 0.7833 0.9919 0.0081 0.2167 1.6547 0.8032 0.8342 0.5460
KDE - ElGammal [4] 8.43 0.8536 0.9885 0.0115 0.1464 1.6881 0.8028 0.7660 0.6217
GMM | Stauffer & Grimson [20] 12.57 0.7960 0.9871 0.0129 0.2040 2.1951 0.7370 0.7156 0.5352
GMM | Zivkovic [5] 13.14 0.7770 0.9878 0.0122 0.2230 2.1957 0.7319 0.7232 0.5428
Mahalanobis distance [6] 17.00 0.7845 0.9708 0.0292 0.2155 3.7896 0.6348 0.5685 0.5899
Euclidean distance [7] 15.71 0.8001 0.9783 0.0217 0.1999 2.8987 0.6785 0.6112 0.5763
Local-Self similarity [8] 14.57 0.9584 0.9442 0.0558 0.0416 5.5496 0.5951 0.4673 0.6377
KDE - Integrated Spatio-temporal Features [9] 8.29 0.7197 0.9930 0.0070 0.2803 2.1292 0.7545 0.8244 0.3901
PSP-MRF [10] 10.14 0.8736 0.9829 0.0171 0.1264 2.2414 0.7907 0.7281 0.5861
ViBe+ [11] 5.86 0.8108 0.9910 0.0090 0.1892 1.6516 0.8153 0.8302 0.5315
KDE - Spatio-temporal change detection [12] 13.86 0.6970 0.9898 0.0102 0.3030 2.4865 0.7136 0.7559 0.3977
GMM | RECTGAUSS-Tex [13] 13.14 0.7189 0.9886 0.0114 0.2811 2.4111 0.7331 0.7840 0.4764
Chebyshev probability approach [14] 5.43 0.8669 0.9887 0.0113 0.1331 1.5552 0.8333 0.8104 0.4204
KNN [21] 9.43 0.7478 0.9916 0.0084 0.2522 2.0569 0.7468 0.7788 0.3979
PBAS [16] 3.57 0.9133 0.9904 0.0096 0.0867 1.2753 0.8597 0.8143 0.5789
Chebyshev prob. with Static Object detection [17] 5.86 0.8670 0.9887 0.0113 0.1330 1.5561 0.8333 0.8103 0.4204
SC-SOBS [18] 11.57 0.8502 0.9834 0.0166 0.1498 2.3000 0.7786 0.7230 0.6035
Bayesian Background [19] 12.86 0.6537 0.9916 0.0084 0.3463 2.4695 0.6955 0.7791 0.3293

Results for the thermal category.

Method   Average ranking   Average Re   Average Sp   Average FPR   Average FNR   Average PWC   Average F-Measure   Average Precision
SOBS [1] 11.00 0.5888 0.9956 0.0044 0.4112 2.0983 0.6834 0.8754
GMM | KaewTraKulPong [2] 11.29 0.3395 0.9993 0.0007 0.6605 4.8419 0.4767 0.9709
ViBe [3] 10.29 0.5435 0.9962 0.0038 0.4565 3.1271 0.6647 0.9363
KDE - ElGammal [4] 7.14 0.6725 0.9955 0.0045 0.3275 1.6795 0.7423 0.8974
GMM | Stauffer & Grimson [20] 13.86 0.5691 0.9946 0.0054 0.4309 4.2642 0.6621 0.8652
GMM | Zivkovic [5] 14.57 0.5542 0.9942 0.0058 0.4458 4.3002 0.6548 0.8706
Mahalanobis distance [6] 11.86 0.6270 0.9906 0.0094 0.3730 2.3462 0.7065 0.8617
Euclidean distance [7] 15.29 0.5111 0.9907 0.0093 0.4889 3.8516 0.6313 0.8877
Local-Self similarity [8] 11.00 0.9036 0.9692 0.0308 0.0964 3.2612 0.7297 0.6433
KDE - Integrated Spatio-temporal Features [9] 12.14 0.4147 0.9981 0.0019 0.5853 5.4152 0.4989 0.9164
PSP-MRF [10] 7.43 0.5991 0.9962 0.0038 0.4009 1.9189 0.6932 0.9218
ViBe+ [11] 8.57 0.5411 0.9974 0.0026 0.4589 2.8201 0.6646 0.9477
KDE - Spatio-temporal change detection [12] 13.71 0.4065 0.9973 0.0027 0.5935 5.1527 0.5199 0.8761
GMM | RECTGAUSS-Tex [13] 11.86 0.2461 0.9994 0.0006 0.7539 5.2656 0.3682 0.9619
Chebyshev probability approach [14] 5.57 0.6940 0.9962 0.0038 0.3060 1.3285 0.7259 0.8910
KNN [21] 11.71 0.4817 0.9970 0.0030 0.5183 4.3783 0.6046 0.9186
PBAS [16] 7.29 0.7283 0.9934 0.0066 0.2717 1.5398 0.7556 0.8922
Chebyshev prob. with Static Object detection [17] 5.71 0.6887 0.9963 0.0037 0.3113 1.4283 0.7230 0.8906
SC-SOBS [18] 9.57 0.6003 0.9957 0.0043 0.3997 1.9841 0.6923 0.8857
Bayesian Background [19] 10.14 0.6026 0.9952 0.0048 0.3974 2.8676 0.6969 0.8877

[Figure: ROC curves for our methods. Left: linear scale; right: logarithmic scale.]

Results for methods [1, 3, 5, 21] were obtained by the organizing committee using the authors' original code. Results for methods [2, 4, 6, 7] were obtained by the organizing committee using its own implementations or OpenCV. A 5x5 median filter was applied as a post-processing step.
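
For reference, the 5x5 median filtering mentioned above corresponds to a single OpenCV call; the snippet below is a minimal sketch of that post-processing step, with an illustrative file name rather than the committee's exact script.

```python
import cv2

# Load a binary foreground mask produced by one of the methods (file name is illustrative).
mask = cv2.imread("results/in000001.png", cv2.IMREAD_GRAYSCALE)

# 5x5 median filter: removes isolated false-positive pixels while largely preserving blob shapes.
filtered = cv2.medianBlur(mask, 5)

cv2.imwrite("results/in000001_filtered.png", filtered)
```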

Metrics:

  • Average ranking across categories : (rank:Baseline + rank:Dynamic Background + rank:Camera Jitter + rank:Intermittent Object Motion + rank:Shadow + rank:Thermal) / 6
  • Average ranking : (rank:Recall + rank:Specificity + rank:FPR + rank:FNR + rank:PWC + rank:F-Measure + rank:Precision) / 7
  • TP : True Positive
  • FP : False Positive
  • FN : False Negative
  • TN : True Negative
  • Re (Recall) : TP / (TP + FN)
  • Sp (Specificity) : TN / (TN + FP)
  • FPR (False Positive Rate) : FP / (FP + TN)
  • FNR (False Negative Rate) : FN / (TP + FN)
  • PWC (Percentage of Wrong Classifications) : 100 * (FN + FP) / (TP + FN + FP + TN)
  • F-Measure : (2 * Precision * Recall) / (Precision + Recall)
  • Precision : TP / (TP + FP)
  • FPR-S : Average False positive rate in hard shadow areas
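
All of the per-pixel metrics above follow directly from the four confusion counts, and the two ranking scores are plain averages of ranks. The snippet below is a minimal Python sketch of these formulas; the function and variable names are ours, not those of the official UTILITIES scripts.

```python
def change_detection_metrics(TP, FP, FN, TN):
    """Compute the CDW-2012 per-pixel metrics from accumulated confusion counts."""
    recall      = TP / (TP + FN)                           # Re
    specificity = TN / (TN + FP)                           # Sp
    fpr         = FP / (FP + TN)                           # FPR
    fnr         = FN / (TP + FN)                           # FNR
    pwc         = 100.0 * (FN + FP) / (TP + FN + FP + TN)  # PWC
    precision   = TP / (TP + FP)
    f_measure   = 2 * precision * recall / (precision + recall)
    return {"Re": recall, "Sp": specificity, "FPR": fpr, "FNR": fnr,
            "PWC": pwc, "F-Measure": f_measure, "Precision": precision}

def average_ranking(ranks):
    """Average of a method's ranks, over the 7 metrics or over the 6 categories."""
    return sum(ranks) / len(ranks)
```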


References:

  • 1. L. Maddalena and A. Petrosino, "A Self-Organizing Approach to Background Subtraction for Visual Surveillance Applications", IEEE Transactions on Image Processing, vol. 17, no. 7, pp. 1168-1177, 2008.
  • 2. P. KaewTraKulPong and R. Bowden, "An Improved Adaptive Background Mixture Model for Real-time Tracking with Shadow Detection", in Proc. Workshop on Advanced Video Based Surveillance Systems, 2001.
  • 3. O. Barnich and M. Van Droogenbroeck, "ViBe: A universal background subtraction algorithm for video sequences", IEEE Transactions on Image Processing, vol. 20, no. 6, pp. 1709-1724, June 2011.
  • 4. A. Elgammal, D. Harwood, and L. Davis, "Non-parametric model for background subtraction", in Proc. European Conference on Computer Vision, Lecture Notes in Computer Science, vol. 1843, pp. 751-767, 2000.
  • 5. Z. Zivkovic, "Improved adaptive Gaussian mixture model for background subtraction", in Proc. International Conference on Pattern Recognition, pp. 28-31, 2004.
  • 6. Y. Benezeth, P.-M. Jodoin, B. Emile, H. Laurent, and C. Rosenberger, "Comparative study of background subtraction algorithms", Journal of Electronic Imaging, vol. 19, no. 3, pp. 1-12, 2010.
  • 7. Y. Benezeth, P.-M. Jodoin, B. Emile, H. Laurent, and C. Rosenberger, "Comparative study of background subtraction algorithms", Journal of Electronic Imaging, vol. 19, no. 3, pp. 1-12, 2010.
  • 8. J.-P. Jodoin, G.-A. Bilodeau, and N. Saunier, "Background subtraction based on Local Shape", arXiv:1204.6326v1.
  • 9. Y. Nonaka, A. Shimada, H. Nagahara, and R. Taniguchi, "Evaluation Report of Integrated Background Modeling Based on Spatio-temporal Features", in Proc. IEEE Workshop on Change Detection, 2012.
  • 10. A. Schick, M. Bäuml, and R. Stiefelhagen, "Improving Foreground Segmentations with Probabilistic Superpixel Markov Random Fields", in Proc. IEEE Workshop on Change Detection, 2012.
  • 11. M. Van Droogenbroeck and O. Paquot, "Background Subtraction: Experiments and Improvements for ViBe", in Proc. IEEE Workshop on Change Detection, 2012.
  • 12. To be published.
  • 13. D. Riahi, P.-L. St-Onge, and G.-A. Bilodeau, "RECTGAUSS-Tex: Block-based Background Subtraction", Technical Report EPM-RT-2012-03, École Polytechnique de Montréal, 2012.
  • 14. A. Morde, X. Ma, and S. Guler [IntuVision], "Learning a background model for change detection", in Proc. IEEE Workshop on Change Detection, 2012.
  • 15. D. Kit, B. T. Sullivan, and D. H. Ballard, "Novelty detection using growing neural gas for visuo-spatial memory", in Proc. IROS, pp. 1194-1200, 2011.
  • 16. M. Hofmann, P. Tiefenbacher, and G. Rigoll, "Background Segmentation with Feedback: The Pixel-Based Adaptive Segmenter", in Proc. IEEE Workshop on Change Detection, 2012.
  • 17. A. Morde, X. Ma, and S. Guler [IntuVision], "Learning a background model for change detection", in Proc. IEEE Workshop on Change Detection, 2012.
  • 18. L. Maddalena and A. Petrosino, "The SOBS algorithm: what are the limits?", in Proc. IEEE Workshop on Change Detection, 2012.
  • 19. F. Porikli and O. Tuzel, "Bayesian background modeling for foreground detection", in Proc. ACM Visual Surveillance and Sensor Network, 2005.
  • 20. C. Stauffer and W. E. L. Grimson, "Adaptive background mixture models for real-time tracking", in Proc. International Conference on Computer Vision and Pattern Recognition, vol. 2, 1999.
  • 21. Z. Zivkovic and F. van der Heijden, "Efficient adaptive density estimation per image pixel for the task of background subtraction", Pattern Recognition Letters, vol. 27, no. 7, pp. 773-780, 2006.