1st International ICST Workshop on Computational Forensics

Research Article

Shoeprint Image Retrieval Based on Local Image Features

@INPROCEEDINGS{10.1109/IAS.2007.18,
    author={Su, H. and Crookes, D. and Bouridane, A. and Gueham, M.},
    title={Shoeprint Image Retrieval Based on Local Image Features},
    booktitle={1st International ICST Workshop on Computational Forensics},
    year={2007},
    doi={10.1109/IAS.2007.18},
    keywords={Computer vision, Detectors, Footwear, Forensics, Image databases, Image retrieval, Information retrieval, Layout, Soil, Spatial databases},
}
H. Su, D. Crookes, A. Bouridane, M. Gueham (2007). Shoeprint Image Retrieval Based on Local Image Features. DOI: 10.1109/IAS.2007.18
H. Su1,*, D. Crookes1,*, A. Bouridane1, M. Gueham1,*
1: School of EEE and CS, Queen’s University Belfast
*Contact email: h.su@qub.ac.uk, d.crookes@qub.ac.uk, mgueham02@qub.ac.uk


This paper deals with the retrieval of scene-of-crime (or scene) shoeprint images from a reference database of shoeprint images, using a new local feature detector and an improved local feature descriptor. Our approach is based on novel modifications and improvements to several recent techniques in this area: (1) the scale-adapted Harris detector, an extension of the Harris corner detector to multi-scale domains; (2) automatic scale selection by the characteristic scale of a local structure; and (3) SIFT (Scale-Invariant Feature Transform), one of the most widely investigated descriptors in recent years. Like most other local feature representations, the proposed approach can be divided into two stages: (i) a set of distinctive local features is selected by first detecting scale-adapted Harris corners, each associated with a scale factor, and then retaining only those corners whose scale matches the scale of the blob-like structures around them, where the scale of a blob-like structure is detected by Laplace-based scale selection; (ii) for each detected feature, an enhanced SIFT descriptor is computed to represent it. Our improvements lead to two novel methods, which we call the Modified Harris-Laplace (MHL) detector and the enhanced SIFT descriptor. In this paper, we demonstrate the application of the proposed scheme to the shoeprint image retrieval problem using six sets of synthetic scene images (50 images per set) and a database of 500 reference shoeprint images. The retrieval performance of the proposed approach, measured by cumulative matching score, is significantly better than that of the existing methods investigated in this application area, such as the edge directional histogram, power spectral distribution, and pattern and topological spectra.
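The detection stage described in the abstract — multi-scale (scale-adapted) Harris corners filtered by a Laplacian characteristic-scale check — can be sketched as follows. This is an illustrative Harris-Laplace-style implementation under assumed parameter choices (the scale set, the Harris `k`, the relative threshold, and the 0.7 differentiation/integration scale ratio are our own), using NumPy and SciPy; it is not the authors' exact MHL detector or enhanced SIFT descriptor.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def multiscale_harris(img, sigmas, k=0.04, rel_thresh=0.01):
    """Scale-adapted Harris: corner candidates at each scale.

    Returns a list of (y, x, scale_index) candidates.
    """
    img = img.astype(np.float64)
    candidates = []
    for i, s in enumerate(sigmas):
        sd, si = 0.7 * s, s  # differentiation and integration scales (assumed ratio)
        # first derivatives at the differentiation scale
        Lx = gaussian_filter(img, sd, order=(0, 1))
        Ly = gaussian_filter(img, sd, order=(1, 0))
        # second-moment matrix entries, smoothed at the integration scale
        A = gaussian_filter(Lx * Lx, si)
        B = gaussian_filter(Ly * Ly, si)
        C = gaussian_filter(Lx * Ly, si)
        # Harris cornerness; a per-scale relative threshold avoids the need
        # for explicit scale normalisation of the response
        R = (A * B - C * C) - k * (A + B) ** 2
        peaks = (R == maximum_filter(R, size=3)) & (R > rel_thresh * R.max())
        for y, x in zip(*np.nonzero(peaks)):
            candidates.append((int(y), int(x), i))
    return candidates

def harris_laplace(img, sigmas=(1.0, 1.4, 2.0, 2.8, 4.0, 5.6), k=0.04):
    """Keep only candidates whose scale-normalised Laplacian magnitude is a
    local maximum across neighbouring scales (the characteristic scale).

    Returns a list of (y, x, sigma) keypoints.
    """
    img = img.astype(np.float64)
    # scale-normalised |LoG| at every scale: s^2 * |Lxx + Lyy|
    logs = [s ** 2 * np.abs(gaussian_filter(img, s, order=(2, 0))
                            + gaussian_filter(img, s, order=(0, 2)))
            for s in sigmas]
    keypoints = []
    for y, x, i in multiscale_harris(img, sigmas, k):
        if 0 < i < len(sigmas) - 1:  # need both neighbouring scales
            v = logs[i][y, x]
            if v > logs[i - 1][y, x] and v > logs[i + 1][y, x]:
                keypoints.append((y, x, sigmas[i]))
    return keypoints
```

A descriptor (here, SIFT or the paper's enhanced variant) would then be computed on a patch around each keypoint, sized by its selected scale, before matching query features against the reference database.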