Deep shape models
Required: a two-class pixel labeling (object and background).
The state-of-the-art technique for this problem is Markov Random Fields (MRF). An MRF considers only local dependencies between pixels or superpixels, so it cannot account for global constraints that involve most of the image's pixels, e.g. the shape of the object.
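To make the locality of the MRF concrete, here is a minimal sketch of the standard energy for binary segmentation: per-pixel unary data costs plus a Potts smoothness term over 4-connected neighbours. The function name, the `unary` layout, and the `beta` weight are illustrative, not part of the model described above.

```python
import numpy as np

def mrf_energy(labels, unary, beta=1.0):
    """Pairwise MRF energy: unary data costs + Potts pairwise costs.

    labels: (H, W) int array of 0/1 labels (0 = background, 1 = object)
    unary:  (H, W, 2) array, unary[y, x, k] = cost of label k at (y, x)
    beta:   weight of the Potts smoothness term
    """
    h, w = labels.shape
    # Unary term: pick the cost of the chosen label at every pixel.
    e = unary[np.arange(h)[:, None], np.arange(w)[None, :], labels].sum()
    # Potts term: penalize label disagreement between 4-neighbours.
    e += beta * (labels[:, 1:] != labels[:, :-1]).sum()  # horizontal pairs
    e += beta * (labels[1:, :] != labels[:-1, :]).sum()  # vertical pairs
    return float(e)
```

Every term couples at most two neighbouring pixels, which is exactly why such a model cannot express a global constraint like "the foreground must look like a horse".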
ShapeBM is learned on a dataset of labeled horses that are centered and share the same orientation (head to the left), because labeled datasets are small and hard to obtain. As a result, the model can only segment images in which the horse appears in that same position.
Next, to widen the range of images the model can handle, we employ a part-based detector. Applying the detector first and then ShapeBM+MRF allows us to segment images with diverse object positioning.
Another benefit of the part-based detector is its ability to localize the parts of an object. Using an affine transformation fitted to the parts' centers, the model gains the ability to segment horses in unconventional poses. A detailed description of the resulting model can be found in the attached slides.
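Fitting an affine map to corresponding part centers can be done in closed form by least squares. The sketch below is a generic implementation of that step under the assumption that detected part centers in the image are matched to the corresponding centers in the model frame; the function name and interface are hypothetical.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map (A, t) with dst ≈ src @ A.T + t.

    src, dst: (N, 2) arrays of corresponding part centers (N >= 3).
    Returns the 2x2 linear part A and the 2-vector translation t.
    """
    # Homogeneous coordinates turn the affine fit into one linear solve:
    # [src | 1] @ M = dst, where M stacks A.T on top of t.
    src_h = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    return M[:2].T, M[2]
```

Applying the recovered transform to the model's canonical shape aligns it with the detected pose before segmentation.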
To improve the quality of the model, we explore more sophisticated shape models. Recently, the Multinomial Shape Boltzmann Machine (MSBM) was introduced as a more powerful shape model. Unlike ShapeBM, MSBM takes the object's parts into account. However, the standard learning procedure for MSBM requires a training dataset with labeled object parts. We propose a new learning procedure that requires only a binary mask (object/background) and a seed for each part of the object. The procedure uses an EM algorithm with multi-label segmentations as latent variables. Seeds can be extracted in two ways: manually or with the part-based detector. An MSBM trained with this procedure can generate samples using only seeds as input.
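The structure of the EM loop can be illustrated with a toy stand-in: the E-step infers a multi-label part segmentation of the masked object given the current model, and the M-step re-estimates the model from that segmentation. In this sketch the shape model is replaced by simple part-center means, so the loop reduces to a seeded k-means over the mask's pixels; this shows only the alternating structure, not the actual MSBM updates.

```python
import numpy as np

def em_part_segmentation(mask, seeds, n_iters=10):
    """Toy EM-style loop: seeded multi-label segmentation of a binary mask.

    mask:  (H, W) bool array (the object/background labeling)
    seeds: dict part_id -> (y, x) seed pixel for that part
    Returns an (H, W) int label map (0 = background, part ids shifted to 1..K).
    """
    coords = np.argwhere(mask)                      # object pixels only
    means = {k: np.asarray(p, float) for k, p in seeds.items()}
    keys = sorted(means)
    for _ in range(n_iters):
        # E-step: assign each object pixel to the nearest part center
        # (stand-in for inferring the latent multi-label segmentation).
        d = np.stack([np.linalg.norm(coords - means[k], axis=1) for k in keys])
        assign = d.argmin(axis=0)
        # M-step: re-estimate each part center from its assigned pixels
        # (stand-in for updating the shape-model parameters).
        for i, k in enumerate(keys):
            pts = coords[assign == i]
            if len(pts):
                means[k] = pts.mean(axis=0)
    labels = np.zeros(mask.shape, int)
    labels[coords[:, 0], coords[:, 1]] = assign + 1
    return labels
```

In the real procedure, the E-step would also respect the seed constraints and the MSBM's shape prior rather than pure pixel distance.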
Generation results are shown in the figures ShapeBM+MRF_first_part and ShapeBM+MRF_second_part below.