Demos and Resources

For a brief guide on algorithm applicability (plus some source code), click here.

Feature Selection Algorithms - Demos

P. Somol, P. Pudil

(The demonstrations require the Flash plug-in, version 4 or later.)

Floating Search principle explanation

This presentation explains the principle of the classical Floating subset search algorithm in its backward form. Suppose the task is to select a subset out of a total of 6 items (features) so as to maximize an adopted criterion function. The Floating search algorithm finds solutions for all subset sizes in a single run. For a full explanation see [5,7].
Track the algorithm by clicking the green right-arrow:
[Flash animation]
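The backward floating principle is compact enough to sketch in code. Below is a minimal Python sketch, assuming only an abstract criterion function to be maximized; the toy weights and the redundancy penalty in the usage lines are hypothetical stand-ins for a real separability measure, and this is not the code behind the demo (see [5,7] for the precise algorithm).

    def sbfs(features, criterion):
        # Sequential Backward Floating Selection (minimal sketch).
        # Start from the full set, repeatedly exclude the least useful
        # feature, and after each exclusion "float": re-add excluded
        # features while that improves on the best known subset of the
        # corresponding size. Returns the best subset found per size.
        current = set(features)
        best = {len(current): (criterion(current), frozenset(current))}
        while len(current) > 1:
            # Conditional exclusion: the removal that hurts the criterion least.
            worst = max(current, key=lambda f: criterion(current - {f}))
            current -= {worst}
            value, k = criterion(current), len(current)
            if k not in best or value > best[k][0]:
                best[k] = (value, frozenset(current))
            # Conditional inclusion (the "floating" step).
            while len(current) < len(features):
                extra = max(set(features) - current,
                            key=lambda f: criterion(current | {f}))
                value, k = criterion(current | {extra}), len(current) + 1
                if value > best[k][0]:
                    current |= {extra}
                    best[k] = (value, frozenset(current))
                else:
                    break
        return best

    # Toy run: 6 features with hypothetical weights and one redundancy.
    weights = {0: 0.9, 1: 0.1, 2: 0.8, 3: 0.3, 4: 0.7, 5: 0.2}
    crit = lambda s: sum(weights[f] for f in s) - (0.5 if {0, 2} <= s else 0.0)
    for k, (v, s) in sorted(sbfs(range(6), crit).items()):
        print(k, round(v, 2), sorted(s))

Note that one run fills in the whole table of subset sizes, which is exactly the behaviour the animation steps through.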



(Fast) Branch & Bound basic principle explanation

This presentation explains the principle of the classical Branch & Bound subset search algorithm and of the novel Fast Branch & Bound. Suppose the task is to select 2 items out of a total of 6 items (features) so as to maximize an adopted criterion function. In the demonstrations, red denotes a criterion value computation, while yellow denotes a much faster criterion value prediction. For an explanation see [1].
Track the algorithm by clicking the green right-arrow:
[Flash animation]
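To make the branching and the bound test concrete, here is a minimal Python sketch of the classical (non-predictive) algorithm, under the usual assumption of a monotonic criterion (removing a feature never increases its value). The toy weights are hypothetical; the Fast variant of [1] keeps this structure but replaces many of the internal "red" computations with cheap "yellow" predictions.

    def branch_and_bound(features, d, criterion):
        # Classical Branch & Bound: optimal d-subset under a
        # monotonic criterion. Each tree level removes one feature.
        n = len(features)
        best = {"value": float("-inf"), "subset": None}

        def search(current, start):
            if len(current) == d:            # leaf: candidate subset
                value = criterion(current)   # true evaluation ("red")
                if value > best["value"]:
                    best["value"], best["subset"] = value, frozenset(current)
                return
            # Bound test: by monotonicity no subset below this node
            # can exceed the node's own criterion value.
            if criterion(current) <= best["value"]:   # "red" again
                return                                # prune the subtree
            for i in range(start, n):        # branch: remove one more feature
                search(current - {features[i]}, i + 1)

        search(frozenset(features), 0)
        return best["subset"], best["value"]

    weights = {0: 0.9, 1: 0.1, 2: 0.8, 3: 0.3, 4: 0.7, 5: 0.2}
    crit = lambda s: sum(weights[f] for f in s)       # monotonic toy criterion
    print(branch_and_bound(list(range(6)), 2, crit))  # selects {0, 2}

The index ordering in the branching loop guarantees that every 2-subset is reached exactly once, so the bound test is the only source of savings.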



Classical (non-predictive) and Fast (predictive) Branch & Bound comparison

This presentation demonstrates the differences between the classical and the Fast Branch & Bound (FBB) subset search algorithms by running them in parallel. A sample problem of selecting 2 out of 8 features also serves to illustrate the two possible types of error of the FBB prediction mechanism. Red depicts a true criterion evaluation, yellow a fast prediction. For more details see [1].
Start the demonstration by clicking the green right-arrow:
[Flash animation]
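The prediction mechanism itself can be modelled in a few lines. The class below is a simplified averaged-contribution stand-in for the scheme of [1], not its actual implementation, but it makes the two error types visible: a prediction that comes out too low triggers a true evaluation that proves unnecessary (a wasted "red" computation), while one that comes out too high lets a prunable subtree grow a little longer before a true evaluation cuts it. Neither error can cost optimality, because subtrees are only ever cut after a true evaluation confirms the bound.

    class ContributionPredictor:
        # Stand-in for the FBB prediction mechanism: learn the average
        # criterion drop caused by removing each feature, then predict
        # child-node values instead of computing them.

        def __init__(self, features):
            self.drop_sum = {f: 0.0 for f in features}
            self.drop_cnt = {f: 0 for f in features}

        def record(self, f, parent_value, child_value):
            # Called after every true (expensive) evaluation pair.
            self.drop_sum[f] += parent_value - child_value
            self.drop_cnt[f] += 1

        def predict(self, f, parent_value):
            # Predicted value after removing f; None until something
            # has been learned about f (a true evaluation is then due).
            if self.drop_cnt[f] == 0:
                return None
            return parent_value - self.drop_sum[f] / self.drop_cnt[f]

Inside the search, a node whose predicted value stays above the current bound is expanded without any true evaluation; only when the prediction falls to the bound or below is the true value computed to decide the cut.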



Floating vs. Oscillating Search

This animated picture illustrates the course of the Floating Search [7] and Oscillating Search [4] algorithms. The Sequential Forward Floating Search (SFFS) starts with an empty set and successively increases the number of selected features; each time it adds a feature, it tries to backtrack to find a better, previously unknown solution. The Oscillating Search successively modifies the current subset of d features for as long as the subset keeps improving. For details see [7,4].
[Animated illustration]
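For comparison, here is a minimal Python sketch of the oscillating idea; the greedy initialisation and the swing scheduling are simplified relative to [4], and the toy criterion is hypothetical.

    def oscillating_search(features, d, criterion, max_depth=3):
        # Oscillating Search (sketch): keep the subset size at d while
        # perturbing the subset by down- and up-swings of depth o.
        features = list(features)
        max_depth = min(max_depth, d, len(features) - d)

        def add(s):   # best single-feature addition
            return s | {max(set(features) - s, key=lambda f: criterion(s | {f}))}

        def drop(s):  # best single-feature removal
            return s - {max(s, key=lambda f: criterion(s - {f}))}

        current = frozenset()          # hypothetical initialisation:
        for _ in range(d):             # grow to size d greedily
            current = add(current)
        value = criterion(current)

        o = 1
        while o <= max_depth:
            improved = False
            # down-swing (drop o, add o back), then up-swing (reverse)
            for first, second in ((drop, add), (add, drop)):
                trial = current
                for _ in range(o):
                    trial = first(trial)
                for _ in range(o):
                    trial = second(trial)
                if criterion(trial) > value:
                    current, value = trial, criterion(trial)
                    improved = True
            o = 1 if improved else o + 1   # reset depth on improvement
        return current, value

    weights = {0: 0.9, 1: 0.1, 2: 0.8, 3: 0.3, 4: 0.7, 5: 0.2}
    crit = lambda s: sum(weights[f] for f in s) - (0.5 if {0, 2} <= s else 0.0)
    print(oscillating_search(range(6), 2, crit))

Unlike SFFS, the subset size never leaves d for good, which is what the two curves in the animation contrast.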



    References (ours, selected):
  1. Somol P., Pudil P., Kittler J.: Fast Branch & Bound Algorithms for Optimal Feature Selection. IEEE Transactions on Pattern Analysis and Machine Intelligence 26(7):900–912, 2004.
  2. Somol P., Pudil P.: Feature Selection Toolbox. Pattern Recognition 35(12):2749–2759, 2002.
  3. Somol P., Pudil P., Grim J.: Branch & Bound Algorithm with Partial Prediction for Use with Recursive and Non-Recursive Criterion Forms. Lecture Notes in Computer Science 2013, Springer, pp. 230–239, 2001.
  4. Somol P., Pudil P.: Oscillating Search Algorithms for Feature Selection. In: Proceedings of the 15th International Conference on Pattern Recognition (Sanfeliu A., Villanueva J. J., Vanrell M., eds.), IEEE Computer Society, Los Alamitos, pp. 406–409, 2000.
  5. Somol P., Pudil P., Novovicova J., Paclik P.: Adaptive Floating Search Methods in Feature Selection. Pattern Recognition Letters 20(11/13):1157–1163, 1999.
  6. Novovicova J., Pudil P., Kittler J.: Divergence Based Feature Selection for Multimodal Class Densities. IEEE Transactions on Pattern Analysis and Machine Intelligence 18(2):218–223, 1996.
  7. Pudil P., Novovicova J., Kittler J.: Floating Search Methods in Feature Selection. Pattern Recognition Letters 15(11):1119–1125, 1994.

    References (general):
  1. S. Theodoridis and K. Koutroumbas. Pattern Recognition, 2nd edition. Academic Press, Inc., 2003.
  2. A. Webb. Statistical Pattern Recognition, 2nd edition. John Wiley & Sons, 2002.
  3. R. O. Duda, P. E. Hart and D. G. Stork. Pattern Classification, 2nd edition. Wiley-Interscience, 2000.
  4. K. Fukunaga. Introduction to Statistical Pattern Recognition, 2nd edition. Academic Press, Inc., 1990.
  5. P. A. Devijver and J. Kittler. Pattern Recognition: A Statistical Approach. Prentice-Hall, 1982.

