Floating Search principle explanation
This presentation explains the principle of the classical Floating subset search algorithm in its backward form. Suppose our task is to select a subset from a total of 6 items (features) so as to maximize an adopted criterion function. The Floating Search algorithm finds solutions for all subset sizes in one run. For a full explanation see [5,7]. Track the algorithm by clicking the green right-arrow:
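The following Python sketch outlines one possible reading of the backward floating principle: the least significant feature is removed, and features are then conditionally re-added as long as this yields a subset better than the best one known for that size. The names `sbfs`, `criterion`, and `n_features` are illustrative assumptions, not taken from [5,7].

```python
# A minimal sketch of backward floating selection, assuming a user-supplied
# criterion J(subset) to be maximized. Not the exact procedure of [5,7].

def sbfs(n_features, criterion):
    """Return the best subset found for every subset size, as {size: (subset, value)}."""
    current = set(range(n_features))              # start from the full feature set
    best = {len(current): (frozenset(current), criterion(current))}

    while len(current) > 1:
        # Exclusion step: remove the least significant feature.
        f_out = max(current, key=lambda f: criterion(current - {f}))
        current = current - {f_out}
        value = criterion(current)
        k = len(current)
        if k not in best or value > best[k][1]:
            best[k] = (frozenset(current), value)

        # Conditional inclusion (backtracking): re-add features while doing so
        # yields a better subset than the best one previously known for that size.
        while len(current) < n_features:
            f_in = max(set(range(n_features)) - current,
                       key=lambda f: criterion(current | {f}))
            candidate = current | {f_in}
            cand_value = criterion(candidate)
            k = len(candidate)
            if k in best and cand_value <= best[k][1]:
                break                              # no improvement -> stop backtracking
            current = candidate
            best[k] = (frozenset(current), cand_value)

    return best
```

Because the best known value for every subset size is recorded on the way, a single run of this sketch returns candidate solutions for all subset sizes, matching the behaviour described above.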
(Fast) Branch & Bound basic principle explanation
This presentation explains the principle of the classical Branch & Bound subset search algorithm and of the novel Fast Branch & Bound. Suppose our task is to select 2 items out of a total of 6 items (features) so as to maximize an adopted criterion function. In the demonstrations, red denotes true criterion value computation, while yellow denotes the much faster criterion value prediction. For a full explanation see [1]. Track the algorithm by clicking the green right-arrow:
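As an illustration of the classical (non-predictive) Branch & Bound principle only, the sketch below enumerates candidate subsets by successively removing features and cuts a whole subtree whenever a monotonic criterion already falls to or below the best leaf value found so far. The names `criterion`, `n_features`, and `d` are illustrative assumptions; the prediction mechanism of the Fast Branch & Bound [1] is not modeled here.

```python
# A minimal sketch of the classical Branch & Bound feature selector, assuming a
# monotonic criterion J (removing a feature never increases J).

def branch_and_bound(n_features, d, criterion):
    """Find a subset of size d maximizing a monotonic criterion."""
    best = {"value": float("-inf"), "subset": None}

    def search(current, candidates):
        # `current` is the set of retained features; `candidates` lists the
        # features that may still be removed on this branch (canonical order,
        # so every subset of size d is generated at most once).
        for i, f in enumerate(candidates):
            reduced = current - {f}
            # Skip branches that can no longer shrink down to d features.
            if len(reduced) - d > len(candidates) - (i + 1):
                continue
            value = criterion(reduced)        # costly true evaluation (red in the demo)
            if value <= best["value"]:
                continue                      # cut: monotonicity rules out this subtree
            if len(reduced) == d:
                best["value"], best["subset"] = value, frozenset(reduced)
            else:
                search(reduced, candidates[i + 1:])

    search(set(range(n_features)), list(range(n_features)))
    return best["subset"], best["value"]
```

For the sample problem above this would be called as `branch_and_bound(6, 2, criterion)`; the Fast variant would replace many of the costly `criterion` calls inside the tree by cheap predicted values (yellow in the demo), computing true values mainly near the bound.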
Classical (non-predictive) and Fast (predictive) Branch & Bound comparison
This presentation demonstrates the differences between the classical and the Fast Branch & Bound (FBB) subset search algorithms by showing them run in parallel. A sample problem of selecting 2 out of 8 features also illustrates the two possible types of errors of the FBB prediction mechanism. Red depicts true criterion evaluation, yellow depicts fast prediction. For more details see [1]. Start the demonstration by clicking the green right-arrow:
Floating vs. Oscillating Search |
This animated picture illustrates the course of the Floating Search [7] and Oscillating Search [4] algorithms. The SFFS starts with an empty set and successively increases the number of selected features. Each time it adds a feature, it tries to backtrack so as to find a better, previously unknown solution. The Oscillating Search algorithm successively modifies the current subset of d features as long as the subset keeps improving. For details see [7,4].
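The sketch below illustrates one possible reading of the oscillating idea: down-swings (remove then re-add features) and up-swings (add then remove features) around a current subset of d features, with the oscillation depth reset after each improvement and increased otherwise. All names (`criterion`, `n_features`, `d`, `max_depth`) and the greedy initialization are illustrative assumptions, not the exact procedure of [4].

```python
# A minimal sketch of Oscillating Search, assuming a criterion J(subset) to be
# maximized, and assuming max_depth <= d and d + max_depth <= n_features so
# that every swing is feasible.

def oscillating_search(n_features, d, criterion, max_depth=2):
    all_features = set(range(n_features))

    def add_best(subset):
        f = max(all_features - subset, key=lambda f: criterion(subset | {f}))
        return subset | {f}

    def remove_worst(subset):
        f = max(subset, key=lambda f: criterion(subset - {f}))
        return subset - {f}

    # Illustrative initialization: greedily pick d features one by one.
    current = set()
    while len(current) < d:
        current = add_best(current)
    best_value = criterion(current)

    depth = 1
    while depth <= max_depth:
        improved = False
        # Down-swing: remove `depth` features, then add `depth` features back.
        trial = current
        for _ in range(depth):
            trial = remove_worst(trial)
        for _ in range(depth):
            trial = add_best(trial)
        trial_value = criterion(trial)
        if trial_value > best_value:
            current, best_value, improved = trial, trial_value, True
        # Up-swing: add `depth` features, then remove `depth` features.
        trial = current
        for _ in range(depth):
            trial = add_best(trial)
        for _ in range(depth):
            trial = remove_worst(trial)
        trial_value = criterion(trial)
        if trial_value > best_value:
            current, best_value, improved = trial, trial_value, True
        # Reset the oscillation depth after an improvement, otherwise deepen it.
        depth = 1 if improved else depth + 1

    return frozenset(current), best_value
```

Unlike the SFFS, which sweeps through all subset sizes, this sketch keeps the subset size fixed at d and stops once swings up to the maximum depth no longer improve the criterion, reflecting the contrast the animation is meant to show.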