Home > @AdaBooster > AdaBooster.m

AdaBooster

PURPOSE ^

function [ab] = AdaBooster(cl)

SYNOPSIS ^

function [ab] = AdaBooster(cl)

DESCRIPTION ^

 function [ab] = AdaBooster(cl)
    Constructor of the AdaBooster class, which inherits from the Classifier
    class. AdaBooster implements the AdaBoost boosting algorithm to build a
    strong (boosted) classifier from several weak classifiers.

    Inputs:
       cl: the weak classifier to be boosted, or another AdaBooster that
           was trained with a specific number of stages and whose stage
           count we want to increase (default [])
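
A minimal usage sketch. `WeakStump` is a hypothetical weak classifier implementing the Classifier interface, and the `learn` call is assumed from the description of `nStages` above; neither appears on this page:

```matlab
% wrap a hypothetical weak classifier for boosting
wc = WeakStump();        % any object implementing the Classifier interface
ab = AdaBooster(wc);

% train for a fixed number of stages (nStages = Inf would instead
% train until the errBound field of the AdaBooster is reached)
ab = learn(ab, trainData, trainLabels, 10);

% resume training with more stages by passing the trained
% AdaBooster back into the constructor
ab2 = AdaBooster(ab);
```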

CROSS-REFERENCE INFORMATION ^

This function calls:
This function is called by:

SOURCE CODE ^

function [ab] = AdaBooster(cl)
% function [ab] = AdaBooster(cl)
%    Constructor of the AdaBooster class, which inherits from the
%    Classifier class. AdaBooster implements the AdaBoost boosting
%    algorithm to build a strong (boosted) classifier from several weak
%    classifiers.
%
%    Inputs:
%       cl: the weak classifier to be boosted, or another AdaBooster that
%           was trained with a specific number of stages and whose stage
%           count we want to increase (default [])

% parameters needed for training
% the error bound at which the classifier stops learning; used only
% when the nStages argument to learn is Inf
ab.errBound = 0.001;

% parameters needed to define the classifier
if nargin == 0
    ab.weakCl = [];
else
    ab.weakCl = cl;
end

% parameters to be learned by the classifier
% 1. a cell array of trained instances of the weak classifier
ab.trndCls = {};

% 2. the weights that combine the outputs of the trained weak
% classifiers into the output of the AdaBooster
ab.clsWeights = [];

% 3. example weights from the last iteration of the AdaBoost learning
% algorithm; saved after training so they can be reused if the number
% of stages is increased later on
ab.lastExWeights = [];

% 4. the threshold by which the classifier distinguishes positive from
% negative examples
ab.thresh = NaN;

% 5. detection rate after training
ab.detectionRate = NaN;

% guard the no-argument case, in which cl is undefined
if nargin == 0
    nClasses = 2;   % assumed default: binary classification
else
    nClasses = getNumClasses(cl);
end
ab = class(ab, 'AdaBooster', Classifier(nClasses));
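
The learned fields above combine in the standard AdaBoost fashion: each trained weak classifier votes, the votes are weighted by `clsWeights`, and the sum is compared against `thresh`. A hedged sketch of that evaluation, assuming each trained weak classifier supports a `classify(cl, x)` call (the class's actual classification method is not shown on this page):

```matlab
function y = classifyExample(ab, x)
% sketch: evaluate a trained AdaBooster on a single example x
score = 0;
for t = 1:numel(ab.trndCls)
    % weighted vote of the t-th trained weak classifier
    score = score + ab.clsWeights(t) * classify(ab.trndCls{t}, x);
end
% positive if the weighted vote clears the learned threshold
y = score >= ab.thresh;
end
```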

Generated on Sun 29-Sep-2013 01:25:24 by m2html © 2005