-INTRODUCTION
+######################################################################
+## INTRODUCTION
This is the C++ implementation of the folded hierarchy of
classifiers for cat detection described in

  F. Fleuret and D. Geman, "Stationary Features and Cat Detection",
  Journal of Machine Learning Research (JMLR), 2008.

Please cite this paper when referring to this software.
-INSTALLATION
+######################################################################
+## INSTALLATION
This program was developed on Debian GNU/Linux computers with the
following main tool versions
You can also run the full thing with the following commands if you
have wget installed
- wget http://www.idiap.ch/folded-ctf/not-public-yet/data/folding-gpl.tgz
- tar zxvf folding-gpl.tgz
- cd folding
- wget http://www.idiap.ch/folded-ctf/not-public-yet/data/rmk.tgz
- tar zxvf rmk.tgz
- ./run.sh
+ > wget http://www.idiap.ch/folded-ctf/not-public-yet/data/folding-gpl.tgz
+ > tar zxvf folding-gpl.tgz
+ > cd folding
+ > wget http://www.idiap.ch/folded-ctf/not-public-yet/data/rmk.tgz
+ > tar zxvf rmk.tgz
+ > ./run.sh
Note that every one of the twenty rounds of training/testing takes
more than three days on a powerful PC. However, the script detects
results which have already been computed and does not run the
corresponding computations again.

You are welcome to send bug reports and comments to fleuret@idiap.ch
-PARAMETERS
+######################################################################
+## PARAMETERS
To set the value of a parameter during an experiment, just add an
argument of the form --parameter-name=value before the commands that
should take it into account.
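
For instance, assuming the executable built by run.sh is called
./folding, a hypothetical invocation could look like

  > ./folding --result-path=/tmp/my-experiment/ --nb-levels=2 <command>

where <command> stands for one of the commands listed in the COMMANDS
section below, and the parameters are taken from the list that
follows.
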
* pictures-for-article ("no")
- Should the pictures be generated to be clear in b&w
+ Should the pictures be generated for printing in black and white.
* pool-name (no default)
* result-path ("/tmp/")
- In what directory should we save all the produced file during the
+ In what directory should we save all the produced files during the
computation.
* loss-type ("exponential")
- What kind of loss to use for the boosting. While different loss are
- implementer in the code, only the exponential has been thoroughly
- tested.
+ What kind of loss to use for the boosting. While different losses
+ are implemented in the code, only the exponential has been
+ thoroughly tested.
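(An illustrative sketch of this loss is given right after the
parameter list.)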
* nb-images (-1)
* tree-depth-max (1)
Maximum depth of the decision trees used as weak learners in the
- classifier.
+ classifier. The default value corresponds to stumps.
* proportion-negative-cells-for-training (0.025)
* nb-negative-samples-per-positive (10)
- How many negative cell to sample for every positive cell during
+ How many negative cells to sample for every positive cell during
training.
* nb-features-for-boosting-optimization (10000)
How many pose-indexed features to use at every step of boosting.
- * force-head-belly-independence (no)
+ * force-head-belly-independence ("no")
Should we force the independence between the two levels of the
detector (i.e. make an H+B detector).
* nb-levels (1)
- How many levels in the hierarchy, this is 2 for the JMLR paper
- experiments.
+ How many levels in the hierarchy. This should be 2 for the JMLR
+ paper experiments.
* proportion-for-train (0.5)
Should we display a progress bar.
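
For reference, the exponential loss mentioned for loss-type above
corresponds to the usual AdaBoost-style reweighting of the training
samples. The fragment below is only an illustrative sketch, not code
taken from this package; the function name reweight and the variables
w, y, h and alpha are hypothetical.

  #include <cmath>
  #include <cstddef>
  #include <vector>

  // After adding a weak learner with responses h[i] on the training
  // samples, coefficient alpha, and labels y[i] in {-1, +1}, the
  // exponential loss leads to the update
  //   w[i] <- w[i] * exp(-alpha * y[i] * h[i])
  void reweight(std::vector<double> &w,
                const std::vector<int> &y,
                const std::vector<double> &h,
                double alpha) {
    double sum = 0.0;
    for(size_t i = 0; i < w.size(); i++) {
      w[i] *= std::exp(- alpha * y[i] * h[i]);
      sum += w[i];
    }
    // Renormalize the weights so that they sum to one
    for(size_t i = 0; i < w.size(); i++) w[i] /= sum;
  }
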
-COMMANDS
+######################################################################
+## COMMANDS
* open-pool
Write PNG images of the scenes in the pool.
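
For instance, assuming the same hypothetical ./folding executable and
a scene pool designated with --pool-name, the pool images could be
written out with something like

  > ./folding --pool-name=<your pool file> --result-path=/tmp/ open-pool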
- --
- Francois Fleuret
- October 2008
+--
+Francois Fleuret
+October 2008
if(detector) {
int nb_features = 100;
for(int f = 0; f < nb_features; f++)
- if(f == 0 || f ==50 || f == 53) {
+ if(f == 0 || f == 50 || f == 53) {
int n_family, n_feature;
if(f < nb_features/2) {
n_family = 0;
u++;
}
- // sprintf(buffer, "/tmp/image-%05d.png", i);
- // cout << "Writing " << buffer << endl;
- // result_sp.write_png(buffer);
-
- // if(global.write_tag_images) {
- // sprintf(buffer, "/tmp/image-%05d_tags.png", i);
- // cout << "Writing " << buffer << endl;
- // image->compute_rich_structure();
- // image->write_tag_png(buffer);
- // }
-
pool->release_image(i);
}