Bagging of deep learners

As JBM recently suggested, several classifiers could be trained on largely overlapping subsets of the data, and all of them used at prediction time: averaging their predictions should yield more stable results than relying on any single classifier. A minimal sketch of this is given below.
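A hedged sketch of the bagging idea, assuming PyTorch classifiers; the names `make_bags` and `bagged_predict` are illustrative placeholders, not existing maggotuba functions:

```python
import numpy as np
import torch

def make_bags(n_samples, n_bags, bag_fraction=0.8, seed=0):
    """Draw largely overlapping subsets (random subsamples without
    replacement) of the training data; each bag trains one classifier."""
    rng = np.random.default_rng(seed)
    size = int(bag_fraction * n_samples)
    return [rng.choice(n_samples, size=size, replace=False)
            for _ in range(n_bags)]

def bagged_predict(classifiers, x):
    """Average the class probabilities over all trained classifiers."""
    with torch.no_grad():
        probs = [torch.softmax(clf(x), dim=-1) for clf in classifiers]
    return torch.stack(probs).mean(dim=0)
```

With an 80% bag fraction, any two bags share roughly 64% of the samples on average, which matches the "largely overlapping" intent while still decorrelating the classifiers somewhat.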

Multiple encoders would also be fine-tuned, each starting from a common pretrained encoder (see the sketch below).
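A possible way to spawn the per-bag encoders, again assuming PyTorch modules; `spawn_encoders` is a hypothetical helper, not part of the current code base:

```python
import copy
import torch.nn as nn

def spawn_encoders(pretrained_encoder: nn.Module, n_copies: int) -> list[nn.Module]:
    """Deep-copy the common pretrained encoder so that each copy can be
    fine-tuned independently on its own data subset."""
    return [copy.deepcopy(pretrained_encoder) for _ in range(n_copies)]
```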

The current code base, in the maggotuba.models.modules and maggotuba.models.trainers modules, has been ready for such an extension since the implementation of MultiscaleMaggotTrainer, a multi-encoder single-classifier neural network. Incidentally, this latter feature has seen little use lately and would need to be tested again and refreshed.
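For reference, the multi-encoder single-classifier layout could look like the following; this is a speculative sketch of the general pattern, not the actual MultiscaleMaggotTrainer implementation, and all class names are assumptions:

```python
import torch
import torch.nn as nn

class MultiEncoderClassifier(nn.Module):
    """Several encoders feeding a single classification head."""

    def __init__(self, encoders, latent_dim, n_classes):
        super().__init__()
        self.encoders = nn.ModuleList(encoders)
        self.head = nn.Linear(latent_dim * len(encoders), n_classes)

    def forward(self, x):
        # Concatenate the latent representations from all encoders
        # before classifying.
        z = torch.cat([enc(x) for enc in self.encoders], dim=-1)
        return self.head(z)
```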