# Semi-supervised MarginBoost

@inproceedings{dAlchBuc2001SemisupervisedM,
  title     = {Semi-supervised MarginBoost},
  author    = {Florence d'Alch{\'e}-Buc and Yves Grandvalet and Christophe Ambroise},
  booktitle = {NIPS},
  year      = {2001}
}

In many discrimination problems a large amount of data is available but only a few examples are labeled. This provides a strong motivation to improve or develop methods for semi-supervised learning. In this paper, boosting is generalized to this task within the optimization framework of MarginBoost. We extend the margin definition to unlabeled data and develop the gradient descent algorithm that corresponds to the resulting margin cost function. This meta-learning scheme can be applied to any…
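The idea sketched in the abstract, extending the margin to unlabeled points and running functional gradient descent on the resulting cost, can be illustrated in a few lines. This is a hedged reconstruction, not the paper's exact algorithm: it assumes an exponential cost C(m) = exp(-m), uses y·F(x) as the margin on labeled points and |F(x)| on unlabeled points (the margin under the current predicted label), and fits decision stumps on the first feature as weak learners. The names `ssmb_sketch`, `fit_stump`, and `stump_predict` are hypothetical.

```python
import numpy as np

def stump_predict(X, thresh, sign):
    # Decision stump on the first feature: sign * (+1 if x > thresh else -1).
    return sign * np.where(X[:, 0] > thresh, 1.0, -1.0)

def fit_stump(X, targets, weights):
    # Exhaustive search for the stump minimizing weighted misclassification
    # against the +/-1 targets.
    best = (np.inf, 0.0, 1.0)
    for thresh in np.unique(X[:, 0]):
        for sign in (1.0, -1.0):
            pred = stump_predict(X, thresh, sign)
            err = weights[pred != targets].sum()
            if err < best[0]:
                best = (err, thresh, sign)
    return best[1], best[2]

def ssmb_sketch(X_l, y_l, X_u, n_rounds=10, step=0.5):
    X = np.vstack([X_l, X_u])
    n_l = len(X_l)
    F = np.zeros(len(X))  # current ensemble output on all points
    for _ in range(n_rounds):
        # Margin: y*F(x) on labeled points, |F(x)| on unlabeled points.
        margins = np.concatenate([y_l * F[:n_l], np.abs(F[n_l:])])
        # Functional-gradient weights w_i = -C'(m_i) for C(m) = exp(-m).
        w = np.exp(-margins)
        w /= w.sum()
        # Gradient direction: true labels where known, current predicted
        # labels (sign of F) on unlabeled points.
        targets = np.concatenate([y_l, np.where(F[n_l:] >= 0, 1.0, -1.0)])
        thresh, sign = fit_stump(X, targets, w)
        F += step * stump_predict(X, thresh, sign)
    return F
```

With the exponential cost this reduces to an AdaBoost-like reweighting on the labeled points, while each unlabeled point receives its current predicted label as a target, which is the self-training flavor of the margin extension.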

#### 86 Citations

A Direct Boosting Approach for Semi-supervised Classification

- Computer Science
- IJCAI
- 2015

We introduce a semi-supervised boosting approach (SSDBoost), which directly minimizes the classification errors and maximizes the margins on both labeled and unlabeled samples, without resorting to…

On Improving Semi-supervised Marginboost Incrementally using Strong Unlabeled Data

- Computer Science
- ICPRAM
- 2012

An incremental learning strategy is proposed by which the classification accuracy of the semi-supervised MarginBoost (SSMB) algorithm can be improved by employing a small number of "strong" samples selected from the unlabeled data per iteration.

Regularized Boost for Semi-Supervised Learning

- Computer Science
- NIPS
- 2007

A local smoothness regularizer is introduced to semi-supervised boosting algorithms based on the universal optimization framework of margin cost functionals to improve their generalization and speed up their training.

Semi-supervised Robust Alternating AdaBoost

- Computer Science
- CIARP
- 2009

This paper uses semi-supervised learning techniques to boost the performance of the Robust Alternating AdaBoost algorithm, introducing the RADA+ algorithm, comparing it with RADA, and reporting performance results on synthetic and real data sets.

Boosting for multiclass semi-supervised learning

- Computer Science
- Pattern Recognit. Lett.
- 2014

This work proposes a multiclass semi-supervised boosting algorithm that solves multiclass classification problems directly using a novel multiclass loss function consisting of the margin cost on labeled data and two regularization terms on labeled and unlabeled data.

SERBoost: Semi-supervised Boosting with Expectation Regularization

- Mathematics, Computer Science
- ECCV
- 2008

A novel semi-supervised boosting method, called SERBoost, that can be applied to large-scale vision problems, provides a margin regularizer for the boosting cost function, and shows a principled way of utilizing prior knowledge.

SemiBoost: Boosting for Semi-Supervised Learning

- Computer Science, Medicine
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2009

A boosting framework for semi-supervised learning, termed SemiBoost, that improves the performance of several commonly used supervised learning algorithms given a large number of unlabeled examples, and is comparable to state-of-the-art semi-supervised learning algorithms.

Semi-Supervised Learning via Regularized Boosting Working on Multiple Semi-Supervised Assumptions

- Mathematics, Computer Science
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2011

This paper proposes a novel cost functional consisting of the margin cost on labeled data and a regularization penalty on unlabeled data based on three fundamental semi-supervised assumptions, and demonstrates that the algorithm yields favorable results on benchmark and real-world classification tasks in comparison to state-of-the-art semi-supervised learning algorithms, including newly developed boosting algorithms.

Boosting with pairwise constraints

- Mathematics, Computer Science
- Neurocomputing
- 2010

A novel framework to solve the problem of extending AdaBoost to semi-supervised scenarios based on the gradient descent view of boosting is proposed, which is almost as simple and flexible as AdaBoost, and can be readily applied in the presence of pairwise constraints.

Laplacian Margin Distribution Boosting for Learning from Sparsely Labeled Data

- Computer Science, Mathematics
- 2011 International Conference on Digital Image Computing: Techniques and Applications
- 2011

This paper proposes a novel data-dependent margin distribution learning criterion for boosting, termed Laplacian MDBoost, which utilizes the intrinsic geometric structure of the dataset and derives a dual formulation of the learning problem that can be efficiently solved by column generation.

#### References

Showing 1–10 of 20 references.

Boosting Mixture Models for Semi-supervised Learning

- Computer Science
- ICANN
- 2001

MixtBoost is introduced, a variant of AdaBoost dedicated to solving problems in which both labeled and unlabeled data are available; it improves on both mixture models and AdaBoost provided the classes are structured, and is otherwise similar to AdaBoost.

Boosting the margin: A new explanation for the effectiveness of voting methods

- Mathematics, Computer Science
- ICML
- 1997

It is shown that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error.

Combining labeled and unlabeled data with co-training

- Computer Science
- COLT' 98
- 1998

A PAC-style analysis is provided for a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views, allowing inexpensive unlabeled data to augment a much smaller set of labeled examples.

Semi-Supervised Support Vector Machines

- Mathematics, Computer Science
- NIPS
- 1998

A general S3VM model is proposed that minimizes both the misclassification error and the function capacity based on all the available data, and that can be converted to a mixed-integer program and then solved exactly using integer programming.

Experiments with a New Boosting Algorithm

- Computer Science
- ICML
- 1996

This paper describes experiments carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems, and compares boosting to Breiman's "bagging" method when used to aggregate various classifiers.

EM Algorithm for Partially Known Labels

- Computer Science
- 2000

A generalization of the Expectation-Maximization algorithm which can take into account partial information about the observation labels for mixture density estimation is proposed.

Discussion of the Paper "Additive Logistic Regression: A Statistical View of Boosting"

- 2000

The main and important contribution of this paper is in establishing a connection between boosting, a newcomer to the statistics scene, and additive models. One of the main properties of boosting…

Text Classification from Labeled and Unlabeled Documents using EM

- Mathematics
- 2000

This paper shows that the accuracy of learned text classifiers can be improved by augmenting a small number of labeled training documents with a large pool of unlabeled documents. This is important...

Functional Gradient Techniques for Combining Hypotheses

- Computer Science
- 2000

This chapter contains sections titled: Introduction, Optimizing Cost Functions of the Margin, A Gradient Descent View of Voting Methods, Theoretically Motivated Cost Functions, Convergence Results,…

Prediction Games and Arcing Algorithms

- Mathematics, Computer Science
- Neural Computation
- 1999

The theory behind the success of adaptive reweighting and combining (arcing) algorithms such as AdaBoost in reducing generalization error has not been well understood; an explanation of why AdaBoost works in terms of its ability to produce generally high margins is offered.