### Abstract

We describe an iterative algorithm for building vector machines used in classification tasks. The algorithm builds on ideas from support vector machines, boosting, and generalized additive models. The algorithm can be used with various continuously differentiable functions that bound the discrete (0-1) classification loss and is very simple to implement. We test the proposed algorithm with two different loss functions on synthetic and natural data. We also describe a norm-penalized version of the algorithm for the exponential loss function used in AdaBoost. The performance of the algorithm on natural data is comparable to support vector machines, while its running time is typically shorter than that of an SVM.
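The abstract describes an iterative, boosting-style procedure that greedily builds an additive classifier by minimizing a differentiable upper bound on the 0-1 loss, such as the exponential loss used in AdaBoost. The sketch below illustrates that general idea only; it is not the paper's algorithm. The decision-stump base hypotheses, the stopping rule, and the toy data are all illustrative assumptions.

```python
# Hedged sketch: an iterative additive classifier driven by the exponential
# loss exp(-y * F(x)), in the boosting spirit the abstract describes.
# NOT the paper's exact algorithm; stumps, rounds, and data are assumptions.
import math


def stump_predict(x, feature, threshold, sign):
    """A decision stump: sign * (+1 if x[feature] > threshold else -1)."""
    return sign * (1.0 if x[feature] > threshold else -1.0)


def fit_exponential_loss(X, y, n_rounds=5):
    """Greedily add weighted stumps to reduce sum_i exp(-y_i * F(x_i))."""
    ensemble = []          # list of (alpha, feature, threshold, sign)
    F = [0.0] * len(X)     # current additive-model values on the sample
    for _ in range(n_rounds):
        # Example weights are proportional to the exponential loss terms.
        w = [math.exp(-yi * fi) for yi, fi in zip(y, F)]
        total = sum(w)
        w = [wi / total for wi in w]
        # Exhaustively pick the stump with the lowest weighted error.
        best = None
        for feature in range(len(X[0])):
            for threshold in sorted({xi[feature] for xi in X}):
                for sign in (+1.0, -1.0):
                    err = sum(wi for wi, xi, yi in zip(w, X, y)
                              if stump_predict(xi, feature, threshold, sign) != yi)
                    if best is None or err < best[0]:
                        best = (err, feature, threshold, sign)
        err, feature, threshold, sign = best
        err = min(max(err, 1e-10), 1.0 - 1e-10)     # guard log/0 division
        alpha = 0.5 * math.log((1.0 - err) / err)   # closed form for exp loss
        ensemble.append((alpha, feature, threshold, sign))
        F = [fi + alpha * stump_predict(xi, feature, threshold, sign)
             for fi, xi in zip(F, X)]
    return ensemble


def predict(ensemble, x):
    score = sum(a * stump_predict(x, f, t, s) for a, f, t, s in ensemble)
    return 1.0 if score >= 0 else -1.0


# Toy separable data (illustrative only).
X = [(0.0, 0.0), (1.0, 0.2), (0.2, 1.0), (3.0, 3.0), (2.5, 3.5), (3.5, 2.5)]
y = [-1.0, -1.0, -1.0, 1.0, 1.0, 1.0]
model = fit_exponential_loss(X, y)
preds = [predict(model, xi) for xi in X]
```

On this separable toy set the greedy procedure drives the training error to zero within a few rounds; the paper's contribution is, in part, that the same iterative scheme works for other differentiable bounds on the 0-1 loss, with a norm-penalized variant for the exponential loss.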

| Original language | English (US) |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems 12 - Proceedings of the 1999 Conference, NIPS 1999 |
| Publisher | Neural information processing systems foundation |
| Pages | 610-616 |
| Number of pages | 7 |
| ISBN (Print) | 0262194503, 9780262194501 |
| State | Published - Jan 1 2000 |
| Externally published | Yes |
| Event | 13th Annual Neural Information Processing Systems Conference, NIPS 1999 - Denver, CO, United States. Duration: Nov 29 1999 → Dec 4 1999 |

### Publication series

| Name | Advances in Neural Information Processing Systems |
| --- | --- |
| ISSN (Print) | 1049-5258 |

### Other

| Other | 13th Annual Neural Information Processing Systems Conference, NIPS 1999 |
| --- | --- |
| Country | United States |
| City | Denver, CO |
| Period | 11/29/99 → 12/4/99 |

### All Science Journal Classification (ASJC) codes

- Computer Networks and Communications
- Information Systems
- Signal Processing

## Fingerprint

Research topics of 'Leveraged vector machines'.

## Cite this

*Advances in Neural Information Processing Systems 12 - Proceedings of the 1999 Conference, NIPS 1999* (pp. 610-616). (Advances in Neural Information Processing Systems). Neural information processing systems foundation.