Support Vector Representation of Multi-categorical Data

  • Conference paper
Artificial Neural Networks — ICANN 2002 (ICANN 2002)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 2415))

Abstract

We propose a new algorithm for the categorisation of data into multiple classes. It minimises a homogeneous quadratic program and can be viewed as a generalisation of the well-known support vector machine to multiple classes. For a single class it reduces to a quadratic problem whose solution can be seen as an estimate of the support of a distribution. Given a set of labelled data, the algorithm estimates a representative vector in a feature space for each class. Each of these vectors is expressible as a linear combination of the training data in its class, mapped into feature space; the algorithm therefore needs fewer parameters than other multi-class support vector approaches.
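The abstract's central idea — representing each class by a single vector in feature space that is a linear combination of that class's training points, then classifying by proximity to these representatives — can be sketched with a toy kernel classifier. This is a hypothetical illustration, not the authors' algorithm: here the combination weights are simply uniform (the kernel class mean), whereas the paper learns them by solving a quadratic program. The class `KernelClassMeans` and the RBF kernel choice are our own assumptions for the sketch.

```python
# Sketch (hypothetical): each class c is represented by one vector m_c in
# feature space, a linear combination of class-c training points. We use
# uniform weights; the paper instead learns the weights via a quadratic program.
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelClassMeans:
    def fit(self, X, y, gamma=1.0):
        self.gamma = gamma
        self.classes_ = np.unique(y)
        self.Xc = [X[y == c] for c in self.classes_]
        # ||m_c||^2 = average of k(x_i, x_j) over pairs of class-c points
        self.norm2 = [rbf(Xc, Xc, gamma).mean() for Xc in self.Xc]
        return self

    def predict(self, X):
        # Squared feature-space distance to each representative:
        # ||phi(x) - m_c||^2 = k(x,x) - 2*mean_i k(x, x_i) + ||m_c||^2;
        # k(x,x) is constant across classes, so we omit it.
        d = np.stack([-2.0 * rbf(X, Xc, self.gamma).mean(1) + n2
                      for Xc, n2 in zip(self.Xc, self.norm2)], axis=1)
        return self.classes_[d.argmin(1)]

# Toy usage: two well-separated clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
clf = KernelClassMeans().fit(X, y)
print(clf.predict(np.array([[0.1, 0.0], [2.9, 3.1]])))  # -> [0 1]
```

Note that each representative never needs the full training set: only the points of its own class enter its expansion, which is the source of the parameter savings the abstract mentions relative to multi-class SVM formulations that couple all classes.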




Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Borer, S., Gerstner, W. (2002). Support Vector Representation of Multi-categorical Data. In: Dorronsoro, J.R. (eds) Artificial Neural Networks — ICANN 2002. ICANN 2002. Lecture Notes in Computer Science, vol 2415. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46084-5_119

  • DOI: https://doi.org/10.1007/3-540-46084-5_119

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44074-1

  • Online ISBN: 978-3-540-46084-8

  • eBook Packages: Springer Book Archive
