
Extremely Accurate Symbolic Regression for Large Feature Problems

Genetic Programming Theory and Practice XII

Part of the book series: Genetic and Evolutionary Computation (GEVO)

Abstract

As symbolic regression (SR) has advanced into the early stages of commercial exploitation, the poor accuracy of SR, still plaguing even the most advanced commercial packages, has become an issue for early adopters. Users expect to have the correct formula returned, especially in cases with zero noise and only one basis function with minimally complex grammar depth.

At a minimum, users expect the response surface of the SR tool to be easily understood, so that the user can know a priori on what classes of problems to expect excellent, average, or poor accuracy. Poor or unknown accuracy is a hindrance to greater academic and industrial acceptance of SR tools.

In a previous paper, we published a complex algorithm for modern symbolic regression which is extremely accurate for a large class of symbolic regression problems, and we described that class of problems in detail. The algorithm was extremely accurate, on a single processor, for up to 25 features (columns), and a cloud configuration extended the extreme accuracy to as many as 100 features.

While the previous algorithm’s extreme accuracy for deep problems with a small number of features (25–100) was an impressive advance, there are many very important academic and industrial SR problems requiring from 100 to 1000 features.

In this chapter we extend the previous algorithm so that high accuracy is achieved on a wide range of problems, from 25 to 3000 features, using only a single processor. The class of problems on which the enhanced algorithm is highly accurate is described in detail. A definition of extreme accuracy is provided, and an informal argument for the enhanced algorithm's high accuracy is outlined.

The new enhanced algorithm is tested on a set of representative problems. The enhanced algorithm is shown to be robust, performing well even in the face of testing data containing up to 3000 features.


Notes

  1. Testing a single regression champion is not cheap. At a minimum, testing a single regression champion requires as many evaluations as there are training examples, plus a simple regression. At a maximum, it may require performing a much more expensive multiple regression.

  2. As a reminder, testing a single regression champion is not cheap. At a minimum, it requires as many evaluations as there are training examples, plus a simple regression. At a maximum, it may require performing a much more expensive multiple regression.


Author information

Correspondence to Michael F. Korns.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Korns, M. (2015). Extremely Accurate Symbolic Regression for Large Feature Problems. In: Riolo, R., Worzel, W., Kotanchek, M. (eds) Genetic Programming Theory and Practice XII. Genetic and Evolutionary Computation. Springer, Cham. https://doi.org/10.1007/978-3-319-16030-6_7

  • DOI: https://doi.org/10.1007/978-3-319-16030-6_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-16029-0

  • Online ISBN: 978-3-319-16030-6

  • eBook Packages: Computer Science, Computer Science (R0)
