One norm linear programming support vector regression

Mohammad Tanveer*, Mohit Mangal, Izhar Ahmad, Yuan Hai Shao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

21 Scopus citations


In this paper, a new linear programming formulation of 1-norm support vector regression (SVR) is proposed, whose solution is obtained by solving an exterior penalty problem in the dual space as an unconstrained minimization problem using the Newton method. Solving the modified unconstrained minimization problem reduces to solving just a system of linear equations, as opposed to the quadratic programming problem in SVR, which leads to an extremely simple and fast algorithm. The algorithm converges from any starting point and can be easily implemented in MATLAB without any optimization packages. The main advantage of the proposed approach is that it leads to a robust and sparse model representation, meaning that many components of the optimal solution vector become zero; the decision function can therefore be determined using far fewer support vectors than SVR, smooth SVR (SSVR) and weighted SVR (WSVR) require. To demonstrate its effectiveness, experiments were performed on well-known synthetic and real-world benchmark datasets. Similar or better generalization performance of the proposed method, obtained in less training time than SVR, SSVR and WSVR, clearly exhibits its suitability and applicability.
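The 1-norm SVR problem described above can be sketched as a plain linear program. The snippet below is an illustrative reconstruction, not the authors' Newton-based exterior-penalty solver: it writes the L1-regularized epsilon-insensitive regression directly as an LP (splitting w and b into nonnegative parts) and hands it to a generic solver. The function name, toy data, and parameter values are assumptions for demonstration only.

```python
import numpy as np
from scipy.optimize import linprog

def l1_svr_lp(X, y, C=10.0, eps=0.1):
    """Fit 1-norm epsilon-SVR by solving its linear program directly.

    minimize  ||w||_1 + C * sum(xi + xi*)
    s.t.      |y_i - (x_i.w + b)| <= eps + slack
    """
    n, d = X.shape
    # variable layout: [w+ (d), w- (d), b+, b-, xi (n), xi* (n)], all >= 0
    c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(2 * n)])
    I = np.eye(n)
    Z = np.zeros((n, n))
    one = np.ones((n, 1))
    # y - Xw - b <= eps + xi    and    Xw + b - y <= eps + xi*
    A_ub = np.block([
        [-X,  X, -one,  one, -I,  Z],
        [ X, -X,  one, -one,  Z, -I],
    ])
    b_ub = np.concatenate([eps - y, eps + y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    w = z[:d] - z[d:2 * d]          # recover w from its split parts
    b = z[2 * d] - z[2 * d + 1]     # recover the bias
    return w, b

# toy example: y depends only on the first feature; the 1-norm
# penalty should drive the irrelevant second coefficient to zero
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 2))
y = 2.0 * X[:, 0] + 1.0
w, b = l1_svr_lp(X, y, C=10.0, eps=0.01)
```

On this noiseless toy problem the recovered model is close to w ≈ (2, 0), b ≈ 1, with the second coefficient shrunk to (near) zero, illustrating the sparsity the abstract attributes to the 1-norm penalty.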

Original language: English
Pages (from-to): 1508-1518
Number of pages: 11
State: Published - 15 Jan 2016

Bibliographical note

Publisher Copyright:
© 2015 Elsevier B.V.


Keywords

  • 1-Norm support vector machines
  • Linear programming
  • Newton method
  • Unconstrained convex minimization

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

