This is the conference version of my paper on **Coresets for Polytope Distance and Support Vector Machines**, which appeared at SoCG 2009.

*Two implications for machine learning:*

All commonly used Support Vector Machine variants can be trained *in linear time* (in the number of data points), up to any fixed approximation quality, and with arbitrary kernels. (Here, time is measured in the number of kernel evaluations, and quality is the fraction of the optimal margin attained.)

The coreset analysis yields nearly matching upper and lower bounds on the *sparsity* (that is, the number of support vectors) of approximate solutions of any given quality.
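To give a flavor of the underlying technique: the coreset results build on a sparse greedy procedure of the Gilbert/Frank-Wolfe type for the polytope distance problem. The following is a minimal illustrative sketch (not the paper's exact pseudocode): it approximates the shortest vector in the convex hull of the input points, and the indices it picks along the way form the coreset. The function name, the stopping rule, and all parameters here are my own choices for illustration.

```python
import numpy as np

def gilbert_polytope_distance(points, eps=1e-3, max_iter=1000):
    """Greedy sketch (Gilbert/Frank-Wolfe style): approximate the
    minimum-norm point in the convex hull of `points`. The set of
    picked indices serves as a coreset. Illustrative only."""
    # start from the shortest single input point
    norms = np.linalg.norm(points, axis=1)
    i0 = int(np.argmin(norms))
    x = points[i0].copy()
    coreset = {i0}
    for _ in range(max_iter):
        # pick the point most extreme in direction -x
        i = int(np.argmin(points @ x))
        p = points[i]
        # duality-gap style stopping criterion
        gap = x @ x - x @ p
        if gap <= eps * (x @ x):
            break
        coreset.add(i)
        # exact line search: minimize ||(1-g)*x + g*p|| over g in [0, 1]
        d = p - x
        g = float(np.clip(-(x @ d) / (d @ d), 0.0, 1.0))
        x = x + g * d
    return x, sorted(coreset)
```

For example, for the two points (1, 0) and (0, 1), the closest point of their convex hull to the origin is (0.5, 0.5), and both points end up in the coreset. In the SVM setting, each greedy step costs one pass of kernel evaluations, which is where the linear-time training bound comes from.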

Also attached below are the slides of my seminar talk of October 9, 2008, on the same topic: