100 Facts About Support Vector Machines (SVMs)

btd
7 min read · Nov 28, 2023


Here’s a list of 100 technical facts about Support Vector Machines (SVMs):

  1. Support Vector Machines (SVMs) are a class of supervised learning algorithms used for classification and regression tasks.
  2. SVMs are effective in high-dimensional spaces and can handle both linear and non-linear relationships between features and labels.
  3. The objective of SVMs is to find the hyperplane that best separates classes in the input space.
  4. The hyperplane in SVM is the decision boundary that maximizes the margin between classes.
  5. The margin is the distance between the hyperplane and the nearest data points from either class; those nearest points are called support vectors.
  6. SVMs aim to maximize the margin, leading to better generalization to unseen data.
  7. Linear SVMs work well when the data is linearly separable.
  8. Non-linear SVMs use kernel functions to map the input space into a higher-dimensional feature space, making it easier to find a separating hyperplane.
  9. Common kernel functions include linear, polynomial, radial basis function (RBF or Gaussian), and sigmoid.
  10. The choice of the kernel and its parameters significantly affects SVM performance.
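Facts 7-10 can be illustrated with a small sketch using scikit-learn's `SVC` (an assumption of this example; the article doesn't prescribe a library). On a toy dataset where one class sits inside a ring, a linear kernel struggles while an RBF kernel separates the classes cleanly; the fitted model also exposes its support vectors (fact 5):

```python
import numpy as np
from sklearn.svm import SVC

# Toy non-linearly separable data: label 1 inside a circle, 0 outside.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(int)

# Same data, two kernels (fact 10: kernel choice drives performance).
linear_svm = SVC(kernel="linear", C=1.0).fit(X, y)
rbf_svm = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X, y)

print("linear kernel accuracy:", linear_svm.score(X, y))
print("RBF kernel accuracy:", rbf_svm.score(X, y))

# The support vectors are the training points nearest the decision
# boundary (fact 5); SVC exposes them after fitting.
print("number of support vectors:", rbf_svm.support_vectors_.shape[0])
```

The RBF kernel implicitly maps the inputs into a higher-dimensional space (fact 8) where a separating hyperplane exists, which is why it handles the circular boundary that defeats the linear kernel.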
