
50+ Facts About Dimensionality Reduction

btd
3 min read · Nov 28, 2023


Here’s a list of 50+ facts about Dimensionality Reduction:

Introduction to Dimensionality Reduction:

  1. Dimensionality reduction involves reducing the number of features in a dataset (a minimal sketch follows this list).
  2. It is commonly used in machine learning to address the curse of dimensionality.
  3. The curse of dimensionality refers to the way data becomes sparse and computation becomes more expensive as the number of features grows, making models harder to train and more prone to overfitting.
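A minimal sketch of what "reducing the number of features" looks like in practice, assuming scikit-learn and NumPy are available; the 64-feature random dataset is purely hypothetical and stands in for real data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))       # 500 samples, 64 features (high-dimensional)

pca = PCA(n_components=10)           # keep only 10 dimensions
X_reduced = pca.fit_transform(X)

print(X.shape)          # (500, 64)
print(X_reduced.shape)  # (500, 10)
```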

Linear Dimensionality Reduction Techniques:

  4. Principal Component Analysis (PCA) is a widely used linear dimensionality reduction technique.
  5. PCA finds principal components: orthogonal directions that capture the maximum variance in the data.
  6. Singular Value Decomposition (SVD) is commonly used to implement PCA (see the sketch after this list).
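A minimal sketch of PCA implemented via SVD, assuming only NumPy; the small random matrix is a placeholder for real data. The data is mean-centered, decomposed with SVD, and projected onto the top components:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

X_centered = X - X.mean(axis=0)           # PCA requires mean-centered data
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

k = 2
components = Vt[:k]                       # top-k principal components (orthogonal directions)
X_projected = X_centered @ components.T   # data expressed in the top-k components

explained_variance = (S ** 2) / (len(X) - 1)   # variance captured by each component
print(explained_variance[:k])
```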

Considerations for PCA:

  7. PCA is sensitive to feature scale, so standardizing or normalizing the data first is important.
  8. The Elbow Method is used to choose how many dimensions to retain in PCA.
  9. The eigenvalues of the covariance matrix in PCA represent the variance along each principal component (the sketch after this list touches on all three points).
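A minimal sketch of these considerations, assuming scikit-learn and Matplotlib are installed; the Iris dataset is used only as a convenient example. Features are standardized first, the eigenvalues (explained variances) are printed, and the cumulative explained-variance curve is plotted so an "elbow" can be spotted:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X = load_iris().data
X_scaled = StandardScaler().fit_transform(X)   # PCA is scale-sensitive, so standardize first

pca = PCA().fit(X_scaled)

# Eigenvalues of the covariance matrix == variance along each principal component
print(pca.explained_variance_)

# Elbow plot: cumulative explained variance vs. number of components retained
plt.plot(np.arange(1, len(pca.explained_variance_ratio_) + 1),
         np.cumsum(pca.explained_variance_ratio_), marker="o")
plt.xlabel("Number of components")
plt.ylabel("Cumulative explained variance")
plt.show()
```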

Non-linear Dimensionality Reduction Techniques:
