{"id":7765,"date":"2025-11-26T18:36:57","date_gmt":"2025-11-26T18:36:57","guid":{"rendered":"https:\/\/uplatz.com\/blog\/?p=7765"},"modified":"2025-11-26T18:36:57","modified_gmt":"2025-11-26T18:36:57","slug":"pca-dimensionality-reduction-explained","status":"publish","type":"post","link":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/","title":{"rendered":"PCA (Dimensionality Reduction) Explained"},"content":{"rendered":"<h1 data-start=\"685\" data-end=\"749\"><strong data-start=\"687\" data-end=\"749\">PCA (Dimensionality Reduction): A Complete Practical Guide<\/strong><\/h1>\n<p data-start=\"751\" data-end=\"1103\">Modern machine learning works with large datasets that may contain hundreds or even thousands of features. While more data can improve predictions, too many features often reduce performance. This is where <strong data-start=\"957\" data-end=\"995\">PCA (Principal Component Analysis)<\/strong> becomes essential. PCA helps reduce the number of features while preserving the most important information.<\/p>\n<p data-start=\"1105\" data-end=\"1319\">PCA improves <strong data-start=\"1118\" data-end=\"1133\">model speed<\/strong>, <strong data-start=\"1135\" data-end=\"1147\">accuracy<\/strong>, <strong data-start=\"1149\" data-end=\"1162\">stability<\/strong>, and <strong data-start=\"1168\" data-end=\"1186\">visual clarity<\/strong>. 
It is widely used in data science, AI pipelines, image compression, finance, healthcare, cybersecurity, and recommendation systems.<\/p>\n<p data-start=\"1321\" data-end=\"1588\"><strong data-start=\"1321\" data-end=\"1405\">\ud83d\udc49 To master PCA and full Machine Learning workflows, explore our courses below:<\/strong><br data-start=\"1405\" data-end=\"1408\" \/>\ud83d\udd17 <strong data-start=\"1411\" data-end=\"1429\">Internal Link:<\/strong>\u00a0<a href=\"https:\/\/uplatz.com\/course-details\/python-for-data-science\/792\">https:\/\/uplatz.com\/course-details\/python-for-data-science\/792<\/a><br data-start=\"1496\" data-end=\"1499\" \/>\ud83d\udd17 <strong data-start=\"1502\" data-end=\"1525\">Outbound Reference:<\/strong> <a class=\"decorated-link\" href=\"https:\/\/scikit-learn.org\/stable\/modules\/decomposition.html#pca\" target=\"_blank\" rel=\"noopener\" data-start=\"1526\" data-end=\"1588\">https:\/\/scikit-learn.org\/stable\/modules\/decomposition.html#pca<\/a><\/p>\n<hr data-start=\"1590\" data-end=\"1593\" \/>\n<h2 data-start=\"1595\" data-end=\"1648\"><strong data-start=\"1598\" data-end=\"1648\">1. What Is PCA (Principal Component Analysis)?<\/strong><\/h2>\n<p data-start=\"1650\" data-end=\"1846\">PCA is an <strong data-start=\"1660\" data-end=\"1695\">unsupervised learning technique<\/strong> used for <strong data-start=\"1705\" data-end=\"1733\">dimensionality reduction<\/strong>. 
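<\/p>
<p>As a minimal sketch with scikit-learn (the data and variable names here are synthetic, purely for illustration):<\/p>

```python
# Reduce 10 correlated features to 2 principal components (scikit-learn).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))            # 2 hidden factors
X = latent @ rng.normal(size=(2, 10))         # expanded into 10 features
X += 0.05 * rng.normal(size=X.shape)          # small measurement noise

X_scaled = StandardScaler().fit_transform(X)  # PCA is scale-sensitive
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X_scaled)

print(X.shape, "->", X_reduced.shape)         # (200, 10) -> (200, 2)
print(pca.explained_variance_ratio_.sum())    # most variance is retained
```

<p>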
It transforms a large set of variables into a smaller set that still contains most of the original information.<\/p>\n<p data-start=\"1848\" data-end=\"1864\">In simple words:<\/p>\n<blockquote data-start=\"1866\" data-end=\"1942\">\n<p data-start=\"1868\" data-end=\"1942\">PCA finds the most important directions in your data and removes the rest.<\/p>\n<\/blockquote>\n<p data-start=\"1944\" data-end=\"2001\">These new directions are called <strong data-start=\"1976\" data-end=\"2000\">principal components<\/strong>.<\/p>\n<p data-start=\"2003\" data-end=\"2028\">Each principal component:<\/p>\n<ul data-start=\"2029\" data-end=\"2135\">\n<li data-start=\"2029\" data-end=\"2070\">\n<p data-start=\"2031\" data-end=\"2070\">Is a linear combination of the original features<\/p>\n<\/li>\n<li data-start=\"2071\" data-end=\"2105\">\n<p data-start=\"2073\" data-end=\"2105\">Is uncorrelated with the others<\/p>\n<\/li>\n<li data-start=\"2106\" data-end=\"2135\">\n<p data-start=\"2108\" data-end=\"2135\">Captures maximum variance<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"2137\" data-end=\"2140\" \/>\n<h2 data-start=\"2142\" data-end=\"2193\"><strong data-start=\"2145\" data-end=\"2193\">2. 
Why Dimensionality Reduction Is Important<\/strong><\/h2>\n<p data-start=\"2195\" data-end=\"2249\">High-dimensional data causes several serious problems.<\/p>\n<hr data-start=\"2251\" data-end=\"2254\" \/>\n<h3 data-start=\"2256\" data-end=\"2295\"><strong data-start=\"2260\" data-end=\"2295\">2.1 The Curse of Dimensionality<\/strong><\/h3>\n<p data-start=\"2297\" data-end=\"2333\">As the number of features increases:<\/p>\n<ul data-start=\"2335\" data-end=\"2477\">\n<li data-start=\"2335\" data-end=\"2358\">\n<p data-start=\"2337\" data-end=\"2358\">Data becomes sparse<\/p>\n<\/li>\n<li data-start=\"2359\" data-end=\"2398\">\n<p data-start=\"2361\" data-end=\"2398\">Distance-based models lose accuracy<\/p>\n<\/li>\n<li data-start=\"2399\" data-end=\"2424\">\n<p data-start=\"2401\" data-end=\"2424\">Training becomes slow<\/p>\n<\/li>\n<li data-start=\"2425\" data-end=\"2451\">\n<p data-start=\"2427\" data-end=\"2451\">Memory usage increases<\/p>\n<\/li>\n<li data-start=\"2452\" data-end=\"2477\">\n<p data-start=\"2454\" data-end=\"2477\">Models overfit easily<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"2479\" data-end=\"2510\">PCA helps control this problem.<\/p>\n<hr data-start=\"2512\" data-end=\"2515\" \/>\n<h3 data-start=\"2517\" data-end=\"2550\"><strong data-start=\"2521\" data-end=\"2550\">2.2 Faster Model Training<\/strong><\/h3>\n<p data-start=\"2552\" data-end=\"2572\">With fewer features:<\/p>\n<ul data-start=\"2573\" data-end=\"2676\">\n<li data-start=\"2573\" data-end=\"2600\">\n<p data-start=\"2575\" data-end=\"2600\">Training becomes faster<\/p>\n<\/li>\n<li data-start=\"2601\" data-end=\"2630\">\n<p data-start=\"2603\" data-end=\"2630\">Prediction becomes faster<\/p>\n<\/li>\n<li data-start=\"2631\" data-end=\"2653\">\n<p data-start=\"2633\" data-end=\"2653\">Storage needs drop<\/p>\n<\/li>\n<li data-start=\"2654\" data-end=\"2676\">\n<p data-start=\"2656\" data-end=\"2676\">Cloud costs drop<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"2678\" 
data-end=\"2681\" \/>\n<h3 data-start=\"2683\" data-end=\"2715\"><strong data-start=\"2687\" data-end=\"2715\">2.3 Better Visualisation<\/strong><\/h3>\n<p data-start=\"2717\" data-end=\"2727\">Data with:<\/p>\n<ul data-start=\"2728\" data-end=\"2783\">\n<li data-start=\"2728\" data-end=\"2755\">\n<p data-start=\"2730\" data-end=\"2755\">2 dimensions \u2192 2D plots<\/p>\n<\/li>\n<li data-start=\"2756\" data-end=\"2783\">\n<p data-start=\"2758\" data-end=\"2783\">3 dimensions \u2192 3D plots<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"2785\" data-end=\"2886\">But real datasets may have 50+ features. PCA reduces them to 2 or 3 so humans can visualise patterns.<\/p>\n<hr data-start=\"2888\" data-end=\"2891\" \/>\n<h3 data-start=\"2893\" data-end=\"2918\"><strong data-start=\"2897\" data-end=\"2918\">2.4 Reduced Noise<\/strong><\/h3>\n<p data-start=\"2920\" data-end=\"2942\">Many features contain:<\/p>\n<ul data-start=\"2943\" data-end=\"3008\">\n<li data-start=\"2943\" data-end=\"2968\">\n<p data-start=\"2945\" data-end=\"2968\">Redundant information<\/p>\n<\/li>\n<li data-start=\"2969\" data-end=\"2991\">\n<p data-start=\"2971\" data-end=\"2991\">Measurement errors<\/p>\n<\/li>\n<li data-start=\"2992\" data-end=\"3008\">\n<p data-start=\"2994\" data-end=\"3008\">Random noise<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"3010\" data-end=\"3061\">PCA removes weak signals and keeps strong patterns.<\/p>\n<hr data-start=\"3063\" data-end=\"3066\" \/>\n<h2 data-start=\"3068\" data-end=\"3125\"><strong data-start=\"3071\" data-end=\"3125\">3. 
How PCA Works (Simple Step-by-Step Explanation)<\/strong><\/h2>\n<p data-start=\"3127\" data-end=\"3168\">PCA follows a clear mathematical process.<\/p>\n<hr data-start=\"3170\" data-end=\"3173\" \/>\n<h3 data-start=\"3175\" data-end=\"3211\"><strong data-start=\"3179\" data-end=\"3211\">Step 1: Standardise the Data<\/strong><\/h3>\n<p data-start=\"3212\" data-end=\"3265\">All features are scaled so that no feature dominates.<\/p>\n<hr data-start=\"3267\" data-end=\"3270\" \/>\n<h3 data-start=\"3272\" data-end=\"3317\"><strong data-start=\"3276\" data-end=\"3317\">Step 2: Compute the Covariance Matrix<\/strong><\/h3>\n<p data-start=\"3318\" data-end=\"3363\">This shows how features vary with each other.<\/p>\n<hr data-start=\"3365\" data-end=\"3368\" \/>\n<h3 data-start=\"3370\" data-end=\"3419\"><strong data-start=\"3374\" data-end=\"3419\">Step 3: Find Eigenvectors and Eigenvalues<\/strong><\/h3>\n<ul data-start=\"3420\" data-end=\"3532\">\n<li data-start=\"3420\" data-end=\"3473\">\n<p data-start=\"3422\" data-end=\"3473\"><strong data-start=\"3422\" data-end=\"3438\">Eigenvectors<\/strong> \u2192 Directions of maximum variance<\/p>\n<\/li>\n<li data-start=\"3474\" data-end=\"3532\">\n<p data-start=\"3476\" data-end=\"3532\"><strong data-start=\"3476\" data-end=\"3491\">Eigenvalues<\/strong> \u2192 Amount of variance in each direction<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"3534\" data-end=\"3537\" \/>\n<h3 data-start=\"3539\" data-end=\"3586\"><strong data-start=\"3543\" data-end=\"3586\">Step 4: Select Top Principal Components<\/strong><\/h3>\n<p data-start=\"3587\" data-end=\"3636\">Pick the components with the highest eigenvalues.<\/p>\n<hr data-start=\"3638\" data-end=\"3641\" \/>\n<h3 data-start=\"3643\" data-end=\"3677\"><strong data-start=\"3647\" data-end=\"3677\">Step 5: Transform the Data<\/strong><\/h3>\n<p data-start=\"3678\" data-end=\"3737\">Original features are projected onto the new reduced space.<\/p>\n<p data-start=\"3739\" 
data-end=\"3810\">The output is a smaller dataset that keeps the most useful information.<\/p>\n<hr data-start=\"3812\" data-end=\"3815\" \/>\n<h2 data-start=\"3817\" data-end=\"3857\"><strong data-start=\"3820\" data-end=\"3857\">4. What Are Principal Components?<\/strong><\/h2>\n<p data-start=\"3859\" data-end=\"3884\">Principal components are:<\/p>\n<ul data-start=\"3885\" data-end=\"4111\">\n<li data-start=\"3885\" data-end=\"3905\">\n<p data-start=\"3887\" data-end=\"3905\">New axes of data<\/p>\n<\/li>\n<li data-start=\"3906\" data-end=\"3950\">\n<p data-start=\"3908\" data-end=\"3950\">Linear combinations of original features<\/p>\n<\/li>\n<li data-start=\"3951\" data-end=\"3982\">\n<p data-start=\"3953\" data-end=\"3982\">Uncorrelated with each other<\/p>\n<\/li>\n<li data-start=\"3983\" data-end=\"4008\">\n<p data-start=\"3985\" data-end=\"4008\">Ordered by importance<\/p>\n<\/li>\n<li data-start=\"4010\" data-end=\"4050\">\n<p data-start=\"4012\" data-end=\"4050\"><strong data-start=\"4012\" data-end=\"4019\">PC1<\/strong> \u2192 Captures the most variance<\/p>\n<\/li>\n<li data-start=\"4051\" data-end=\"4098\">\n<p data-start=\"4053\" data-end=\"4098\"><strong data-start=\"4053\" data-end=\"4060\">PC2<\/strong> \u2192 Captures the second most variance<\/p>\n<\/li>\n<li data-start=\"4099\" data-end=\"4111\">\n<p data-start=\"4101\" data-end=\"4111\">And so on\u2026<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4113\" data-end=\"4150\">You keep only the top few components.<\/p>\n<hr data-start=\"4152\" data-end=\"4155\" \/>\n<h2 data-start=\"4157\" data-end=\"4199\"><strong data-start=\"4160\" data-end=\"4199\">5. 
How Much Data Does PCA Preserve?<\/strong><\/h2>\n<p data-start=\"4201\" data-end=\"4261\">PCA keeps information based on <strong data-start=\"4232\" data-end=\"4260\">explained variance ratio<\/strong>.<\/p>\n<p data-start=\"4263\" data-end=\"4271\">Example:<\/p>\n<ul data-start=\"4272\" data-end=\"4340\">\n<li data-start=\"4272\" data-end=\"4294\">\n<p data-start=\"4274\" data-end=\"4294\">PC1 \u2192 60% variance<\/p>\n<\/li>\n<li data-start=\"4295\" data-end=\"4317\">\n<p data-start=\"4297\" data-end=\"4317\">PC2 \u2192 25% variance<\/p>\n<\/li>\n<li data-start=\"4318\" data-end=\"4340\">\n<p data-start=\"4320\" data-end=\"4340\">PC3 \u2192 10% variance<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4342\" data-end=\"4351\">Together:<\/p>\n<ul data-start=\"4352\" data-end=\"4397\">\n<li data-start=\"4352\" data-end=\"4397\">\n<p data-start=\"4354\" data-end=\"4397\">First 3 PCs = 95% of original information<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4399\" data-end=\"4410\">This means:<\/p>\n<ul data-start=\"4411\" data-end=\"4479\">\n<li data-start=\"4411\" data-end=\"4446\">\n<p data-start=\"4413\" data-end=\"4446\">You reduced 100 features to 3<\/p>\n<\/li>\n<li data-start=\"4447\" data-end=\"4479\">\n<p data-start=\"4449\" data-end=\"4479\">You still kept 95% of the information<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"4481\" data-end=\"4484\" \/>\n<h2 data-start=\"4486\" data-end=\"4526\"><strong data-start=\"4489\" data-end=\"4526\">6. Where PCA Is Used in Real Life<\/strong><\/h2>\n<hr data-start=\"4528\" data-end=\"4531\" \/>\n<h3 data-start=\"4533\" data-end=\"4562\"><strong data-start=\"4537\" data-end=\"4562\">6.1 Image Compression<\/strong><\/h3>\n<p data-start=\"4564\" data-end=\"4645\">Images contain thousands of pixels. 
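<\/p>
<p>As a rough sketch of the idea (synthetic pixel data, assuming scikit-learn; a real pipeline would load an actual image), treating each row of a greyscale image as a sample lets PCA store far fewer numbers:<\/p>

```python
# PCA-style image compression sketch: keep 16 of 128 row components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Synthetic 128x128 "image" with low-rank structure plus a little noise
image = rng.normal(size=(128, 8)) @ rng.normal(size=(8, 128))
image += 0.01 * rng.normal(size=(128, 128))

pca = PCA(n_components=16)
compressed = pca.fit_transform(image)         # (128, 16) scores
restored = pca.inverse_transform(compressed)  # (128, 128) approximation

# Storage needed: scores + component vectors + the per-column mean
stored = compressed.size + pca.components_.size + pca.mean_.size
print("compression ratio:", image.size / stored)
```

<p>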
PCA reduces image size while keeping quality.<\/p>\n<p data-start=\"4647\" data-end=\"4655\">Used in:<\/p>\n<ul data-start=\"4656\" data-end=\"4716\">\n<li data-start=\"4656\" data-end=\"4676\">\n<p data-start=\"4658\" data-end=\"4676\">Face recognition<\/p>\n<\/li>\n<li data-start=\"4677\" data-end=\"4694\">\n<p data-start=\"4679\" data-end=\"4694\">Image storage<\/p>\n<\/li>\n<li data-start=\"4695\" data-end=\"4716\">\n<p data-start=\"4697\" data-end=\"4716\">Video compression<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"4718\" data-end=\"4721\" \/>\n<h3 data-start=\"4723\" data-end=\"4753\"><strong data-start=\"4727\" data-end=\"4753\">6.2 Data Visualisation<\/strong><\/h3>\n<p data-start=\"4755\" data-end=\"4768\">PCA converts:<\/p>\n<ul data-start=\"4769\" data-end=\"4853\">\n<li data-start=\"4769\" data-end=\"4804\">\n<p data-start=\"4771\" data-end=\"4804\">High-dimensional financial data<\/p>\n<\/li>\n<li data-start=\"4805\" data-end=\"4825\">\n<p data-start=\"4807\" data-end=\"4825\">Medical datasets<\/p>\n<\/li>\n<li data-start=\"4826\" data-end=\"4853\">\n<p data-start=\"4828\" data-end=\"4853\">Customer behaviour data<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4855\" data-end=\"4882\">Into clear 2D and 3D plots.<\/p>\n<hr data-start=\"4884\" data-end=\"4887\" \/>\n<h3 data-start=\"4889\" data-end=\"4916\"><strong data-start=\"4893\" data-end=\"4916\">6.3 Noise Reduction<\/strong><\/h3>\n<p data-start=\"4918\" data-end=\"5004\">Sensors and signals contain noise. 
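<\/p>
<p>A small sketch of the effect (synthetic sensor readings, assuming scikit-learn): projecting onto a few components and back strips most of the noise:<\/p>

```python
# PCA denoising sketch: 3 true factors behind 20 noisy sensor channels.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
clean = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 20))  # true signal
noisy = clean + 0.3 * rng.normal(size=clean.shape)            # sensor noise

pca = PCA(n_components=3).fit(noisy)
denoised = pca.inverse_transform(pca.transform(noisy))

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
print(mse_noisy, "->", mse_denoised)  # the error drops noticeably
```

<p>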
PCA filters weak signals and keeps strong patterns.<\/p>\n<p data-start=\"5006\" data-end=\"5014\">Used in:<\/p>\n<ul data-start=\"5015\" data-end=\"5072\">\n<li data-start=\"5015\" data-end=\"5034\">\n<p data-start=\"5017\" data-end=\"5034\">Medical sensors<\/p>\n<\/li>\n<li data-start=\"5035\" data-end=\"5050\">\n<p data-start=\"5037\" data-end=\"5050\">IoT devices<\/p>\n<\/li>\n<li data-start=\"5051\" data-end=\"5072\">\n<p data-start=\"5053\" data-end=\"5072\">Satellite imagery<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"5074\" data-end=\"5077\" \/>\n<h3 data-start=\"5079\" data-end=\"5129\"><strong data-start=\"5083\" data-end=\"5129\">6.4 Feature Reduction for Machine Learning<\/strong><\/h3>\n<p data-start=\"5131\" data-end=\"5147\">Before training:<\/p>\n<ul data-start=\"5148\" data-end=\"5207\">\n<li data-start=\"5148\" data-end=\"5155\">\n<p data-start=\"5150\" data-end=\"5155\">SVM<\/p>\n<\/li>\n<li data-start=\"5156\" data-end=\"5163\">\n<p data-start=\"5158\" data-end=\"5163\">KNN<\/p>\n<\/li>\n<li data-start=\"5164\" data-end=\"5187\">\n<p data-start=\"5166\" data-end=\"5187\">Logistic Regression<\/p>\n<\/li>\n<li data-start=\"5188\" data-end=\"5207\">\n<p data-start=\"5190\" data-end=\"5207\">Neural Networks<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"5209\" data-end=\"5265\">PCA reduces feature count to improve speed and accuracy.<\/p>\n<hr data-start=\"5267\" data-end=\"5270\" \/>\n<h3 data-start=\"5272\" data-end=\"5309\"><strong data-start=\"5276\" data-end=\"5309\">6.5 Finance and Risk Modeling<\/strong><\/h3>\n<p data-start=\"5311\" data-end=\"5329\">Banks use PCA for:<\/p>\n<ul data-start=\"5330\" data-end=\"5414\">\n<li data-start=\"5330\" data-end=\"5356\">\n<p data-start=\"5332\" data-end=\"5356\">Portfolio optimisation<\/p>\n<\/li>\n<li data-start=\"5357\" data-end=\"5383\">\n<p data-start=\"5359\" data-end=\"5383\">Risk factor clustering<\/p>\n<\/li>\n<li data-start=\"5384\" data-end=\"5414\">\n<p data-start=\"5386\" data-end=\"5414\">Market 
volatility analysis<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"5416\" data-end=\"5419\" \/>\n<h2 data-start=\"5421\" data-end=\"5448\"><strong data-start=\"5424\" data-end=\"5448\">7. Advantages of PCA<\/strong><\/h2>\n<p data-start=\"5450\" data-end=\"5633\">\u2705 Reduces dataset size<br data-start=\"5472\" data-end=\"5475\" \/>\u2705 Improves training speed<br data-start=\"5500\" data-end=\"5503\" \/>\u2705 Lowers storage cost<br data-start=\"5524\" data-end=\"5527\" \/>\u2705 Reduces noise<br data-start=\"5542\" data-end=\"5545\" \/>\u2705 Improves visualisation<br data-start=\"5569\" data-end=\"5572\" \/>\u2705 Helps fight overfitting<br data-start=\"5597\" data-end=\"5600\" \/>\u2705 Works with most ML algorithms<\/p>\n<hr data-start=\"5635\" data-end=\"5638\" \/>\n<h2 data-start=\"5640\" data-end=\"5668\"><strong data-start=\"5643\" data-end=\"5668\">8. Limitations of PCA<\/strong><\/h2>\n<p data-start=\"5670\" data-end=\"5871\">\u274c PCA removes feature meaning<br data-start=\"5699\" data-end=\"5702\" \/>\u274c Components are hard to interpret<br data-start=\"5736\" data-end=\"5739\" \/>\u274c Works only with numeric features<br data-start=\"5773\" data-end=\"5776\" \/>\u274c Linear transformation only<br data-start=\"5804\" data-end=\"5807\" \/>\u274c Sensitive to scaling<br data-start=\"5829\" data-end=\"5832\" \/>\u274c May remove small but useful signals<\/p>\n<hr data-start=\"5873\" data-end=\"5876\" \/>\n<h2 data-start=\"5878\" data-end=\"5912\"><strong data-start=\"5881\" data-end=\"5912\">9. 
PCA vs Feature Selection<\/strong><\/h2>\n<div class=\"_tableContainer_1rjym_1\">\n<div class=\"group _tableWrapper_1rjym_13 flex w-fit flex-col-reverse\" tabindex=\"-1\">\n<table class=\"w-fit min-w-(--thread-content-width)\" data-start=\"5914\" data-end=\"6236\">\n<thead data-start=\"5914\" data-end=\"5951\">\n<tr data-start=\"5914\" data-end=\"5951\">\n<th data-start=\"5914\" data-end=\"5924\" data-col-size=\"sm\">Feature<\/th>\n<th data-start=\"5924\" data-end=\"5930\" data-col-size=\"sm\">PCA<\/th>\n<th data-start=\"5930\" data-end=\"5951\" data-col-size=\"sm\">Feature Selection<\/th>\n<\/tr>\n<\/thead>\n<tbody data-start=\"5988\" data-end=\"6236\">\n<tr data-start=\"5988\" data-end=\"6041\">\n<td data-start=\"5988\" data-end=\"5997\" data-col-size=\"sm\">Method<\/td>\n<td data-start=\"5997\" data-end=\"6022\" data-col-size=\"sm\">Feature transformation<\/td>\n<td data-start=\"6022\" data-end=\"6041\" data-col-size=\"sm\">Feature removal<\/td>\n<\/tr>\n<tr data-start=\"6042\" data-end=\"6075\">\n<td data-start=\"6042\" data-end=\"6061\" data-col-size=\"sm\">Interpretability<\/td>\n<td data-start=\"6061\" data-end=\"6067\" data-col-size=\"sm\">Low<\/td>\n<td data-start=\"6067\" data-end=\"6075\" data-col-size=\"sm\">High<\/td>\n<\/tr>\n<tr data-start=\"6076\" data-end=\"6113\">\n<td data-start=\"6076\" data-end=\"6094\" data-col-size=\"sm\">Noise reduction<\/td>\n<td data-start=\"6094\" data-end=\"6103\" data-col-size=\"sm\">Strong<\/td>\n<td data-start=\"6103\" data-end=\"6113\" data-col-size=\"sm\">Medium<\/td>\n<\/tr>\n<tr data-start=\"6114\" data-end=\"6152\">\n<td data-start=\"6114\" data-end=\"6130\" data-col-size=\"sm\">Visualisation<\/td>\n<td data-start=\"6130\" data-end=\"6144\" data-col-size=\"sm\">Very strong<\/td>\n<td data-start=\"6144\" data-end=\"6152\" data-col-size=\"sm\">Weak<\/td>\n<\/tr>\n<tr data-start=\"6153\" data-end=\"6189\">\n<td data-start=\"6153\" data-end=\"6165\" data-col-size=\"sm\">Data loss<\/td>\n<td data-start=\"6165\" 
data-end=\"6178\" data-col-size=\"sm\">Controlled<\/td>\n<td data-start=\"6178\" data-end=\"6189\" data-col-size=\"sm\">Depends<\/td>\n<\/tr>\n<tr data-start=\"6190\" data-end=\"6236\">\n<td data-start=\"6190\" data-end=\"6201\" data-col-size=\"sm\">Best for<\/td>\n<td data-start=\"6201\" data-end=\"6218\" data-col-size=\"sm\">Large datasets<\/td>\n<td data-start=\"6218\" data-end=\"6236\" data-col-size=\"sm\">Small datasets<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p data-start=\"6238\" data-end=\"6284\">Both techniques are important in ML pipelines.<\/p>\n<hr data-start=\"6286\" data-end=\"6289\" \/>\n<h2 data-start=\"6291\" data-end=\"6343\"><strong data-start=\"6294\" data-end=\"6343\">10. PCA vs LDA (Linear Discriminant Analysis)<\/strong><\/h2>\n<div class=\"_tableContainer_1rjym_1\">\n<div class=\"group _tableWrapper_1rjym_13 flex w-fit flex-col-reverse\" tabindex=\"-1\">\n<table class=\"w-fit min-w-(--thread-content-width)\" data-start=\"6345\" data-end=\"6571\">\n<thead data-start=\"6345\" data-end=\"6368\">\n<tr data-start=\"6345\" data-end=\"6368\">\n<th data-start=\"6345\" data-end=\"6355\" data-col-size=\"sm\">Feature<\/th>\n<th data-start=\"6355\" data-end=\"6361\" data-col-size=\"sm\">PCA<\/th>\n<th data-start=\"6361\" data-end=\"6368\" data-col-size=\"sm\">LDA<\/th>\n<\/tr>\n<\/thead>\n<tbody data-start=\"6392\" data-end=\"6571\">\n<tr data-start=\"6392\" data-end=\"6428\">\n<td data-start=\"6392\" data-end=\"6399\" data-col-size=\"sm\">Type<\/td>\n<td data-start=\"6399\" data-end=\"6414\" data-col-size=\"sm\">Unsupervised<\/td>\n<td data-start=\"6414\" data-end=\"6428\" data-col-size=\"sm\">Supervised<\/td>\n<\/tr>\n<tr data-start=\"6429\" data-end=\"6455\">\n<td data-start=\"6429\" data-end=\"6443\" data-col-size=\"sm\">Uses labels<\/td>\n<td data-start=\"6443\" data-end=\"6448\" data-col-size=\"sm\">No<\/td>\n<td data-start=\"6448\" data-end=\"6455\" data-col-size=\"sm\">Yes<\/td>\n<\/tr>\n<tr data-start=\"6456\" 
data-end=\"6512\">\n<td data-start=\"6456\" data-end=\"6463\" data-col-size=\"sm\">Goal<\/td>\n<td data-start=\"6463\" data-end=\"6483\" data-col-size=\"sm\">Maximise variance<\/td>\n<td data-start=\"6483\" data-end=\"6512\" data-col-size=\"sm\">Maximise class separation<\/td>\n<\/tr>\n<tr data-start=\"6513\" data-end=\"6571\">\n<td data-start=\"6513\" data-end=\"6524\" data-col-size=\"sm\">Use case<\/td>\n<td data-start=\"6524\" data-end=\"6553\" data-col-size=\"sm\">Visualisation, compression<\/td>\n<td data-start=\"6553\" data-end=\"6571\" data-col-size=\"sm\">Classification<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<hr data-start=\"6573\" data-end=\"6576\" \/>\n<h2 data-start=\"6578\" data-end=\"6625\"><strong data-start=\"6581\" data-end=\"6625\">11. How Many Components Should You Keep?<\/strong><\/h2>\n<p data-start=\"6627\" data-end=\"6631\">Use:<\/p>\n<p data-start=\"6633\" data-end=\"6738\">\u2705 <strong data-start=\"6635\" data-end=\"6662\">Explained variance plot<\/strong><br data-start=\"6662\" data-end=\"6665\" \/>\u2705 <strong data-start=\"6667\" data-end=\"6691\">Elbow method for PCA<\/strong><br data-start=\"6691\" data-end=\"6694\" \/>\u2705 <strong data-start=\"6696\" data-end=\"6738\">Cumulative variance threshold (90\u201395%)<\/strong><\/p>\n<p data-start=\"6740\" data-end=\"6754\">Best practice:<\/p>\n<ul data-start=\"6755\" data-end=\"6812\">\n<li data-start=\"6755\" data-end=\"6812\">\n<p data-start=\"6757\" data-end=\"6812\">Keep components that preserve at least <strong data-start=\"6796\" data-end=\"6812\">90% variance<\/strong><\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"6814\" data-end=\"6817\" \/>\n<h2 data-start=\"6819\" data-end=\"6861\"><strong data-start=\"6822\" data-end=\"6861\">12. 
PCA and Machine Learning Models<\/strong><\/h2>\n<p data-start=\"6863\" data-end=\"6892\">PCA improves many algorithms:<\/p>\n<hr data-start=\"6894\" data-end=\"6897\" \/>\n<h3 data-start=\"6899\" data-end=\"6915\"><strong data-start=\"6903\" data-end=\"6915\">With KNN<\/strong><\/h3>\n<ul data-start=\"6916\" data-end=\"6987\">\n<li data-start=\"6916\" data-end=\"6950\">\n<p data-start=\"6918\" data-end=\"6950\">Speeds up distance computation<\/p>\n<\/li>\n<li data-start=\"6951\" data-end=\"6987\">\n<p data-start=\"6953\" data-end=\"6987\">Improves classification accuracy<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"6989\" data-end=\"6992\" \/>\n<h3 data-start=\"6994\" data-end=\"7010\"><strong data-start=\"6998\" data-end=\"7010\">With SVM<\/strong><\/h3>\n<ul data-start=\"7011\" data-end=\"7073\">\n<li data-start=\"7011\" data-end=\"7041\">\n<p data-start=\"7013\" data-end=\"7041\">Reduces computational load<\/p>\n<\/li>\n<li data-start=\"7042\" data-end=\"7073\">\n<p data-start=\"7044\" data-end=\"7073\">Makes kernel methods faster<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"7075\" data-end=\"7078\" \/>\n<h3 data-start=\"7080\" data-end=\"7112\"><strong data-start=\"7084\" data-end=\"7112\">With Logistic Regression<\/strong><\/h3>\n<ul data-start=\"7113\" data-end=\"7173\">\n<li data-start=\"7113\" data-end=\"7144\">\n<p data-start=\"7115\" data-end=\"7144\">Removes correlated features<\/p>\n<\/li>\n<li data-start=\"7145\" data-end=\"7173\">\n<p data-start=\"7147\" data-end=\"7173\">Improves model stability<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"7175\" data-end=\"7178\" \/>\n<h3 data-start=\"7180\" data-end=\"7208\"><strong data-start=\"7184\" data-end=\"7208\">With Neural Networks<\/strong><\/h3>\n<ul data-start=\"7209\" data-end=\"7259\">\n<li data-start=\"7209\" data-end=\"7234\">\n<p data-start=\"7211\" data-end=\"7234\">Reduces training time<\/p>\n<\/li>\n<li data-start=\"7235\" data-end=\"7259\">\n<p data-start=\"7237\" data-end=\"7259\">Improves 
convergence<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"7261\" data-end=\"7264\" \/>\n<h2 data-start=\"7266\" data-end=\"7298\"><strong data-start=\"7269\" data-end=\"7298\">13. Practical PCA Example<\/strong><\/h2>\n<h3 data-start=\"7300\" data-end=\"7334\"><strong data-start=\"7304\" data-end=\"7334\">Customer Behaviour Dataset<\/strong><\/h3>\n<p data-start=\"7336\" data-end=\"7354\">Original features:<\/p>\n<ul data-start=\"7355\" data-end=\"7459\">\n<li data-start=\"7355\" data-end=\"7365\">\n<p data-start=\"7357\" data-end=\"7365\">Income<\/p>\n<\/li>\n<li data-start=\"7366\" data-end=\"7373\">\n<p data-start=\"7368\" data-end=\"7373\">Age<\/p>\n<\/li>\n<li data-start=\"7374\" data-end=\"7393\">\n<p data-start=\"7376\" data-end=\"7393\">Visit frequency<\/p>\n<\/li>\n<li data-start=\"7394\" data-end=\"7414\">\n<p data-start=\"7396\" data-end=\"7414\">Purchase history<\/p>\n<\/li>\n<li data-start=\"7415\" data-end=\"7432\">\n<p data-start=\"7417\" data-end=\"7432\">Browsing time<\/p>\n<\/li>\n<li data-start=\"7433\" data-end=\"7459\">\n<p data-start=\"7435\" data-end=\"7459\">Product category count<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"7461\" data-end=\"7471\">After PCA:<\/p>\n<ul data-start=\"7472\" data-end=\"7570\">\n<li data-start=\"7472\" data-end=\"7499\">\n<p data-start=\"7474\" data-end=\"7499\">Reduced to 2 components<\/p>\n<\/li>\n<li data-start=\"7500\" data-end=\"7535\">\n<p data-start=\"7502\" data-end=\"7535\">Visualised in a 2D scatter plot<\/p>\n<\/li>\n<li data-start=\"7536\" data-end=\"7570\">\n<p data-start=\"7538\" data-end=\"7570\">Clear customer clusters appear<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"7572\" data-end=\"7619\">Marketing teams use this insight for targeting.<\/p>\n<hr data-start=\"7621\" data-end=\"7624\" \/>\n<h2 data-start=\"7626\" data-end=\"7665\"><strong data-start=\"7629\" data-end=\"7665\">14. 
PCA in High-Dimensional Data<\/strong><\/h2>\n<p data-start=\"7667\" data-end=\"7700\">High-dimensional data appears in:<\/p>\n<ul data-start=\"7701\" data-end=\"7795\">\n<li data-start=\"7701\" data-end=\"7713\">\n<p data-start=\"7703\" data-end=\"7713\">Genomics<\/p>\n<\/li>\n<li data-start=\"7714\" data-end=\"7734\">\n<p data-start=\"7716\" data-end=\"7734\">Satellite images<\/p>\n<\/li>\n<li data-start=\"7735\" data-end=\"7753\">\n<p data-start=\"7737\" data-end=\"7753\">NLP embeddings<\/p>\n<\/li>\n<li data-start=\"7754\" data-end=\"7773\">\n<p data-start=\"7756\" data-end=\"7773\">Sensor networks<\/p>\n<\/li>\n<li data-start=\"7774\" data-end=\"7795\">\n<p data-start=\"7776\" data-end=\"7795\">Financial markets<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"7797\" data-end=\"7825\">PCA reduces dimensions from:<\/p>\n<ul data-start=\"7826\" data-end=\"7857\">\n<li data-start=\"7826\" data-end=\"7840\">\n<p data-start=\"7828\" data-end=\"7840\">1,000 \u2192 50<\/p>\n<\/li>\n<li data-start=\"7841\" data-end=\"7857\">\n<p data-start=\"7843\" data-end=\"7857\">10,000 \u2192 100<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"7859\" data-end=\"7893\">This is what makes large-scale AI processing tractable.<\/p>\n<hr data-start=\"7895\" data-end=\"7898\" \/>\n<h2 data-start=\"7900\" data-end=\"7938\"><strong data-start=\"7903\" data-end=\"7938\">15. 
Tools Used to Implement PCA<\/strong><\/h2>\n<p data-start=\"7940\" data-end=\"8038\">The most widely used PCA implementation is available in <strong data-start=\"7996\" data-end=\"8037\">scikit-learn<\/strong>.<\/p>\n<p data-start=\"8040\" data-end=\"8052\">It provides:<\/p>\n<ul data-start=\"8053\" data-end=\"8134\">\n<li data-start=\"8053\" data-end=\"8065\">\n<p data-start=\"8055\" data-end=\"8065\">Fast PCA<\/p>\n<\/li>\n<li data-start=\"8066\" data-end=\"8085\">\n<p data-start=\"8068\" data-end=\"8085\">Incremental PCA<\/p>\n<\/li>\n<li data-start=\"8086\" data-end=\"8104\">\n<p data-start=\"8088\" data-end=\"8104\">Randomised PCA<\/p>\n<\/li>\n<li data-start=\"8105\" data-end=\"8134\">\n<p data-start=\"8107\" data-end=\"8134\">Easy pipeline integration<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"8136\" data-end=\"8139\" \/>\n<h2 data-start=\"8141\" data-end=\"8176\"><strong data-start=\"8144\" data-end=\"8176\">16. 
When Should You Use PCA?<\/strong><\/h2>\n<p data-start=\"8178\" data-end=\"8193\">\u2705 Use PCA when:<\/p>\n<ul data-start=\"8194\" data-end=\"8358\">\n<li data-start=\"8194\" data-end=\"8228\">\n<p data-start=\"8196\" data-end=\"8228\">You have many numeric features<\/p>\n<\/li>\n<li data-start=\"8229\" data-end=\"8246\">\n<p data-start=\"8231\" data-end=\"8246\">Data is noisy<\/p>\n<\/li>\n<li data-start=\"8247\" data-end=\"8273\">\n<p data-start=\"8249\" data-end=\"8273\">You want faster models<\/p>\n<\/li>\n<li data-start=\"8274\" data-end=\"8309\">\n<p data-start=\"8276\" data-end=\"8309\">You need 2D or 3D visualisation<\/p>\n<\/li>\n<li data-start=\"8310\" data-end=\"8335\">\n<p data-start=\"8312\" data-end=\"8335\">Models overfit easily<\/p>\n<\/li>\n<li data-start=\"8336\" data-end=\"8358\">\n<p data-start=\"8338\" data-end=\"8358\">You use KNN or SVM<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"8360\" data-end=\"8363\" \/>\n<h2 data-start=\"8365\" data-end=\"8402\"><strong data-start=\"8368\" data-end=\"8402\">17. When Should You Avoid PCA?<\/strong><\/h2>\n<p data-start=\"8404\" data-end=\"8421\">\u274c Avoid PCA when:<\/p>\n<ul data-start=\"8422\" data-end=\"8579\">\n<li data-start=\"8422\" data-end=\"8453\">\n<p data-start=\"8424\" data-end=\"8453\">Feature meaning is critical<\/p>\n<\/li>\n<li data-start=\"8454\" data-end=\"8477\">\n<p data-start=\"8456\" data-end=\"8477\">Data is categorical<\/p>\n<\/li>\n<li data-start=\"8478\" data-end=\"8506\">\n<p data-start=\"8480\" data-end=\"8506\">Dataset is already small<\/p>\n<\/li>\n<li data-start=\"8507\" data-end=\"8542\">\n<p data-start=\"8509\" data-end=\"8542\">You require full explainability<\/p>\n<\/li>\n<li data-start=\"8543\" data-end=\"8579\">\n<p data-start=\"8545\" data-end=\"8579\">Features are already independent<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"8581\" data-end=\"8584\" \/>\n<h2 data-start=\"8586\" data-end=\"8622\"><strong data-start=\"8589\" data-end=\"8622\">18. 
PCA in Production Systems<\/strong><\/h2>\n<p data-start=\"8624\" data-end=\"8632\">Used in:<\/p>\n<ul data-start=\"8633\" data-end=\"8772\">\n<li data-start=\"8633\" data-end=\"8662\">\n<p data-start=\"8635\" data-end=\"8662\">Fraud detection pipelines<\/p>\n<\/li>\n<li data-start=\"8663\" data-end=\"8691\">\n<p data-start=\"8665\" data-end=\"8691\">Face recognition systems<\/p>\n<\/li>\n<li data-start=\"8692\" data-end=\"8716\">\n<p data-start=\"8694\" data-end=\"8716\">Credit scoring tools<\/p>\n<\/li>\n<li data-start=\"8717\" data-end=\"8743\">\n<p data-start=\"8719\" data-end=\"8743\">Recommendation engines<\/p>\n<\/li>\n<li data-start=\"8744\" data-end=\"8772\">\n<p data-start=\"8746\" data-end=\"8772\">Cybersecurity monitoring<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"8774\" data-end=\"8786\">It improves:<\/p>\n<ul data-start=\"8787\" data-end=\"8843\">\n<li data-start=\"8787\" data-end=\"8796\">\n<p data-start=\"8789\" data-end=\"8796\">Speed<\/p>\n<\/li>\n<li data-start=\"8797\" data-end=\"8809\">\n<p data-start=\"8799\" data-end=\"8809\">Accuracy<\/p>\n<\/li>\n<li data-start=\"8810\" data-end=\"8823\">\n<p data-start=\"8812\" data-end=\"8823\">Stability<\/p>\n<\/li>\n<li data-start=\"8824\" data-end=\"8843\">\n<p data-start=\"8826\" data-end=\"8843\">Cost efficiency<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"8845\" data-end=\"8848\" \/>\n<h2 data-start=\"8850\" data-end=\"8883\"><strong data-start=\"8853\" data-end=\"8883\">19. 
Business Impact of PCA<\/strong><\/h2>\n<p data-start=\"8885\" data-end=\"8906\">PCA helps businesses:<\/p>\n<ul data-start=\"8907\" data-end=\"9090\">\n<li data-start=\"8907\" data-end=\"8937\">\n<p data-start=\"8909\" data-end=\"8937\">Reduce infrastructure cost<\/p>\n<\/li>\n<li data-start=\"8938\" data-end=\"8963\">\n<p data-start=\"8940\" data-end=\"8963\">Speed up AI pipelines<\/p>\n<\/li>\n<li data-start=\"8964\" data-end=\"8994\">\n<p data-start=\"8966\" data-end=\"8994\">Improve prediction quality<\/p>\n<\/li>\n<li data-start=\"8995\" data-end=\"9026\">\n<p data-start=\"8997\" data-end=\"9026\">Visualise customer segments<\/p>\n<\/li>\n<li data-start=\"9027\" data-end=\"9057\">\n<p data-start=\"9029\" data-end=\"9057\">Improve security detection<\/p>\n<\/li>\n<li data-start=\"9058\" data-end=\"9090\">\n<p data-start=\"9060\" data-end=\"9090\">Optimise financial modelling<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"9092\" data-end=\"9139\">It increases <strong data-start=\"9105\" data-end=\"9138\">AI efficiency at lower cost<\/strong>.<\/p>\n<hr data-start=\"9141\" data-end=\"9144\" \/>\n<h1 data-start=\"9146\" data-end=\"9162\"><strong data-start=\"9148\" data-end=\"9162\">Conclusion<\/strong><\/h1>\n<p data-start=\"9164\" data-end=\"9461\">PCA is one of the most widely used tools in modern machine learning. It reduces dimensionality while preserving the most important information, improving model speed, accuracy, and visualisation at the same time. 
It also helps fight the curse of dimensionality and reduces noise in real-world datasets.<\/p>\n<p data-start=\"9463\" data-end=\"9600\">From finance and healthcare to cybersecurity and image processing, PCA remains a foundational technique every data scientist must master.<\/p>\n<hr data-start=\"9602\" data-end=\"9605\" \/>\n<h1 data-start=\"9607\" data-end=\"9627\"><strong data-start=\"9609\" data-end=\"9627\">Call to Action<\/strong><\/h1>\n<p data-start=\"9629\" data-end=\"9807\"><strong data-start=\"9629\" data-end=\"9764\">Want to master PCA, dimensionality reduction, and advanced ML pipelines?<br data-start=\"9703\" data-end=\"9706\" \/>Explore our full AI &amp; Data Science course library below:<\/strong><br data-start=\"9764\" data-end=\"9767\" \/><a href=\"https:\/\/uplatz.com\/online-courses?global-search=data+science\">https:\/\/uplatz.com\/online-courses?global-search=data+science<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>PCA (Dimensionality Reduction): A Complete Practical Guide Modern machine learning works with large datasets that may contain hundreds or even thousands of features. 
While more data can improve predictions, too <span class=\"readmore\"><a href=\"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/\">Read More &#8230;<\/a><\/span><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[170],"tags":[],"class_list":["post-7765","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>PCA (Dimensionality Reduction) Explained | Uplatz Blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"PCA (Dimensionality Reduction) Explained | Uplatz Blog\" \/>\n<meta property=\"og:description\" content=\"PCA (Dimensionality Reduction): A Complete Practical Guide Modern machine learning works with large datasets that may contain hundreds or even thousands of features. 
While more data can improve predictions, too Read More ...\" \/>\n<meta property=\"og:url\" content=\"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/\" \/>\n<meta property=\"og:site_name\" content=\"Uplatz Blog\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/Uplatz-1077816825610769\/\" \/>\n<meta property=\"article:published_time\" content=\"2025-11-26T18:36:57+00:00\" \/>\n<meta name=\"author\" content=\"uplatzblog\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@uplatz_global\" \/>\n<meta name=\"twitter:site\" content=\"@uplatz_global\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"uplatzblog\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/\"},\"author\":{\"name\":\"uplatzblog\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#\\\/schema\\\/person\\\/8ecae69a21d0757bdb2f776e67d2645e\"},\"headline\":\"PCA (Dimensionality Reduction) Explained\",\"datePublished\":\"2025-11-26T18:36:57+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/\"},\"wordCount\":1076,\"publisher\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#organization\"},\"articleSection\":[\"Artificial 
Intelligence\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/\",\"url\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/\",\"name\":\"PCA (Dimensionality Reduction) Explained | Uplatz Blog\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#website\"},\"datePublished\":\"2025-11-26T18:36:57+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/pca-dimensionality-reduction-explained\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"PCA (Dimensionality Reduction) Explained\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#website\",\"url\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/\",\"name\":\"Uplatz Blog\",\"description\":\"Uplatz is a global IT Training &amp; Consulting 
company\",\"publisher\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#organization\",\"name\":\"uplatz.com\",\"url\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/wp-content\\\/uploads\\\/2016\\\/11\\\/Uplatz-Logo-Copy-2.png\",\"contentUrl\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/wp-content\\\/uploads\\\/2016\\\/11\\\/Uplatz-Logo-Copy-2.png\",\"width\":1280,\"height\":800,\"caption\":\"uplatz.com\"},\"image\":{\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/Uplatz-1077816825610769\\\/\",\"https:\\\/\\\/x.com\\\/uplatz_global\",\"https:\\\/\\\/www.instagram.com\\\/\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/7956715?trk=tyah&amp;amp;amp;amp;trkInfo=clickedVertical:company,clickedEntityId:7956715,idx:1-1-1,tarId:1464353969447,tas:uplatz\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/uplatz.com\\\/blog\\\/#\\\/schema\\\/person\\\/8ecae69a21d0757bdb2f776e67d2645e\",\"name\":\"uplatzblog\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/7f814c72279199f59ded4418a8653ad15f5f8904ac75e025a4e2abe24d58fa5d?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/7f814c72279199f59ded4418a8653ad15f5f8904ac75e025a4e2abe24d58fa5d?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/7f814c72279199f59ded4
418a8653ad15f5f8904ac75e025a4e2abe24d58fa5d?s=96&d=mm&r=g\",\"caption\":\"uplatzblog\"}}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"PCA (Dimensionality Reduction) Explained | Uplatz Blog","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/","og_locale":"en_US","og_type":"article","og_title":"PCA (Dimensionality Reduction) Explained | Uplatz Blog","og_description":"PCA (Dimensionality Reduction): A Complete Practical Guide Modern machine learning works with large datasets that may contain hundreds or even thousands of features. While more data can improve predictions, too Read More ...","og_url":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/","og_site_name":"Uplatz Blog","article_publisher":"https:\/\/www.facebook.com\/Uplatz-1077816825610769\/","article_published_time":"2025-11-26T18:36:57+00:00","author":"uplatzblog","twitter_card":"summary_large_image","twitter_creator":"@uplatz_global","twitter_site":"@uplatz_global","twitter_misc":{"Written by":"uplatzblog","Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/#article","isPartOf":{"@id":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/"},"author":{"name":"uplatzblog","@id":"https:\/\/uplatz.com\/blog\/#\/schema\/person\/8ecae69a21d0757bdb2f776e67d2645e"},"headline":"PCA (Dimensionality Reduction) Explained","datePublished":"2025-11-26T18:36:57+00:00","mainEntityOfPage":{"@id":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/"},"wordCount":1076,"publisher":{"@id":"https:\/\/uplatz.com\/blog\/#organization"},"articleSection":["Artificial Intelligence"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/","url":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/","name":"PCA (Dimensionality Reduction) Explained | Uplatz Blog","isPartOf":{"@id":"https:\/\/uplatz.com\/blog\/#website"},"datePublished":"2025-11-26T18:36:57+00:00","breadcrumb":{"@id":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/uplatz.com\/blog\/pca-dimensionality-reduction-explained\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/uplatz.com\/blog\/"},{"@type":"ListItem","position":2,"name":"PCA (Dimensionality Reduction) Explained"}]},{"@type":"WebSite","@id":"https:\/\/uplatz.com\/blog\/#website","url":"https:\/\/uplatz.com\/blog\/","name":"Uplatz Blog","description":"Uplatz is a global IT Training &amp; Consulting 
company","publisher":{"@id":"https:\/\/uplatz.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/uplatz.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/uplatz.com\/blog\/#organization","name":"uplatz.com","url":"https:\/\/uplatz.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/uplatz.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/uplatz.com\/blog\/wp-content\/uploads\/2016\/11\/Uplatz-Logo-Copy-2.png","contentUrl":"https:\/\/uplatz.com\/blog\/wp-content\/uploads\/2016\/11\/Uplatz-Logo-Copy-2.png","width":1280,"height":800,"caption":"uplatz.com"},"image":{"@id":"https:\/\/uplatz.com\/blog\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/Uplatz-1077816825610769\/","https:\/\/x.com\/uplatz_global","https:\/\/www.instagram.com\/","https:\/\/www.linkedin.com\/company\/7956715?trk=tyah&amp;amp;amp;amp;trkInfo=clickedVertical:company,clickedEntityId:7956715,idx:1-1-1,tarId:1464353969447,tas:uplatz"]},{"@type":"Person","@id":"https:\/\/uplatz.com\/blog\/#\/schema\/person\/8ecae69a21d0757bdb2f776e67d2645e","name":"uplatzblog","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/7f814c72279199f59ded4418a8653ad15f5f8904ac75e025a4e2abe24d58fa5d?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/7f814c72279199f59ded4418a8653ad15f5f8904ac75e025a4e2abe24d58fa5d?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/7f814c72279199f59ded4418a8653ad15f5f8904ac75e025a4e2abe24d58fa5d?s=96&d=mm&r=g","caption":"uplatzblog"}}]}},"_links":{"self":[{"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/posts\/7765","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"hr
ef":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/comments?post=7765"}],"version-history":[{"count":1,"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/posts\/7765\/revisions"}],"predecessor-version":[{"id":7766,"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/posts\/7765\/revisions\/7766"}],"wp:attachment":[{"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/media?parent=7765"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/categories?post=7765"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/uplatz.com\/blog\/wp-json\/wp\/v2\/tags?post=7765"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}