Traditional Dimensionality Reduction Techniques using Deep Learning
K.M. Monica1, R. Parvathi2
1K.M. Monica, Research Scholar, SCSE, VIT, Chennai-600127.
2R. Parvathi, Associate Professor, SCSE, VIT, Chennai-600127.
Manuscript received on 08 August 2019. | Revised Manuscript received on 17 August 2019. | Manuscript published on 30 September 2019. | PP: 7153-7160 | Volume-8 Issue-3 September 2019 | Retrieval Number: C6110098319/2019©BEIESP | DOI: 10.35940/ijrte.C6110.098319
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: In the analysis of big data, dimensionality reduction techniques play a significant role in fields where the data are large, with many columns or classes. High-dimensional data contain thousands of features, many of which carry useful information; alongside these, however, are many redundant or irrelevant features that degrade data quality and reduce computational efficiency. Mathematical procedures for reducing the number of dimensions are known as dimensionality reduction techniques. The main aim of dimensionality reduction algorithms such as Principal Component Analysis (PCA), Random Projection (RP), and Non-negative Matrix Factorization (NMF) is to remove inappropriate information from the data; however, the features and attributes obtained by these algorithms are not always able to characterize the data as distinct partitions. This paper reviews the traditional machine learning methods used for dimensionality reduction and proposes a view of how deep learning can be applied to the same task.
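As an illustrative sketch only (not part of the paper's method), two of the techniques named in the abstract, PCA and Random Projection, can be demonstrated with NumPy on toy data; the array sizes and variable names below are assumptions chosen for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))  # toy data: 100 samples, 50 features

k = 5  # target number of dimensions

# PCA via SVD: center the data, then project onto the top-k
# right singular vectors (the principal components).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = Xc @ Vt[:k].T          # shape (100, 5)

# Random Projection: multiply by a scaled Gaussian matrix; by the
# Johnson-Lindenstrauss lemma, pairwise distances are approximately
# preserved with high probability.
R = rng.standard_normal((50, k)) / np.sqrt(k)
X_rp = X @ R                   # shape (100, 5)

print(X_pca.shape, X_rp.shape)
```

NMF, the third technique mentioned, differs in that it requires non-negative input and factorizes the data matrix into two non-negative factors, so it is not shown in this sketch.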
Keywords: Dimensionality Reduction, Principal Component Analysis, Non-negative Matrix Factorization, Random Projection, Variables, Deep Learning.
Scope of the Article: Deep Learning