Title: Matrix Approximation Algorithms and Its Applications
Authors: Patel, Lokendra Singh
Saha, Suman [Guided by]
Keywords: Linear algebra
Vector terminology
Matrix approximation
Linear kernel
Truncated SVD
Issue Date: 2015
Publisher: Jaypee University of Information Technology, Solan, H.P.
Abstract: Large data sets contain tens of thousands to millions of training instances, which leads to high time and space complexity. To reduce this complexity, we propose an efficient Nystrom method to approximate the kernel matrix used in many kernel-based machine learning methods, e.g. Kernel Ridge Regression, Kernel Principal Component Analysis, and Support Vector Machines. This thesis focuses on sampling-based matrix approximation methods, which help speed up kernel-based algorithms on large data sets. We give error bounds in both the Frobenius and spectral norms for the quality of approximation. Based on these error bounds, we analyze the quality of approximation in kernel-based algorithms, present guarantees on approximation accuracy derived from various matrix properties, and analyze the effect of matrix approximation on actual kernel-based algorithms. Our proposed algorithm gives a lower error bound for the low-rank approximation of the kernel matrix.
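The standard Nystrom method the abstract refers to can be sketched as follows: sample m landmark points, form the n-by-m cross-kernel block C and the m-by-m landmark block W, and approximate the full kernel matrix as K ≈ C W⁺ Cᵀ. This is a minimal illustrative sketch with uniform sampling and an RBF kernel, not the thesis's own algorithm; the function names and parameters (`nystrom_approx`, `gamma`, `m`) are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF (Gaussian) kernel matrix between rows of X and rows of Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystrom_approx(X, m, gamma=1.0, seed=None):
    """Nystrom approximation K_hat = C @ pinv(W) @ C.T using m
    uniformly sampled landmark points; returns (K_hat, landmark indices)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)   # n x m cross-kernel block
    W = C[idx, :]                      # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T, idx

# Compare against the exact kernel matrix on synthetic data
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
K = rbf_kernel(X, X, gamma=0.1)
K_hat, idx = nystrom_approx(X, m=100, gamma=0.1, seed=0)
err = np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro")
```

Note that the approximation is exact on the landmark block itself (since W W⁺ W = W), and its relative Frobenius error shrinks as m grows toward the effective rank of K — the regime the abstract's error bounds characterize.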
Appears in Collections:Dissertations (M.Tech.)

Files in This Item:
File: Matrix Approximation Algorithms and Its Applications.pdf
Size: 1.12 MB
Format: Adobe PDF
