
Accelerated Singular Value Decomposition (ASVD) using momentum based Gradient Descent Optimization

Journal from gdlhub / 2022-02-12 16:24:59
By: Sandeep Kumar Raghuwanshi, Rajesh Kumar Pateriya, King Saud University
Created: 2022-02-12, with 0 files

Keywords: Gradient Descent, Information filtering, Matrix factorization, Singular value decomposition, Stochastic gradient descent
Url : http://www.sciencedirect.com/science/article/pii/S1319157818300636
Document source: web

The limitations of neighborhood-based Collaborative Filtering (CF) techniques on scalable and sparse data present an obstacle to efficient recommendation systems. These techniques show poor accuracy and slow speed in generating recommendations. Model-based matrix factorization is an alternative approach used to overcome the aforementioned limitations of CF.

Singular Value Decomposition (SVD) is a widely used technique to obtain low-rank factors of the rating matrix, using Gradient Descent (GD) or Alternating Least Squares (ALS) to optimize its error objective function. Most researchers have focused on the accuracy of predictions but have not considered the convergence rate of the learning approach. In this paper, we propose a new filtering technique that implements SVD using Stochastic Gradient Descent (SGD) optimization and provides an accelerated version of SVD for fast convergence of learning parameters with improved classification accuracy. Our proposed method accelerates SVD in the right direction and dampens oscillation by adding a momentum term to the parameter updates. To support our claim, we tested our proposed model against well-known real-world datasets (MovieLens100k, FilmTrust and YahooMovie). The proposed Accelerated Singular Value Decomposition (ASVD) outperformed the existing models and achieved a higher convergence rate and better classification accuracy.
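
As a rough illustration of the general technique the abstract describes, the sketch below factors a rating matrix into two low-rank matrices and trains them with SGD plus classical momentum, so each velocity term accumulates past gradients and dampens oscillation. This is a minimal sketch only: the function name train_mf_momentum, the hyperparameters (rank, learning rate, regularization, momentum coefficient) and the squared-error objective are assumptions for illustration, not the authors' exact ASVD formulation.

import numpy as np

def train_mf_momentum(R, k=20, lr=0.005, reg=0.02, beta=0.9, epochs=50):
    """Illustrative sketch (not the paper's exact algorithm): factor rating
    matrix R (0 = missing) into P (users x k) and Q (items x k) using SGD
    with classical momentum on a regularized squared-error objective."""
    n_users, n_items = R.shape
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.1, size=(n_users, k))
    Q = rng.normal(scale=0.1, size=(n_items, k))
    vP = np.zeros_like(P)              # momentum (velocity) buffers
    vQ = np.zeros_like(Q)
    users, items = R.nonzero()         # indices of observed ratings
    for _ in range(epochs):
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]        # prediction error on one rating
            gP = -err * Q[i] + reg * P[u]      # gradient of squared error w.r.t. P[u]
            gQ = -err * P[u] + reg * Q[i]      # gradient w.r.t. Q[i]
            vP[u] = beta * vP[u] - lr * gP     # velocity update: past gradients
            vQ[i] = beta * vQ[i] - lr * gQ     #   smooth the step and damp oscillation
            P[u] += vP[u]                      # move parameters along the velocity
            Q[i] += vQ[i]
    return P, Q

# Toy usage (0 denotes an unobserved rating):
R = np.array([[5.0, 3.0, 0.0], [4.0, 0.0, 1.0]])
P, Q = train_mf_momentum(R, k=2, epochs=200)
print(P @ Q.T)   # reconstructed / predicted ratings

Setting the momentum coefficient beta to 0 reduces the update to plain SGD, which is the baseline the accelerated variant is compared against.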


Properties:
ID Publisher: gdlhub
Organization: King Saud University
Contact Name: Herti Yani, S.Kom
Address: Jln. Jenderal Sudirman
City: Jambi
Region: Jambi
Country: Indonesia
Phone: 0741-35095
Fax: 0741-35093
Administrator E-mail: elibrarystikom@gmail.com
CKO E-mail: elibrarystikom@gmail.com


Contributor ...

  • Editor: Calvin