Retracted: A Self-Attention Mask Learning-Based Recommendation System

The primary purpose of sequence modeling is to record long-term interdependence across interaction sequences, and since the number of items purchased by users gradually increases over time, this brings challenges to sequence modeling to a certain extent. Relationships between terms are often overlooked, and it is crucial to build sequential models that effectively capture long-term dependencies. Existing methods focus on extracting global sequential information, while ignoring deep representations from subsequences. We argue that limited item transfer is fundamental to sequence modeling, and that partial substructures of sequences can help models learn more efficient long-term dependencies compared to entire sequences. This paper proposes a sequence recommendation model named GAT4Rec (Gated Recurrent Unit And Transformer For Recommendation), which uses a Transformer layer that shares parameters across layers to model the user’s historical interaction sequence. The representation learned by the gated recurrent unit is used as a gating signal to filter out better substructures of the user sequence. The experimental results demonstrate that our proposed GAT4Rec model is superior to other models and has a higher recommendation effectiveness.

Bibliographic Details
Main Authors: Abeer Aljohani, Mohamed Ali Rakrouki, Nawaf Alharbe, Reyadh Alluhaibi
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access:https://ieeexplore.ieee.org/document/9869668/
_version_ 1850117123747086336
author Abeer Aljohani
Mohamed Ali Rakrouki
Nawaf Alharbe
Reyadh Alluhaibi
author_facet Abeer Aljohani
Mohamed Ali Rakrouki
Nawaf Alharbe
Reyadh Alluhaibi
author_sort Abeer Aljohani
collection DOAJ
description The primary purpose of sequence modeling is to record long-term interdependence across interaction sequences, and since the number of items purchased by users gradually increases over time, this brings challenges to sequence modeling to a certain extent. Relationships between terms are often overlooked, and it is crucial to build sequential models that effectively capture long-term dependencies. Existing methods focus on extracting global sequential information, while ignoring deep representations from subsequences. We argue that limited item transfer is fundamental to sequence modeling, and that partial substructures of sequences can help models learn more efficient long-term dependencies compared to entire sequences. This paper proposes a sequence recommendation model named GAT4Rec (Gated Recurrent Unit And Transformer For Recommendation), which uses a Transformer layer that shares parameters across layers to model the user’s historical interaction sequence. The representation learned by the gated recurrent unit is used as a gating signal to filter out better substructures of the user sequence. The experimental results demonstrate that our proposed GAT4Rec model is superior to other models and has a higher recommendation effectiveness.
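The abstract outlines a two-part architecture: a gated recurrent unit (GRU) whose hidden states act as a gating signal that filters the interaction sequence down to a useful substructure, and a Transformer self-attention layer whose parameters are shared across depth. The paper's actual implementation is not reproduced in this record; the following is a minimal NumPy sketch of that idea, with all dimensions, random weight initializations, and the single-head attention layer chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

d = 8          # embedding size (hypothetical)
seq_len = 5    # length of one user's interaction sequence

# Item embeddings standing in for a user's interaction history.
items = rng.normal(size=(seq_len, d))

# --- GRU encoder: one hidden state per sequence position ----------------
Wz, Uz = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wr, Ur = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wh, Uh = rng.normal(size=(d, d)), rng.normal(size=(d, d))

def gru(xs):
    h, hs = np.zeros(d), []
    for x in xs:
        z = sigmoid(x @ Wz + h @ Uz)              # update gate
        r = sigmoid(x @ Wr + h @ Ur)              # reset gate
        h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
        h = (1 - z) * h + z * h_tilde
        hs.append(h)
    return np.stack(hs)

# --- Gating signal: GRU states decide how much of each item passes -----
w_gate = rng.normal(size=(d,))
hidden = gru(items)
gate = sigmoid(hidden @ w_gate)       # one scalar gate per position
filtered = items * gate[:, None]      # soft sub-sequence selection

# --- One self-attention layer, reused (parameter sharing) across depth -
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def attention_layer(x):
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    return softmax(q @ k.T / np.sqrt(d)) @ v

x = filtered
for _ in range(3):                    # three "layers", same weights each time
    x = attention_layer(x)

print(x.shape)  # prints (5, 8): one contextualized vector per interaction
```

In this sketch the gate is soft (a scalar in (0, 1) per position) rather than a hard sub-sequence cut, and parameter sharing is expressed simply by calling the same `attention_layer` weights at every depth; how GAT4Rec combines the gated output with the final recommendation scores is not specified in the abstract and is omitted here.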
format Article
id doaj-art-be72cbe838714c2aa7c32b6a61a42e30
institution OA Journals
issn 2169-3536
language English
publishDate 2022-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling Record ID: doaj-art-be72cbe838714c2aa7c32b6a61a42e30
Last indexed: 2025-08-20T02:36:09Z
Language: eng
Publisher: IEEE
Series: IEEE Access
ISSN: 2169-3536
Publish date: 2022-01-01
Volume: 10
Pages: 93017-93028
DOI: 10.1109/ACCESS.2022.3202637
IEEE document: 9869668
Title: Retracted: A Self-Attention Mask Learning-Based Recommendation System
Authors: Abeer Aljohani (Applied College, Taibah University, Madinah, Saudi Arabia); Mohamed Ali Rakrouki (ORCID: https://orcid.org/0000-0001-6627-9161; Applied College, Taibah University, Madinah, Saudi Arabia); Nawaf Alharbe (ORCID: https://orcid.org/0000-0002-1900-420X; Applied College, Taibah University, Madinah, Saudi Arabia); Reyadh Alluhaibi (Department of Computer Science, College of Computer Science and Engineering, Taibah University, Madinah, Saudi Arabia)
Abstract: The primary purpose of sequence modeling is to record long-term interdependence across interaction sequences, and since the number of items purchased by users gradually increases over time, this brings challenges to sequence modeling to a certain extent. Relationships between terms are often overlooked, and it is crucial to build sequential models that effectively capture long-term dependencies. Existing methods focus on extracting global sequential information, while ignoring deep representations from subsequences. We argue that limited item transfer is fundamental to sequence modeling, and that partial substructures of sequences can help models learn more efficient long-term dependencies compared to entire sequences. This paper proposes a sequence recommendation model named GAT4Rec (Gated Recurrent Unit And Transformer For Recommendation), which uses a Transformer layer that shares parameters across layers to model the user’s historical interaction sequence. The representation learned by the gated recurrent unit is used as a gating signal to filter out better substructures of the user sequence. The experimental results demonstrate that our proposed GAT4Rec model is superior to other models and has a higher recommendation effectiveness.
URL: https://ieeexplore.ieee.org/document/9869668/
spellingShingle Abeer Aljohani
Mohamed Ali Rakrouki
Nawaf Alharbe
Reyadh Alluhaibi
Retracted: A Self-Attention Mask Learning-Based Recommendation System
IEEE Access
title Retracted: A Self-Attention Mask Learning-Based Recommendation System
title_full Retracted: A Self-Attention Mask Learning-Based Recommendation System
title_fullStr Retracted: A Self-Attention Mask Learning-Based Recommendation System
title_full_unstemmed Retracted: A Self-Attention Mask Learning-Based Recommendation System
title_short Retracted: A Self-Attention Mask Learning-Based Recommendation System
title_sort retracted a self attention mask learning based recommendation system
url https://ieeexplore.ieee.org/document/9869668/
work_keys_str_mv AT abeeraljohani retractedaselfattentionmasklearningbasedrecommendationsystem
AT mohamedalirakrouki retractedaselfattentionmasklearningbasedrecommendationsystem
AT nawafalharbe retractedaselfattentionmasklearningbasedrecommendationsystem
AT reyadhalluhaibi retractedaselfattentionmasklearningbasedrecommendationsystem