Fine-Tuned Transformer Models for Keyword Extraction in Skincare Recommendation Systems

Bibliographic Details
Main Authors: Ni Putu Adnya Puspita Dewi, Desy Purnami Singgih Putri, I Nyoman Prayana Trisna
Format: Article
Language: English
Published: Politeknik Negeri Batam 2025-06-01
Series: Journal of Applied Informatics and Computing
Online Access: https://jurnal.polibatam.ac.id/index.php/JAIC/article/view/9687
Summary: The skincare industry in Indonesia is growing rapidly, with projected revenues of nearly 40 billion rupiah by 2024 and further increases expected. The large number of products on the market makes it difficult for consumers to find ones that suit their needs. In this context, a text-based recommendation system that leverages advances in Natural Language Processing (NLP) is a promising solution. This research develops a skincare product recommendation system driven by user needs, applying a DistilBERT model fine-tuned on text from the skincare recommendation domain to perform keyword extraction. The extracted keywords then serve as parameters for generating recommendations, using keyword co-occurrence together with a modified Jaccard Similarity to assess how well a product's ingredients and benefits match user preferences. The trained extraction model achieved its best performance with a micro F1-score of 0.96 at the token level and an exact match rate of 74.25% at the entity level. Evaluation of the recommendation system showed excellent results, with an nDCG of 0.96 and a user satisfaction rate (CSAT) of 91.9%.
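The abstract mentions matching extracted keywords to products via a modified Jaccard Similarity. The exact modification is not specified in this record, so the sketch below shows only the standard Jaccard form between keyword sets for illustration; the keyword lists and function name are hypothetical, not taken from the paper.

```python
def jaccard_similarity(user_keywords, product_keywords):
    """Standard Jaccard similarity between two keyword sets:
    |A ∩ B| / |A ∪ B|.

    Note: the paper applies a *modified* Jaccard Similarity whose
    details are not given in this abstract; this is the unmodified
    baseline, shown for illustration only.
    """
    a, b = set(user_keywords), set(product_keywords)
    if not a and not b:
        return 0.0  # avoid division by zero for two empty sets
    return len(a & b) / len(a | b)

# Hypothetical example: user needs vs. a product's extracted keywords
user = ["niacinamide", "brightening", "oily skin"]
product = ["niacinamide", "brightening", "acne"]
print(jaccard_similarity(user, product))  # 2 shared / 4 total -> 0.5
```

In a recommendation setting, products would be ranked by this score against the user's extracted keywords, which is consistent with the ranking-quality (nDCG) evaluation reported above.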
ISSN:2548-6861