Enhancing the efficiency of patent classification: a multimodal classification approach for design patents
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-08-01 |
| Series: | Journal of King Saud University: Computer and Information Sciences |
| Subjects: | |
| Online Access: | https://doi.org/10.1007/s44443-025-00185-1 |
| Summary: | With the rapid increase in the number of design patent applications, traditional patent classification systems encounter significant challenges in both efficiency and scalability. This paper introduces a multimodal feature fusion approach that aims to improve the classification of design patents and address the growing need for faster and more accurate patent examination. Modality-specific features are extracted from design patent texts, images, and metadata, and a multimodal representation is constructed to optimize the feature representation of each modality. This approach effectively captures the interactions among modalities, thereby increasing the expressive power of the features. An attention mechanism then integrates these multimodal features into a unified representation, enabling automatic classification of design patents. Empirical results demonstrate that the proposed method significantly outperforms baseline models, achieving substantial improvements in accuracy, precision, recall, and F1 score. This study provides an innovative solution for automating patent classification, increasing both the accuracy and efficiency of patent examination in practical applications. |
| ISSN: | 1319-1578, 2213-1248 |
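
The summary describes an attention-weighted fusion of text, image, and metadata features into a unified representation. Below is a minimal PyTorch sketch of that general idea; the encoder output dimensions, layer shapes, and class count are illustrative assumptions, not the authors' reported architecture.

```python
# Minimal sketch of attention-based multimodal fusion for patent
# classification. All dimensions and names here are illustrative.
import torch
import torch.nn as nn

class MultimodalPatentClassifier(nn.Module):
    """Attention-weighted fusion of text, image, and metadata features."""

    def __init__(self, text_dim=768, image_dim=2048, meta_dim=64,
                 fused_dim=256, num_classes=32):
        super().__init__()
        # Project each modality's precomputed features into a shared space.
        self.text_proj = nn.Linear(text_dim, fused_dim)
        self.image_proj = nn.Linear(image_dim, fused_dim)
        self.meta_proj = nn.Linear(meta_dim, fused_dim)
        # A small scoring network produces one attention logit per modality.
        self.attn = nn.Sequential(
            nn.Linear(fused_dim, fused_dim),
            nn.Tanh(),
            nn.Linear(fused_dim, 1),
        )
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, text_feat, image_feat, meta_feat):
        # Shape after stacking: (batch, 3 modalities, fused_dim).
        modalities = torch.stack(
            [self.text_proj(text_feat),
             self.image_proj(image_feat),
             self.meta_proj(meta_feat)],
            dim=1,
        )
        # Softmax over the modality axis learns per-example fusion weights.
        weights = torch.softmax(self.attn(modalities), dim=1)
        fused = (weights * modalities).sum(dim=1)  # weighted sum of modalities
        return self.classifier(fused)

# Example usage with random tensors standing in for encoder outputs
# (e.g., a text transformer, an image CNN, and tabular metadata features).
model = MultimodalPatentClassifier()
logits = model(torch.randn(4, 768), torch.randn(4, 2048), torch.randn(4, 64))
print(logits.shape)  # torch.Size([4, 32])
```

The softmax over the modality axis lets the network down-weight an uninformative modality (for example, sparse metadata) on a per-patent basis, which is one common way to realize the attention-based integration the summary describes.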