Global-local graph attention with cyclic pseudo-labels for bitcoin anti-money laundering detection
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-07-01 |
| Series: | Scientific Reports |
| Subjects: | |
| Online Access: | https://doi.org/10.1038/s41598-025-08365-9 |
| Summary: | Abstract This paper addresses the problem of detecting money laundering in the Bitcoin network. Money laundering is the process of handling the proceeds of crime to conceal their illegal source; such illicit transactions have complex features similar to those of legal transactions. It is well known that transactions can be represented as topological graph-structured data, and many GCN-based methods have been developed for Anti-Money Laundering (AML) tasks. However, existing methods perform poorly at dynamically assigning weights to neighboring nodes and at extracting information from global nodes in the Bitcoin network. We therefore identify three major challenges. Firstly, GCNs can be misled by concealed illegal transactions because they weight node representations uniformly. Secondly, current node-level GCNs cannot handle the varied ways in which illegal transactions are concealed, because they fail to extract global information. Thirdly, the cost of data labelling necessitates the effective use of limited but rich domain-specific labelled data. To address these challenges, we propose a Transformer-enhanced Graph Attention Network (TFGAT) with a Global-Local Attention Mechanism (GLATM), which uses Transformers to extract global information while selectively focusing on local information from connected nodes. Because labelled data are scarce and expensive to obtain, we introduce a Deep Cyclic Pseudo-Label Updating Mechanism (DCPLU) that enriches the training data distribution and improves model robustness without relying on manifold-structure or Euclidean-distance assumptions. DCPLU improves model performance while preserving the model's existing parameters, allowing it to maintain fast response times in application scenarios. Experimental results show that our methods outperform existing models across various metrics. |
|---|---|
| ISSN: | 2045-2322 |
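The summary describes GLATM as combining a Transformer-style global view of all nodes with attention restricted to connected (local) neighbors. Below is a minimal NumPy sketch of one plausible reading of that idea; the function name, the single-head formulation, and the fixed mixing weight `alpha` are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_local_attention(H, A, Wq, Wk, Wv, alpha=0.5):
    """Hypothetical global-local attention sketch.

    H: (N, d) node features; A: (N, N) adjacency matrix (1 = edge).
    Global branch: self-attention over all nodes (Transformer-style).
    Local branch: the same attention masked to graph neighbors only.
    alpha blends the two views (a real model would likely learn it).
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])      # (N, N) attention logits
    global_out = softmax(scores) @ V            # every node attends to all nodes
    mask = np.where(A > 0, 0.0, -1e9)           # block non-neighbor positions
    local_out = softmax(scores + mask) @ V      # attend to neighbors only
    return alpha * global_out + (1 - alpha) * local_out

# Toy example: 4 transaction nodes on a small chain graph, 8-dim features.
rng = np.random.default_rng(0)
N, d = 4, 8
H = rng.normal(size=(N, d))
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
Z = global_local_attention(H, A, Wq, Wk, Wv)
print(Z.shape)  # (4, 8)
```

The masked branch mirrors how graph attention networks confine attention to adjacent nodes, while the unmasked branch supplies the global context the summary says plain node-level GCNs lack.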