1. A Vegetable-Price Forecasting Method Based on Mixture of Experts
   Published 2025-01-01. Article.
2. Mixture of Expert Large Language Model for Legal Case Element Recognition
   Published 2024-12-01. Article.
   “…This paper introduces a conversational mixture-of-experts element recognition LLM. The proposed model first designs specific prompts tailored to the characteristics of cases for the ChatGLM3-6B-base model. …”
3. Gated ensemble of spatio-temporal mixture of experts for multi-task learning in ride-hailing system
   Published 2024-12-01. Article.
   “…Therefore, a multi-task learning architecture is proposed in this study by developing a gated ensemble of spatio-temporal mixture of experts network (GESME-Net) with a convolutional recurrent neural network (CRNN), a convolutional neural network (CNN), and a recurrent neural network (RNN) for simultaneously forecasting these spatio-temporal tasks within a city as well as across different cities. …”
4. Active reinforcement learning versus action bias and hysteresis: control with a mixture of experts and nonexperts.
   Published 2024-03-01. Article.
   “…In light of how bias and hysteresis function as a heuristic for efficient control that adapts to uncertainty or low motivation by minimizing the cost of effort, these phenomena broaden the consilient theory of a mixture of experts to encompass a mixture of expert and nonexpert controllers of behavior. …”
5. Leveraging Mixture of Experts and Deep Learning-Based Data Rebalancing to Improve Credit Fraud Detection
   Published 2024-11-01. Article.
6. Mixture of Experts Framework Based on Soft Actor-Critic Algorithm for Highway Decision-Making of Connected and Automated Vehicles
   Published 2025-01-01. Article.
   “…This paper proposes a Mixture of Experts (MoE) method based on Soft Actor-Critic (SAC), where the upper-level discriminator dynamically decides whether to activate the lower-level DRL expert or the heuristic expert based on the features of the input state. …”
8. MoE-NuSeg: Enhancing nuclei segmentation in histology images with a two-stage Mixture of Experts network
   Published 2025-01-01. Article.
9. Multimodal Gated Mixture of Experts Using Whole Slide Image and Flow Cytometry for Multiple Instance Learning Classification of Lymphoma
   Published 2024-12-01. Article.
10. Simultaneous estimation of multiple soil properties from vis-NIR spectra using a multi-gate mixture-of-experts with data augmentation
    Published 2025-01-01. Article.
11. Patient-Adaptive Beat-Wise Temporal Transformer for Atrial Fibrillation Classification in Continuous Long-Term Cardiac Monitoring
    Published 2024-01-01. Article.
12. MELD3: Integrating Multi-Task Ensemble Learning for Driver Distraction Detection
    Published 2024-01-01. Article.
13. The cognitive reality monitoring network and theories of consciousness
    Published 2024-04-01. Article.
    “…The cognitive reality monitoring network (CRMN) model is derived from computational theories of mixture-of-experts architecture, hierarchical reinforcement learning and generative/inference computing modules, addressing all three levels of understanding. …”
14. Research on Predicting Super-Relational Data Links for Mine Hoists Within Hyper-Relational Knowledge Graphs
    Published 2024-12-01. Article.
    “…This paper proposes the HyLinker model, designed to improve the representation of entities and relations through modular components, including an entity neighbor aggregator, a relation qualifier aggregator, MoE-LSTM (Mixture of Experts Long Short-Term Memory), and a convolutional bidirectional interaction module. …”
15. Enhancing depression recognition through a mixed expert model by integrating speaker-related and emotion-related features
    Published 2025-02-01. Article.
    “…To tackle this challenge, we propose a Mixture-of-Experts (MoE) method that integrates speaker-related and emotion-related features for depression recognition. …”
16
LoRA Fusion: Enhancing Image Generation
Published 2024-11-01“…One emerging approach constructs several LoRA modules, but more than three typically decrease the generation performance of pre-trained models. The mixture-of-experts model solves the performance issue, but LoRA modules are not combined using text prompts; hence, generating images by combining LoRA modules does not dynamically reflect the user’s desired requirements. …”
Get full text
Article