Showing 1 - 20 results of 53 for search 'Dead OR Alive Xtreme*', query time: 0.06s
  1.

    Prediction of influenza A virus-human protein-protein interactions using XGBoost with continuous and discontinuous amino acids information by Binghua Li, Xin Li, Xiaoyu Li, Li Wang, Jun Lu, Jia Wang

    Published 2025-01-01
    “…After comparing different machine learning models, the eXtreme Gradient Boosting (XGBoost) model was determined as the final model for the prediction. …”
    Get full text
    Article
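Most of the results in this list match on the phrase "eXtreme Gradient Boosting (XGBoost)". For orientation only, the sketch below shows a minimal XGBoost binary classifier on synthetic placeholder data; the features, labels, and hyperparameters are illustrative assumptions and do not reproduce the setup of this or any other listed study.

```python
# Minimal, illustrative XGBoost binary classifier on synthetic data.
# All data and hyperparameters here are placeholders, not those of the cited study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))            # placeholder feature matrix
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # placeholder binary labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="logloss")
model.fit(X_train, y_train)

print(f"test AUC: {roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]):.3f}")
```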
  2.

    Machine learning algorithms can predict emotional valence across ungulate vocalizations by Romain A. Lefèvre, Ciara C.R. Sypherd, Élodie F. Briefer

    Published 2025-02-01
    “…The present study used a machine learning algorithm (eXtreme Gradient Boosting [XGBoost]) to distinguish between contact calls indicating positive (pleasant) and negative (unpleasant) emotional valence, produced in various contexts by seven species of ungulates. …”
    Get full text
    Article
  3.

    A Comparative Study of Machine Learning Techniques for Predicting Mechanical Properties of Fused Deposition Modelling (FDM)-Based 3D-Printed Wood/PLA Biocomposite by Prashant Anerao, Atul Kulkarni, Yashwant Munde, Namrate Kharate

    Published 2025-08-01
    “…Four distinct machine learning algorithms have been selected for predictive modeling: Linear Regression, Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), and Adaptive Boosting (AdaBoost). …”
    Get full text
    Article
  4.

    Spatio-Temporal Segmented Traffic Flow Prediction with ANPRS Data Based on Improved XGBoost by Bo Sun, Tuo Sun, Pengpeng Jiao

    Published 2021-01-01
    “…Traffic prediction is highly significant for intelligent traffic systems and traffic management. eXtreme Gradient Boosting (XGBoost), a scalable tree boosting algorithm, is proposed and improved to predict higher-resolution traffic states by utilizing the origin-destination (OD) relationship of segment flow data between upstream and downstream on the highway. …”
    Get full text
    Article
  5.

    Rapid detection of carbapenem-resistant Escherichia coli and carbapenem-resistant Klebsiella pneumoniae in positive blood cultures via MALDI-TOF MS and tree-based machine learning... by Xiaobo Xu, Zhaofeng Wang, Erjie Lu, Tao Lin, Hengchao Du, Zhongfei Li, Jiahong Ma

    Published 2025-01-01
    “…This study was based on matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS); Decision Tree (DT), Random Forest (RF), Gradient Boosting Machine (GBM), eXtreme Gradient Boosting (XGBoost), and Extremely Randomized Trees (ERT) models were constructed to classify carbapenem-resistant Escherichia coli (CREC) and carbapenem-resistant Klebsiella pneumoniae (CRKP). …”
    Get full text
    Article
  6.

    RCE-IFE: recursive cluster elimination with intra-cluster feature elimination by Cihan Kuzudisli, Burcu Bakir-Gungor, Bahjat Qaqish, Malik Yousef

    Published 2025-02-01
    “…Furthermore, RCE-IFE surpasses several state-of-the-art FS methods, such as Minimum Redundancy Maximum Relevance (MRMR), Fast Correlation-Based Filter (FCBF), Information Gain (IG), Conditional Mutual Information Maximization (CMIM), SelectKBest (SKB), and eXtreme Gradient Boosting (XGBoost), obtaining an average AUC of 0.76 on five gene expression datasets. …”
    Get full text
    Article
  7.

    Forecasting mental states in schizophrenia using digital phenotyping data. by Thierry Jean, Rose Guay Hottin, Pierre Orban

    Published 2025-02-01
    “…Besides, it remains unclear which machine learning algorithm is best suited for forecasting tasks, the eXtreme Gradient Boosting (XGBoost) and long short-term memory (LSTM) algorithms being two popular choices in digital phenotyping studies. …”
    Get full text
    Article
  8.

    Constructing a machine learning model for systemic infection after kidney stone surgery based on CT values by Jiaxin Li, Yao Du, Gaoming Huang, Yawei Huang, Xiaoqing Xi, Zhenfeng Ye

    Published 2025-02-01
    “…All five machine learning models demonstrated strong discrimination on the validation set (AUC: 0.690–0.858). The eXtreme Gradient Boosting (XGBoost) model was the best performer [AUC: 0.858; sensitivity: 0.877; specificity: 0.981; accuracy: 0.841; positive predictive value: 0.629; negative predictive value: 0.851]. …”
    Get full text
    Article
  9.
  10.

    Enhancing Abstractive Multi-Document Summarization with Bert2Bert Model for Indonesian Language by Aldi Fahluzi Muharam, Yana Aditia Gerhana, Dian Sa'adillah Maylawati, Muhammad Ali Ramdhani, Titik Khawa Abdul Rahman

    Published 2025-01-01
    “…This study investigates the effectiveness of the proposed Bert2Bert and Bert2Bert+Xtreme models in improving abstractive multi-document summarization for the Indonesian language. …”
    Get full text
    Article
  11.

    Development and validation of machine learning models for MASLD: based on multiple potential screening indicators by Hao Chen, Jingjing Zhang, Xueqin Chen, Ling Luo, Wenjiao Dong, Yongjie Wang, Jiyu Zhou, Canjin Chen, Wenhao Wang, Wenbin Zhang, Zhiyi Zhang, Yongguang Cai, Danli Kong, Yuanlin Ding

    Published 2025-01-01
    “…Subsequently, the partial dependence plot (PDP) method and SHapley Additive exPlanations (SHAP) were utilized to explain the roles of important variables in the model to filter out the optimal indicators for constructing the MASLD risk model. Results: Ranking the feature importance of the Random Forest (RF) model and eXtreme Gradient Boosting (XGBoost) model constructed using all variables found that homeostasis model assessment of insulin resistance (HOMA-IR) and triglyceride glucose-waist circumference (TyG-WC) were the first and second most important variables, respectively. …”
    Get full text
    Article
  12.

    Monitoring Moso bamboo (Phyllostachys pubescens) forests damage caused by Pantana phyllostachysae Chao considering phenological differences between on-year and off-year using UAV h... by Anqi He, Zhanghua Xu, Yifan Li, Bin Li, Xuying Huang, Huafeng Zhang, Xiaoyu Guo, Zenglu Li

    Published 2025-01-01
    “…We analyzed the impact of on-year and off-year phenological characteristics on the accuracy of hazard extraction and developed detection models for P. phyllostachysae hazard levels in on-year and off-year Moso bamboo using Support Vector Machine (SVM), Random Forest (RF), eXtreme Gradient Boosting (XGBoost), and one-dimensional Convolutional Neural Network (1D-CNN). …”
    Get full text
    Article
  13.

    A machine learning framework for short-term prediction of chronic obstructive pulmonary disease exacerbations using personal air quality monitors and lifestyle data by M. Atzeni, G. Cappon, J. K. Quint, F. Kelly, B. Barratt, M. Vettoretti

    Published 2025-01-01
    “…The framework employs (i) k-means clustering to uncover potentially distinct patient sub-types, (ii) supervised ML techniques (Logistic Regression, Random Forest, and eXtreme Gradient Boosting) to train and test predictive models for each patient sub-type and (iii) an explainable artificial intelligence technique (SHAP) to interpret the final models. …”
    Get full text
    Article
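Result 13 above outlines a three-step framework: unsupervised sub-typing, supervised models per sub-type, and SHAP-based interpretation. A minimal sketch of that general pattern is given below, on synthetic data and with assumed model choices (k-means, XGBoost, and the shap package); it is an illustration of the pattern, not the authors' implementation.

```python
# Illustrative cluster-then-classify-then-explain pipeline on synthetic data.
# Sub-typing variables, models, and SHAP usage in the cited study may differ.
import numpy as np
import shap
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 10))                 # placeholder predictors
y = (X[:, 0] * X[:, 1] > 0).astype(int)        # placeholder binary outcome

# (i) unsupervised sub-typing with k-means
subtypes = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for k in np.unique(subtypes):
    Xk, yk = X[subtypes == k], y[subtypes == k]
    Xtr, Xte, ytr, yte = train_test_split(Xk, yk, test_size=0.25, random_state=0)

    # (ii) supervised model per sub-type
    model = XGBClassifier(n_estimators=100, max_depth=3, eval_metric="logloss")
    model.fit(Xtr, ytr)

    # (iii) SHAP values to interpret the fitted model
    shap_values = shap.TreeExplainer(model).shap_values(Xte)
    print(f"sub-type {k}: accuracy {model.score(Xte, yte):.2f}, "
          f"mean |SHAP| of feature 0: {np.abs(shap_values[:, 0]).mean():.3f}")
```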
  14.

    Leveraging advanced deep learning and machine learning approaches for snow depth prediction using remote sensing and ground data by Haytam Elyoussfi, Abdelghani Boudhar, Salwa Belaqziz, Mostafa Bousbaa, Karima Nifa, Bouchra Bargam, Abdelghani Chehbouni

    Published 2025-02-01
    “…The models evaluated include two ML approaches: Support Vector Regression (SVR) and eXtreme Gradient Boosting (XGBoost) and four DL models: 1-Dimensional Convolutional Neural Network (1D-CNN), Long Short-Term Memory Networks (LSTM), Gated Recurrent Unit (GRU), and Bi-directional Long Short-Term Memory Network (Bi-LSTM). …”
    Get full text
    Article
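Result 14 compares classical ML regressors (SVR, XGBoost) with deep learning models. As a hedged illustration of the classical half of such a comparison only, the sketch below fits SVR and an XGBoost regressor to synthetic data and reports test RMSE; the snow-depth predictors and the DL models are not reproduced.

```python
# Illustrative comparison of SVR and XGBoost regression on synthetic data.
# Predictors, targets, and tuning are placeholders, not the cited study's setup.
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 8))                                          # placeholder predictors
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=800)  # placeholder target

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "SVR": SVR(C=10.0, epsilon=0.1),
    "XGBoost": XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05),
}
for name, model in models.items():
    model.fit(Xtr, ytr)
    rmse = mean_squared_error(yte, model.predict(Xte)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.3f}")
```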
  15.

    AICpred: Machine Learning-Based Prediction of Potential Anti-Inflammatory Compounds Targeting TLR4-MyD88 Binding Mechanism by Lucindah N. Fry-Nartey, Cyril Akafia, Ursula S. Nkonu, Spencer B. Baiden, Ignatus Nunana Dorvi, Kwasi Agyenkwa-Mawuli, Odame Agyapong, Claude Fiifi Hayford, Michael D. Wilson, Whelton A. Miller, Samuel K. Kwofie

    Published 2025-01-01
    “…Predictive models were trained using random forest, adaptive boosting (AdaBoost), eXtreme gradient boosting (XGBoost), k-nearest neighbours (KNN), and decision tree models. …”
    Get full text
    Article
  16.

    Automated post-run analysis of arrayed quantitative PCR amplification curves using machine learning [version 1; peer review: awaiting peer review] by David Garrett Brown, Darwin J. Operario, Lan Wang, Shanrui Wu, Daniel T. Leung, Eric R. Houpt, James A. Platts-Mills, Jie Liu, Ben J. Brintz

    Published 2025-01-01
    “…Methods: We used 165,214 qPCR amplification curves from two studies to train and test two eXtreme Gradient Boosting (XGBoost) models. Previous manual analyses of the amplification curves by experts in qPCR analysis were used as the gold standard. …”
    Get full text
    Article
  17.

    Multiple PM Low-Cost Sensors, Multiple Seasons’ Data, and Multiple Calibration Models by S Srishti, Pratyush Agrawal, Padmavati Kulkarni, Hrishikesh Chandra Gautam, Meenakshi Kushwaha, V. Sreekanth

    Published 2023-02-01
    “…The ML models included (i) Decision Tree, (ii) Random Forest (RF), (iii) eXtreme Gradient Boosting, and (iv) Support Vector Regression (SVR). …”
    Get full text
    Article
  18.

    Time series forecasting of bed occupancy in mental health facilities in India using machine learning by G. Avinash, Hariom Pachori, Avinash Sharma, SukhDev Mishra

    Published 2025-01-01
    “…This study applies six machine learning models, namely Support Vector Regression, eXtreme Gradient Boosting, Random Forest, K-Nearest Neighbors, Gradient Boosting, and Decision Tree, to forecast weekly bed occupancy of the second largest mental hospital in India, using data from 2008 to 2024. …”
    Get full text
    Article
  19.

    Combining machine learning algorithms for bridging gaps in GRACE and GRACE Follow-On missions using ERA5-Land reanalysis by Jaydeo K. Dharpure, Ian M. Howat, Saurabh Kaushik, Bryan G. Mark

    Published 2025-06-01
    “…Unlike previous studies, we use a combination of Machine Learning (ML) methods—Random Forest (RF), Support Vector Machine (SVM), eXtreme Gradient Boosting (XGB), Deep Neural Network (DNN), and Stacked Long-Short Term Memory (SLSTM)—to identify and efficiently bridge the gap between GRACE and GFO by using the best-performing ML model to estimate TWSA at each grid cell. …”
    Get full text
    Article
  20.

    Exploring cement Production's role in GDP using explainable AI and sustainability analysis in Nepal by Ramhari Poudyal, Biplov Paneru, Bishwash Paneru, Tilak Giri, Bibek Paneru, Tim Reynolds, Khem Narayan Poudyal, Mohan B. Dangi

    Published 2025-06-01
    “…Utilizing regression models like Extra Trees (Extremely Randomized Trees) Regressor, CatBoost (Categorical Boosting) Regressor, and XGBoost (eXtreme Gradient Boosting) Regressor, together with Random Forest and an Ensemble of Sparse Embedded Trees (SET), machine learning is used to examine the demand, supply, and Gross Domestic Product (GDP) performance of cement manufacturing in India, which shares a common cement-related infrastructure with Nepal. …”
    Get full text
    Article