Showing results 121 - 140 of 167 for search 'Extreme gradient boosting', query time: 0.09s
  121.

    Development of an explainable machine learning model for predicting device-related pressure injuries in clinical settings by Yijie Qian, Hongying Pan, Jun Chen, Hongyang Hu, Mei Fang, Chen Huang, Yihong Xu, Yang Gao

    Published 2025-07-01
    “…Python was used to build classification models, including extreme gradient boosting (XGBoost), random forest (RF), decision tree (DT), logistic regression (LR), support vector machine (SVM), and k-nearest neighbors (KNN). …”
    Get full text
    Article
  122.

    Machine learning-driven integration of time-series InSAR and multiple surface factors for landslide identification and susceptibility assessment by Qianyu Wang, Wen Zhang, Jinglin Li, Ziyang Li, Zhi Luo, Qiang Zhao, Jianbo Jian, Fang Shangguan, Yuanxing Yang, Yangyang Ma, Zhen Zhang, Shuangming Zhao, Linyi Li, Lingkui Meng

    Published 2025-08-01
    “…Based on the delineated landslide areas, we enhanced the information value model using the inverse tangent function, which was then integrated with Random Forest and Extreme Gradient Boosting methods for landslide susceptibility assessment. …”
    Get full text
    Article
  123.

    Evaluation of statistical and machine learning models using satellite data to estimate aboveground biomass: A study in Vietnam Tropical Forests by Thuy Phuong Nguyen, Phuc Khoa Nguyen, Huu Ngu Nguyen, Thanh Duc Tran, Gia Tung Pham, Thai Hung Le, Dinh Huy Le, Trung Hai Nguyen, Van Binh Nguyen

    Published 2024-10-01
    “…A total of 59 input variables from satellite data, including topography, texture features, and vegetation indices, were used in four non-parametric algorithms and a conventional parametric model, Artificial Neural Networks (ANN), Support Vector Machine (SVM), Random Forest (RF), Extreme Gradient Boosting (XGBoost), and Multiple Linear Regression (MLR), to predict biomass and evaluate changes in aboveground biomass over 10 years in two tropical forests in Vietnam. …”
    Get full text
    Article
  124.

    Exploring novel furochochicine derivatives as promising JAK2 inhibitors in HeLa cells: Integrating docking, QSAR-ML, MD simulations, and experiments by Duangjai Todsaporn, Kamonpan Sanachai, Chanat Aonbangkhen, Athina Geronikaki, Victor Kartsev, Boris Lichitsky, Andrey Komogortsev, Phornphimon Maitarad, Thanyada Rungrotmongkol

    Published 2025-01-01
    “…These cytotoxicity data were used to construct QSAR models with machine learning; eXtreme Gradient Boosting (XGB) yielded the best performance (RMSE = 0.177, R² = 0.831, MAPE = 2.93 %) and was used to predict additional FCC derivatives. …”
    Get full text
    Article
  125.

    Trade-off between cost and performance of earth observation data in olive trees health assessment: Digital crop mapping approach using machine learning algorithms by Yassine Bouslihim, Abdelkrim Bouasria, Aicha Rochdi, El Bachir El Haissen, Dénes Loczy, Zoltan Orban, Ali Salem

    Published 2025-08-01
    “…Furthermore, three machine learning algorithms, Random Forest (RF), Cubist, and Extreme Gradient Boosting (XGBoost), were employed to predict chlorophyll content. …”
    Get full text
    Article
  126.

    A new approach for monitoring spatial and temporal changes in forest types in subtropical regions with sample migration and multi-source remote sensing data by Pengfei Zheng, Dongyang Han, Jiang Liu, Bin Xu, Panfei Fang, Shaodong Huang, Wendou Liu, Shaozhi Chen

    Published 2025-08-01
    “…Subsequently, a high-dimensional feature set was developed using multi-source data, employing classifiers like random forest (RF), support vector machine (SVM), and Extreme Gradient Boosting (XGBoost) to classify forests/non-forests and five forest types, with subsequent accuracy assessment. …”
    Get full text
    Article
  127.

    Characteristic, relationship and impact of thermokarst lakes and retrogressive thaw slumps over the Qinghai-Tibetan plateau by Wenwen Li, Denghua Yan, Yu Lou, Baisha Weng, Lin Zhu, Yuequn Lai, Yunzhe Wang

    Published 2025-05-01
    “…We further employed the eXtreme gradient boosting algorithm and ICESat-2 ATL08 laser altimetry data to quantify changes in water storage due to TLs. …”
    Get full text
    Article
  128.

    Alfalfa stem count estimation using remote sensing imagery and machine learning on Google Earth Engine by Hazhir Bahrami, Karem Chokmani, Saeid Homayouni, Viacheslav I. Adamchuk, Md Saifuzzaman, Rami Albasha, Maxime Leduc

    Published 2025-08-01
    “…Three ML models—support vector machine (SVM), random forest (RF), and extreme gradient boosting (XGB)—were applied to Harmonized Landsat Sentinel (Landsat only, which is HLSL30) and Sentinel-2 datasets, accessed via the Google Earth Engine (GEE) Python API. …”
    Get full text
    Article
  129.

    Functional traits driving invasion risk and potential distribution of alien plants in oasis agroecosystems by Shengtianzi Dong, Tiantian Qin, Zhifang Xue, Wenchao Guo, Hanyue Wang, Hongbin Li

    Published 2025-05-01
    “…Invasion risk was classified into four levels based on importance values. Random forest and eXtreme Gradient Boosting (XGBoost) modeling analyzed the relationship between functional traits and invasion risk, while MaxEnt modeling predicted potential distributions. …”
    Get full text
    Article
  130.

    Predicting cardiotoxicity in drug development: A deep learning approach by Kaifeng Liu, Huizi Cui, Xiangyu Yu, Wannan Li, Weiwei Han

    Published 2025-08-01
    “…We used four types of molecular fingerprints and descriptors combined with machine learning and deep learning algorithms, including Gaussian naive Bayes (NB), random forest (RF), support vector machine (SVM), K-nearest neighbors (KNN), eXtreme gradient boosting (XGBoost), and Transformer models, to build predictive models. …”
    Get full text
    Article
  131.

    The construction of HMME-PDT efficacy prediction model for port-wine stain based on machine learning algorithms by Hongxia Yan, Yixin Tan, Fan Qiao, Zhuotong Zeng, Yaqian Shi, Xueqin Zhang, Lu Li, Ting Zeng, Yi Zhan, Ruixuan You, Xinglan He, Rong Xiao, Xiangning Qiu

    Published 2025-07-01
    “…We developed and validated prediction models with Extreme Gradient Boosting (XGBoost) and Random Forest (RF) algorithms. …”
    Get full text
    Article
  132.

    Transfer Learning Estimation and Transferability of LNC and LMA Across Different Datasets by Yingbo Wang, Mengzhu He, Lin Sun, Yong He, Zengwei Zheng

    Published 2024-12-01
    “…The LNC and LMA estimation performance of transfer models established by partial least squares regression (PLS), support vector regression (SVR), extreme gradient boosting (XGB), and random forest regression (RFR) algorithms was evaluated across different datasets, with the RFR transfer models producing good prediction results. …”
    Get full text
    Article
  133.

    Exploring nontoxic perovskite materials for perovskite solar cells using machine learning by W. G. A. Pabasara, H. A. H. M. Wijerathne, M. G. M. M. Karunarathne, D. M. C. Sandaru, Pradeep K. W. Abeygunawardhana, Galhenage A. Sewvandi

    Published 2025-07-01
    “…A highly accurate machine learning model was developed to predict the Goldschmidt factor and the band gap, aiming to discover lead-free perovskites. Extreme Gradient Boosting (XGBoost), Random Forest (RF), Gradient Boosting Regression (GBR), and AdaBoost Regression (ABR) models were employed for this purpose. …”
    Get full text
    Article
  134.

    Predictive value of anthropometric indices for incident of dyslipidemia: a large population-based study by Somayeh Ghiasi Hafezi, Atena Ghasemabadi, Negar Soleimani, Maryam Allahyari, Mina Moradi, Amin Mansoori, Rana Kolahi Ahari, Mark Ghamsary, Gordon Ferns, Habibollah Esmaily, Majid Ghayour-Mobarhan

    Published 2025-08-01
    “…The association between these indices and dyslipidemia was assessed using logistic regression (LR), decision tree (DT), random forest (RF), neural network (NN), K-nearest neighbors (KNN), and eXtreme Gradient Boosting (XGBoost) models. Results: Based on our LR model, we found that several factors, including BAI, BSA, age, and WHR, were significant. …”
    Get full text
    Article
  135.

    Development and validation of machine learning-based risk prediction models for ICU-acquired weakness: a prospective cohort study by Yimei Zhang, Yu Wang, Jingran Yang, Qinglan Li, Min Zhou, Jiafei Lu, Qiulan Hu, Fang Ma

    Published 2025-07-01
    “…Among the four machine-learning models, AUC ranged from 0.830 to 0.978. The eXtreme Gradient Boosting model exhibited the best performance, achieving an AUC of 0.978 (95%CI 0.962–0.994), with 0.924 accuracy, 0.911 sensitivity, 0.941 specificity, 0.924 F1 score, and a Brier score of 0.084. …”
    Get full text
    Article
  136.

    Development and multi-cohort validation of a machine learning-based simplified frailty assessment tool for clinical risk prediction by Jiahui Lai, Cailian Cheng, Tiantian Liang, Leile Tang, Xinhua Guo, Xun Liu

    Published 2025-08-01
    “…Results: Our analysis identified a minimal set of just eight readily available clinical parameters (age, sex, body mass index (BMI), pulse pressure, creatinine, hemoglobin, difficulty preparing meals, and difficulty lifting/carrying) that demonstrated robust predictive power. The extreme gradient boosting (XGBoost) algorithm exhibited superior performance across training (AUC 0.963, 95% CI: 0.951–0.975), internal validation (AUC 0.940, 95% CI: 0.924–0.956), and external validation (AUC 0.850, 95% CI: 0.832–0.868) datasets. …”
    Get full text
    Article
  137.

    Interpretable XGBoost model identifies idiopathic central precocious puberty in girls using four clinical and imaging features by Lu Tian, Yan Zeng, Helin Zheng, Jinhua Cai

    Published 2025-07-01
    “…The least absolute shrinkage and selection operator (LASSO) method was used to select essential characteristic parameters associated with ICPP, which were then used to construct logistic regression (LR) and five machine learning (ML) models: support vector machine (SVM), Gaussian naive Bayes (GaussianNB), extreme gradient boosting (XGBoost), random forest (RF), and k-nearest neighbor algorithm (kNN). …”
    Get full text
    Article
  138.

    Inflammatory markers mediate the association between alternative adiposity indices and mortality in patients with rheumatoid arthritis: data from NHANES 1999–2018 by Feng Luo, Jia-jie Guo, Xue-mei Yuan, Heng Zhou, Qiu-yi Wang, Chang-ming Chen, Xue-ming Yao, Wu-kai Ma

    Published 2025-05-01
    “…Threshold effects and robustness were analyzed via segmented Cox models and sensitivity analyses. Extreme gradient boosting (XGBoost) identified A Body Shape Index (ABSI) as the strongest predictor. …”
    Get full text
    Article
  139.

    Development of a machine learning-based prediction model for serious bacterial infections in febrile young infants by Jina Lee, Jong Seung Lee, Seak Hee Oh, Jun Sung Park, Reenar Yoo, Soo-young Lim, Dahyun Kim, Min Kyo Chun, Jeeho Han, Jeong-Yong Lee, Seung Jun Choi

    Published 2025-07-01
    “…Logistic regression (LR) and eXtreme Gradient Boosting (XGB) were used to develop the models for predicting SBIs, which were then compared with traditional rule-based models. Results: The study included data from 2860 patients: 2288 (80%) in the development dataset and 572 (20%) in the validation dataset. …”
    Get full text
    Article
  140.

    Predicting knee osteoarthritis progression using neural network with longitudinal MRI radiomics, and biochemical biomarkers: A modeling study. by Ting Wang, Hao Liu, Wenbo Zhao, Peihua Cao, Jia Li, Tianyu Chen, Guangfeng Ruan, Yan Zhang, Xiaoshuai Wang, Qin Dang, Mengdi Zhang, Alexander Tack, David Hunter, Changhai Ding, Shengfa Li

    Published 2025-08-01
    “…JSN progression was defined as a minimum joint space width (JSW) loss of ≥0.7 mm, and pain progression as a sustained (≥2 time points) increase of ≥9 points on the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain subscale (0-100 scale). Using the eXtreme Gradient Boosting (XGBoost) algorithm, the model was developed in the total development cohort (n = 877) and tested in the total test cohort (n = 876). …”
    Get full text
    Article
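Every result on this page applies eXtreme Gradient Boosting in some form. As background for the technique, the following is a minimal from-scratch sketch of the plain gradient-boosting loop that XGBoost builds on: fit a weak learner (here a depth-1 decision stump) to the residuals of the current ensemble, add it with a small learning rate, and repeat. This is an illustration of the principle only, not the XGBoost library, which adds second-order gradients, regularization, and extensive engineering; the function names `fit_stump` and `gradient_boost` are illustrative.

```python
# Minimal gradient boosting for squared-error regression:
# each round fits a two-leaf "stump" to the residuals of the
# current ensemble, then adds it with a small learning rate.

def fit_stump(x, residual):
    """Find the split threshold minimizing the squared error of a two-leaf fit."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residual) if xi <= t]
        right = [r for xi, r in zip(x, residual) if xi > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

def gradient_boost(x, y, n_rounds=50, lr=0.3):
    base = sum(y) / len(y)          # start from the mean prediction
    pred = [base] * len(x)
    stumps = []
    for _ in range(n_rounds):
        # For squared error, the negative gradient is just the residual.
        residual = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residual)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + lr * sum(s(xi) for s in stumps)

# Toy data: a step function the ensemble recovers closely after 50 rounds.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
model = gradient_boost(x, y)
```

XGBoost wraps this same additive loop in a regularized objective with second-order gradient information and histogram-based tree construction, which is why it recurs as the strongest or near-strongest model in many of the studies listed above.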