Showing results 241–260 of 553 for search 'boosting parameter evaluation' (query time: 0.10 s)
  241.

    Comparison of sample preparation methods for higher heating values in various sugarcane varieties using near-infrared spectroscopy by Kantisa Phoomwarin, Khwantri Saengprachatanarug, Jetsada Posom, Seree Wongpichet, Kittipong Laloon, Arthit Phuphaphud

    Published 2025-08-01
    “…This study developed an efficient system for measuring the energy characteristics of energy canes in breeding programs using near-infrared spectroscopy with the aim of significantly improving the accuracy of selecting high-performing sugarcane clones. A key parameter for evaluating energy potential is the heating value, which is typically determined using bomb calorimetry. …”
    Get full text
    Article
  242.

    Alpine Meadow Fractional Vegetation Cover Estimation Using UAV-Aided Sentinel-2 Imagery by Kai Du, Yi Shao, Naixin Yao, Hongyan Yu, Shaozhong Ma, Xufeng Mao, Litao Wang, Jianjun Wang

    Published 2025-07-01
    “…The performance of these estimates was evaluated against reference FVC values derived from centimeter-level UAV data. …”
    Get full text
    Article
  243.

    Machine-Learning-Driven Analysis of Wear Loss and Frictional Behavior in Magnesium Hybrid Composites by Barun Haldar, Hillol Joardar, Arpan Kumar Mondal, Nashmi H. Alrasheedi, Rashid Khan, Murugesan P. Papathi

    Published 2025-05-01
    “…The performance evaluation showed that the ML models effectively predicted the friction and wear behavior of magnesium-based hybrid composites using tribological test data. …”
    Get full text
    Article
  244.

    Enhanced slope stability prediction using ensemble machine learning techniques by Devendra Kumar Yadav, Swarup Chattopadhyay, Debi Prasad Tripathy, Pragyan Mishra, Pritiranjan Singh

    Published 2025-03-01
    “…We improved the slope stability prediction models through random cross-validation by selecting seven quantitative parameters based on 125 data points. From a classification perspective, the best slope prediction accuracy (>90%) was attained by bagging with a Decision Tree (DT) base classifier, boosting with a Random Forest (RF) base classifier, and a random forest with the Gini-index splitting criterion. …”
    Get full text
    Article
  245.

    Machine-Learning-Based Optimal Feed Rate Determination in Machining: Integrating GA-Calibrated Cutting Force Modeling and Vibration Analysis by Yu-Peng Yeh, Han-Hao Tsai, Jen-Yuan Chang

    Published 2025-06-01
    “…These optimal feed rates are then used to train an Extreme Gradient Boosting (XGBoost) regression model, with Bayesian optimization employed for hyperparameter tuning. …”
    Get full text
    Article
  246.

    Clinical prediction of intravenous immunoglobulin-resistant Kawasaki disease based on interpretable Transformer model. by Gahao Chen, Ziwei Yang

    Published 2025-01-01
    “…Six machine learning algorithms - Random Forest (RF), AdaBoost, Light Gradient Boosting Machine (LightGBM), eXtreme Gradient Boosting (XGBoost), Categorical Boosting (CatBoost), and Tabular Prior-data Fitted Network version 2.0 (TabPFN-V2) - were implemented with five-fold cross-validation to optimize model hyperparameters. …”
    Get full text
    Article
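Record 246 above tunes model hyperparameters with five-fold cross-validation. As a generic illustration of that procedure (the candidate grid and scoring function below are hypothetical stand-ins, not taken from the paper), the data are split into five folds and the candidate with the best mean validation score is kept:

```python
import statistics

def k_fold_indices(n_samples, k=5):
    """Yield (train, validation) index lists for k-fold cross-validation."""
    indices = list(range(n_samples))
    fold_size, remainder = divmod(n_samples, k)
    start = 0
    for fold in range(k):
        stop = start + fold_size + (1 if fold < remainder else 0)
        yield indices[:start] + indices[stop:], indices[start:stop]
        start = stop

def select_hyperparameter(candidates, score_fn, n_samples, k=5):
    """Pick the candidate with the best mean validation score across folds.

    score_fn(candidate, train_idx, val_idx) stands in for fitting a model
    on the training fold and scoring it on the validation fold.
    """
    best, best_score = None, float("-inf")
    for cand in candidates:
        fold_scores = [score_fn(cand, train, val)
                       for train, val in k_fold_indices(n_samples, k)]
        mean_score = statistics.mean(fold_scores)
        if mean_score > best_score:
            best, best_score = cand, mean_score
    return best, best_score
```

Each sample appears in exactly one validation fold, so every candidate is judged on all of the data without ever being scored on points it was trained on.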
  247.

    Developing an Integrative Data Intelligence Model for Construction Cost Estimation by Zainab Hasan Ali, Abbas M. Burhan, Murizah Kassim, Zainab Al-Khafaji

    Published 2022-01-01
    “…Statistical indicators and graphical methods were used to evaluate the developed models. Several input predictors were used, and XGBoost highlighted inflation as the most crucial parameter. …”
    Get full text
    Article
  248.

    Visceral adiposity index as a predictor of metabolic dysfunction-associated steatotic liver disease: a cross-sectional study by Tuo Zhou, Xiang Ding, Linjie Chen, Qianxiong Huang, Linfang He

    Published 2025-05-01
    “…Machine learning models demonstrated robust predictive accuracy, with random forest (AUC=0.869) and gradient boosting machine (AUC=0.868) outperforming non-invasive scores. …”
    Get full text
    Article
  249.

    Machine Learning-Based Prediction Performance Comparison of Marshall Stability and Flow in Asphalt Mixtures by Muhammad Farhan Zahoor, Arshad Hussain, Afaq Khattak

    Published 2025-06-01
    “…We collected data from published studies in the literature encompassing 732 data points to train and evaluate ML models. Eight key input parameters were considered for modeling. …”
    Get full text
    Article
  250.

    Evolutionary Game Analysis of Green Technology Innovation Behaviour for Enterprises from the Perspective of Prospect Theory by Guancen Wu, Luqi Deng, Xing Niu

    Published 2022-01-01
    “…This paper first calculates the equilibrium stability and evolutionary stability strategies of the enterprise green technology innovation system, and then simulates the effect of subjective gain and loss values and other psychological parameters in the prospect editing and evaluation stages. …”
    Get full text
    Article
  251.

    A machine learning model with crude estimation of property strategy for performance prediction of perovskite solar cells based on process optimization by Dan Li, Ernie Che Mid, Shafriza Nisha Basah, Xiaochun Liu, Jian Tang, Hongyan Cui, Huilong Su, Qianliang Xiao, Shiyin Gong

    Published 2024-12-01
    “…However, optimizing the preparation parameters for PSCs is crucial. This study establishes a machine learning model incorporating a crude estimation of property (CEP) strategy to enhance prediction accuracy and precisely control process parameters. …”
    Get full text
    Article
  252.

    Separation of organic molecules from water by design of membrane using mass transfer model analysis and computational machine learning by Suranjana V. Mayani, Hessan Mohammad, Soumya V. Menon, Rishabh Thakur, Abdulqader Faris Abdulqader, S. Supriya, Prabhat Kumar Sahu, Kamal Kant Joshi

    Published 2025-07-01
    “…Utilizing a dataset of over 25,000 data points with r(m) and z(m) as inputs, four tree-based learning algorithms were employed: Decision Tree (DT), Extremely Randomized Trees (ET), Random Forest (RF), and Histogram-based Gradient Boosting Regression (HBGB). Hyper-parameter optimization was conducted using Successive Halving, a method aimed at efficiently allocating computational resources to optimize model performance. …”
    Get full text
    Article
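Record 252 above optimizes hyperparameters with Successive Halving, which gives a small resource budget to many candidates and repeatedly doubles the budget while keeping only the best fraction. A minimal sketch of the idea, with a hypothetical `score_fn` standing in for training and scoring a model at a given budget:

```python
def successive_halving(candidates, score_fn, min_budget=1, eta=2):
    """Keep the best 1/eta of the candidates each round, growing the budget.

    score_fn(candidate, budget) returns a score that may improve with budget;
    both the candidates and the scoring function are illustrative stand-ins.
    """
    budget = min_budget
    survivors = list(candidates)
    while len(survivors) > 1:
        ranked = sorted(survivors, key=lambda c: score_fn(c, budget), reverse=True)
        survivors = ranked[:max(1, len(ranked) // eta)]
        budget *= eta  # the remaining candidates earn more resources
    return survivors[0]
```

The appeal is efficiency: weak candidates are discarded after only a cheap evaluation, so most of the compute is spent refining the few strong ones.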
  253.

    DSR-YOLO: A lightweight and efficient YOLOv8 model for enhanced pedestrian detection by Mustapha Oussouaddi, Omar Bouazizi, Aimad El mourabit, Zine el Abidine Alaoui Ismaili, Yassine Attaoui, Mohamed Chentouf

    Published 2025-01-01
    “…The WIoUv3 loss function was utilized to reduce the regression loss associated with bounding boxes, further boosting the performance. Evaluated on the CityPersons dataset, DSR-YOLO outperformed YOLOv8n with a 14.9 % increase in mAP@50 and 6.3 % increase in mAP@50:95, while maintaining competitive FLOPS, parameter counts, and inference speed.…”
    Get full text
    Article
  254.

    Comparative Analysis of Machine Learning Algorithms for Potential Evapotranspiration Estimation Using Limited Data at a High-Altitude Mediterranean Forest by Stefanos Stefanidis, Konstantinos Ioannou, Nikolaos Proutsos, Ilias Karmiris, Panagiotis Stefanidis

    Published 2025-07-01
    “…Accurate estimation of potential evapotranspiration (PET) is of paramount importance for water resource management, especially in Mediterranean mountainous environments that are often data-scarce and highly sensitive to climate variability. This study evaluates the performance of four machine learning (ML) regression algorithms—Support Vector Regression (SVR), Random Forest Regression (RFR), Gradient Boosting Regression (GBR), and K-Nearest Neighbors (KNN)—in predicting daily PET using limited meteorological data from a high-altitude forest in Central Greece. …”
    Get full text
    Article
  255.

    Machine learning frameworks to accurately estimate the adsorption of organic materials onto resin and biochar by Raouf Hassan, Mohammad Reza Kazemi

    Published 2025-04-01
    “…Various machine learning methods were evaluated, including Linear Regression, Ridge Regression, Lasso Regression, Elastic Net, Support Vector Regression (SVR), k-Nearest Neighbors (KNN), Decision Trees, Random Forests, Gradient Boosting Machines, Artificial Neural Networks (ANNs), Convolutional Neural Networks (CNNs), Gaussian Processes, as well as ensemble algorithms such as XGBoost, LightGBM, and CatBoost. …”
    Get full text
    Article
  256.

    Optimizing protein-ligand docking through machine learning: algorithm selection with AutoDock Vina by Ala’ Omar Hasan Zayed

    Published 2025-07-01
    “…However, its application in virtual screening across diverse molecular structures presents challenges, particularly in optimizing search parameters. To address these limitations, we developed a machine learning (ML) framework that automates the selection of optimal docking parameters. …”
    Get full text
    Article
  257.

    An Explainable Machine Learning-Based Prediction of Backbone Curves for Reduced Beam Section Connections Under Cyclic Loading by Emrah Tasdemir, Mustafa Yavuz Cetinkaya, Furkan Uysal, Samer El-Zahab

    Published 2025-06-01
    “…Additionally, Shapley values from XAI are employed to evaluate the influence of input parameters on model predictions. …”
    Get full text
    Article
  258.

    From data to decisions: Leveraging ML for improved river discharge forecasting in Bangladesh by Md. Abu Saleh, H.M. Rasel, Briti Ray

    Published 2024-01-01
    “…The forecast was performed for the period 2021 to 2030, and eleven statistical parameters were considered for performance evaluation. …”
    Get full text
    Article
  259.

    Machine learning approaches for predicting the structural number of flexible pavements based on subgrade soil properties by Asadullah Ziar

    Published 2025-08-01
    “…Four algorithms were evaluated, including random forest, extreme gradient boosting, gradient boosting, and K nearest neighbors. …”
    Get full text
    Article
  260.

    Understanding the flowering process of litchi through machine learning predictive models by SU Zuanxian, NING Zhenchen, WANG Qing, CHEN Houbin

    Published 2025-05-01
    “…Six classical machine learning algorithms, including Classification and Regression Tree (CART), K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Random Forest (RF), Stepwise Regression (STR), and Gradient Boosting Machine (GBM), were used for training. The algorithms (RF and STR) with the smallest Mean Absolute Error (MAE), the lowest Root Mean Square Error (RMSE), and the highest correlation coefficient (Rp²) were selected for further parameter optimization and evaluation. …”
    Get full text
    Article
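Several of the records above (e.g. 260) select models by MAE, RMSE, and a squared correlation coefficient. These metrics are standard and can be computed directly from predicted and observed values; this is a generic sketch, not code from any of the studies:

```python
import math

def mae(actual, predicted):
    """Mean Absolute Error: average magnitude of the residuals."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Square Error: penalizes large residuals more than MAE."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def r2(actual, predicted):
    """Coefficient of determination: 1 minus residual/total sum of squares."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot
```

A perfect model gives MAE = RMSE = 0 and R² = 1; a constant prediction at the mean of the observations gives R² = 0.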