-
241
Comparison of sample preparation methods for higher heating values in various sugarcane varieties using near-infrared spectroscopy
Published 2025-08-01“…This study developed an efficient system for measuring the energy characteristics of energy canes in breeding programs using near-infrared spectroscopy with the aim of significantly improving the accuracy of selecting high-performing sugarcane clones. A key parameter for evaluating energy potential is the heating value, which is typically determined using bomb calorimetry. …”
Get full text
Article -
242
Alpine Meadow Fractional Vegetation Cover Estimation Using UAV-Aided Sentinel-2 Imagery
Published 2025-07-01“…The performance of these estimates was evaluated against reference FVC values derived from centimeter-level UAV data. …”
Get full text
Article -
243
Machine-Learning-Driven Analysis of Wear Loss and Frictional Behavior in Magnesium Hybrid Composites
Published 2025-05-01“…The performance evaluation showed that ML models effectively predicted friction behavior and wear behavior of magnesium-based hybrid composites using tribological test data. …”
Get full text
Article -
244
Enhanced slope stability prediction using ensemble machine learning techniques
Published 2025-03-01“…We improved slope stability prediction models through random cross-validation, selecting seven quantitative parameters across 125 data points. From a classification perspective, the best slope prediction accuracy (>90%) was attained by bagging with a Decision Tree (DT) base classifier, boosting with a Random Forest (RF) base classifier, and a random forest with the Gini-index splitting criterion. …”
Get full text
Article -
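The ensemble setup described in the abstract above (bagging over a Decision Tree base classifier and a random forest with Gini splitting, scored by random cross-validation) can be sketched with scikit-learn. Synthetic data stands in for the paper's 125-point, seven-parameter slope dataset, which is not reproduced here.

```python
# Sketch of the ensemble comparison above: bagging with a Decision Tree base
# classifier vs. a Random Forest with the Gini splitting criterion.
# Synthetic stand-in for the seven quantitative slope parameters.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=125, n_features=7, random_state=0)

models = {
    "bagging_dt": BaggingClassifier(DecisionTreeClassifier(), random_state=0),
    "random_forest_gini": RandomForestClassifier(criterion="gini", random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # random cross-validation
    print(name, round(scores.mean(), 3))
```

The paper's >90% accuracy refers to its own dataset; on this toy problem the numbers only illustrate the workflow.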
245
Machine-Learning-Based Optimal Feed Rate Determination in Machining: Integrating GA-Calibrated Cutting Force Modeling and Vibration Analysis
Published 2025-06-01“…These optimal feed rates are then used to train an Extreme Gradient Boosting (XGBoost) regression model, with Bayesian optimization employed for hyperparameter tuning. …”
Get full text
Article -
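The tuning loop described above (a boosted regressor with search-based hyperparameter tuning) can be sketched as follows. The paper uses XGBoost with Bayesian optimization; here scikit-learn's GradientBoostingRegressor and RandomizedSearchCV stand in so the example runs without extra packages, and the feed-rate data is synthetic.

```python
# Stand-in sketch for the paper's XGBoost + Bayesian optimization pipeline:
# a gradient-boosted regressor tuned by randomized search over a small grid.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))  # synthetic stand-in for cutting parameters
y = X @ np.array([1.5, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=200)

search = RandomizedSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_distributions={
        "n_estimators": [50, 100, 200],
        "learning_rate": [0.01, 0.05, 0.1],
        "max_depth": [2, 3, 4],
    },
    n_iter=5, cv=3, random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

A true Bayesian optimizer (e.g. via a dedicated package) would replace the random sampling with a surrogate-model-guided search; the surrounding cross-validated fit/score loop is the same.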
246
Clinical prediction of intravenous immunoglobulin-resistant Kawasaki disease based on interpretable Transformer model.
Published 2025-01-01“…Six machine learning algorithms - Random Forest (RF), AdaBoost, Light Gradient Boosting Machine (LightGBM), eXtreme Gradient Boosting (XGBoost), Categorical Boosting (CatBoost), and Tabular Prior-data Fitted Network version 2.0 (TabPFN-V2) - were implemented with five-fold cross-validation to optimize model hyperparameters. …”
Get full text
Article -
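The five-fold cross-validated hyperparameter optimization described above follows a standard pattern; a minimal sketch with scikit-learn's Random Forest is shown below on synthetic data. The other boosters named in the abstract (LightGBM, XGBoost, CatBoost, TabPFN) plug into the same loop via their scikit-learn-compatible wrappers.

```python
# Sketch of five-fold cross-validated hyperparameter selection, as described
# above, shown for a Random Forest on synthetic (non-clinical) data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=300, n_features=10, random_state=42)

grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 5]},
    cv=5,              # five-fold cross-validation
    scoring="roc_auc",
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```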
247
Developing an Integrative Data Intelligence Model for Construction Cost Estimation
Published 2022-01-01“…Statistical indicators and graphical methods were used to evaluate the developed models. Several input predictors were used, and XGBoost highlighted inflation as the most crucial parameter. …”
Get full text
Article -
248
Visceral adiposity index as a predictor of metabolic dysfunction-associated steatotic liver disease: a cross-sectional study
Published 2025-05-01“…Machine learning models demonstrated robust predictive accuracy, with random forest (AUC=0.869) and gradient boosting machine (AUC=0.868) outperforming non-invasive scores. …”
Get full text
Article -
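The AUC comparison described above (random forest vs. gradient boosting machine) can be sketched with scikit-learn; the reported AUCs of 0.869 and 0.868 come from the paper's cohort, not from this toy example on synthetic data.

```python
# Sketch of comparing two classifiers by held-out AUC, as in the abstract
# above. Synthetic data stands in for the clinical cohort.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=8, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

for model in (RandomForestClassifier(random_state=1),
              GradientBoostingClassifier(random_state=1)):
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(type(model).__name__, round(roc_auc_score(y_te, proba), 3))
```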
249
Machine Learning-Based Prediction Performance Comparison of Marshall Stability and Flow in Asphalt Mixtures
Published 2025-06-01“…We collected data from published studies in the literature encompassing 732 data points to train and evaluate ML models. Eight key input parameters were considered for modeling. …”
Get full text
Article -
250
Evolutionary Game Analysis of Green Technology Innovation Behaviour for Enterprises from the Perspective of Prospect Theory
Published 2022-01-01“…This paper first calculates the equilibrium stability and evolutionary stability strategies of the enterprise green technology innovation system and then simulates the effect of subjective gains and losses values and other psychological parameters in the prospect editing and evaluation stage. …”
Get full text
Article -
251
A machine learning model with crude estimation of property strategy for performance prediction of perovskite solar cells based on process optimization
Published 2024-12-01“…However, optimizing the preparation parameters for PSCs is crucial. This study establishes a machine learning model incorporating a crude estimation of property (CEP) strategy to enhance prediction accuracy and precisely control process parameters. …”
Get full text
Article -
252
Separation of organic molecules from water by design of membrane using mass transfer model analysis and computational machine learning
Published 2025-07-01“…Utilizing a dataset of over 25,000 data points with r(m) and z(m) as inputs, four tree-based learning algorithms were employed: Decision Tree (DT), Extremely Randomized Trees (ET), Random Forest (RF), and Histogram-based Gradient Boosting Regression (HBGB). Hyper-parameter optimization was conducted using Successive Halving, a method aimed at efficiently allocating computational resources to optimize model performance. …”
Get full text
Article -
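The Successive Halving optimization described above is available in scikit-learn (still behind an experimental import); a minimal sketch for one of the tree-based regressors follows. Synthetic 2-feature data stands in for the paper's 25,000-point r(m)/z(m) dataset.

```python
# Sketch of Successive Halving hyperparameter search, as described above,
# applied to a Random Forest regressor. Candidates are evaluated on growing
# sample sizes, with poor performers eliminated each round.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingRandomSearchCV

X, y = make_regression(n_samples=400, n_features=2, noise=0.1, random_state=0)

search = HalvingRandomSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions={"n_estimators": [50, 100, 200],
                         "max_depth": [3, 5, None]},
    resource="n_samples",  # halve by allocating more samples each round
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

The appeal of Successive Halving, as the abstract notes, is budget efficiency: weak configurations are discarded cheaply on small resource allocations before full-data fits are spent on the survivors.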
253
DSR-YOLO: A lightweight and efficient YOLOv8 model for enhanced pedestrian detection
Published 2025-01-01“…The WIoUv3 loss function was utilized to reduce the regression loss associated with bounding boxes, further boosting performance. Evaluated on the CityPersons dataset, DSR-YOLO outperformed YOLOv8n with a 14.9 % increase in mAP@50 and a 6.3 % increase in mAP@50:95, while maintaining competitive FLOPs, parameter counts, and inference speed.…”
Get full text
Article -
254
Comparative Analysis of Machine Learning Algorithms for Potential Evapotranspiration Estimation Using Limited Data at a High-Altitude Mediterranean Forest
Published 2025-07-01“…Accurate estimation of potential evapotranspiration (PET) is of paramount importance for water resource management, especially in Mediterranean mountainous environments that are often data-scarce and highly sensitive to climate variability. This study evaluates the performance of four machine learning (ML) regression algorithms—Support Vector Regression (SVR), Random Forest Regression (RFR), Gradient Boosting Regression (GBR), and K-Nearest Neighbors (KNN)—in predicting daily PET using limited meteorological data from a high-altitude forest in Central Greece. …”
Get full text
Article -
255
Machine learning frameworks to accurately estimate the adsorption of organic materials onto resin and biochar
Published 2025-04-01“…Various machine learning methods were evaluated, including Linear Regression, Ridge Regression, Lasso Regression, Elastic Net, Support Vector Regression (SVR), k-Nearest Neighbors (KNN), Decision Trees, Random Forests, Gradient Boosting Machines, Artificial Neural Networks (ANNs), Convolutional Neural Networks (CNNs), Gaussian Processes, as well as ensemble algorithms such as XGBoost, LightGBM, and CatBoost. …”
Get full text
Article -
256
Optimizing protein-ligand docking through machine learning: algorithm selection with AutoDock Vina
Published 2025-07-01“…However, its application in virtual screening across diverse molecular structures presents challenges, particularly in optimizing search parameters. To address these limitations, we developed a machine learning (ML) framework that automates the selection of optimal docking parameters. …”
Get full text
Article -
257
An Explainable Machine Learning-Based Prediction of Backbone Curves for Reduced Beam Section Connections Under Cyclic Loading
Published 2025-06-01“…Additionally, Shapley values from XAI are employed to evaluate the influence of input parameters on model predictions. …”
Get full text
Article -
258
From data to decisions: Leveraging ML for improved river discharge forecasting in Bangladesh
Published 2024-01-01“…Forecasts were produced for 2021 to 2030, and eleven statistical parameters were considered for performance evaluation. …”
Get full text
Article -
259
Machine learning approaches for predicting the structural number of flexible pavements based on subgrade soil properties
Published 2025-08-01“…Four algorithms were evaluated, including random forest, extreme gradient boosting, gradient boosting, and K nearest neighbors. …”
Get full text
Article -
260
Understanding the flowering process of litchi through machine learning predictive models
Published 2025-05-01“…Six classical machine learning algorithms, including Classification and Regression Tree (CART), K-Nearest Neighbors (KNN), Support Vector Machine (SVM), Random Forest (RF), Stepwise Regression (STR), and Gradient Boosting Machine (GBM), were used for training. The algorithms (RF and STR) with the smallest Mean Absolute Error (MAE), the lowest Root Mean Square Error (RMSE), and the highest coefficient of determination (R²) were selected for further parameter optimization and evaluation. …”
Get full text
Article
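The model-selection metrics named in the final abstract above (MAE, RMSE, and R²) can be computed with scikit-learn as sketched below on a toy regression; the litchi-flowering data itself is not reproduced here.

```python
# Sketch of scoring a regressor by MAE, RMSE, and R², the selection criteria
# named above. MAE and RMSE: lower is better; R²: higher is better.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=300, n_features=6, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pred = RandomForestRegressor(random_state=0).fit(X_tr, y_tr).predict(X_te)
mae = mean_absolute_error(y_te, pred)
rmse = np.sqrt(mean_squared_error(y_te, pred))
r2 = r2_score(y_te, pred)
print(round(mae, 2), round(rmse, 2), round(r2, 3))
```

Note that RMSE is never smaller than MAE for the same predictions, and the two diverge as large errors dominate, which is why abstracts typically report both alongside R².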