81
Artificial Intelligence in Cardiac Surgery: Transforming Outcomes and Shaping the Future
Published 2025-01-01. “…Addressing limitations related to data quality, bias, validation, and regulatory frameworks is essential for its safe and effective implementation. …”
Get full text
Article -
82
Construction and Application of Digital Asset Platform for Hydropower Engineering
Published 2022-01-01. “…The development of informatization and digitization of hydropower engineering is exposed to problems such as inconsistent data standards, uneven data quality, scattered data storage, weak correlation between data, and difficult data sharing. By referring to the concept of asset management, technologies including BIM, digital twins, and big data are used to construct an asset data model through the collection of engineering-related data assets, with the engineering asset codes as the link. In addition, an engineering digital asset platform is designed and developed to support asset management functions such as engineering data asset integration, management queries, download, and sharing, as well as comprehensive application functions such as an overview of basic engineering information, multi-dimensional full-life-cycle asset management, and typical scenario applications. …”
Get full text
Article -
83
Seamless Integration of RESTful Services into the Web of Data
Published 2012-01-01. “…By embracing the heterogeneity, which is unavoidable at such a scale, and accepting the fact that the data quality and meaning are fuzzy, more adaptable, flexible, and extensible systems can be built. …”
Get full text
Article -
84
Regional heat and social attribute aware participant selection mechanism in mobile crowd sensing
Published 2020-02-01. “…Aiming at the problem that tasks acquired by the platform have low reliability and are difficult to complete on time in user-sparse areas, a participant selection mechanism combining regional heat and social attribute awareness was proposed. Firstly, considering the influence of different regional heat levels on task completion, the regional heat was evaluated according to the number of active users, the average residence time of users, and the completion of historical tasks. Secondly, to analyze the impact of user social attributes on task completion, the user willingness, reputation, and activity were calculated by combining the status information of users with their historical task records. Finally, taking the above factors into account, two different participant selection mechanisms for social attribute perception were designed for high- and low-heat areas to maximize the quality and number of task completions, respectively. The results show that the proposed mechanism can significantly improve overall data quality and can also complete sensing tasks in sparse areas on time. Meanwhile, compared with SUR and GGA-I, the failure rate is reduced by 66.7% and 50.6%, respectively. …”
Get full text
Article -
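The regional-heat evaluation described in the abstract above (number of active users, average residence time, historical task completion) can be sketched as a weighted score. The weights, the saturation thresholds, and the `regional_heat` helper below are illustrative assumptions, not the paper's actual parameters or mechanism.

```python
def regional_heat(active_users: int, avg_residence_min: float,
                  historical_completion_rate: float,
                  w=(0.4, 0.3, 0.3)) -> float:
    """Combine the three signals from the abstract into one heat score in [0, 1].

    The normalization below is an assumption: raw counts and durations are
    squashed with simple saturating transforms before weighting.
    """
    u = min(active_users / 100.0, 1.0)        # saturate at 100 active users
    t = min(avg_residence_min / 60.0, 1.0)    # saturate at 60 minutes
    c = max(0.0, min(historical_completion_rate, 1.0))
    return w[0] * u + w[1] * t + w[2] * c

# A busy area should score higher than a sparse one.
hot = regional_heat(active_users=80, avg_residence_min=45,
                    historical_completion_rate=0.9)
cold = regional_heat(active_users=5, avg_residence_min=10,
                     historical_completion_rate=0.3)
assert hot > cold
```

A platform could then route quality-maximizing selection to high-heat regions and coverage-oriented selection to low-heat ones, which is the split the abstract describes.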
85
The Application of Artificial Intelligence (AI) in Accounting Forecasting: A Literature Review and Future Research Agenda
Published 2025-01-01. “…Major implementation challenges include data quality, infrastructure limitations, and data security. …”
Get full text
Article -
86
Analyzing the Application of Machine Learning in Anemia Prediction
Published 2025-01-01. “…Obstacles such as data quality, feature selection, and model interpretability continue to hinder clinical adoption. …”
Get full text
Article -
87
The effect of teacher teaching, learning methods, and students’ perceptions on students’ learning achievement in Medan city
Published 2023-03-01. “…In qualitative research, the researcher serves as a human instrument: determining the focus of the research, selecting informants as data sources, assessing data quality, analyzing data, interpreting data, and drawing conclusions from the findings. …”
Get full text
Article -
88
Data protection impact assessment system in the mode of risk management
Published 2018-08-01. “…In the era of big data, the risk management approach has been broadly applied in the field of personal information protection. Data protection impact assessment has become an important system for promoting data protection. Taking the data protection impact assessment provisions of the European General Data Protection Regulation (GDPR) 2016 as its sample, and using literature research and empirical analysis methods, this study analyzes in depth the theoretical background, rise and evolution, and meaning and scope of data protection impact assessment, in order to establish a standardized and specific impact assessment system and promote personal information protection. Assessment content includes not only privacy risk assessment but also data security, data quality, and non-discrimination. Data protection impact assessment should be set as a mandatory obligation for data processing activities that are likely to result in high risks. The evaluation process shall take advice from stakeholders so as to reflect their interests. External supervision should be strengthened, and the assessment report should be published appropriately. …”
Get full text
Article -
89
Geometrical Distances of Extragalactic Binaries through Spectroastrometry
Published 2025-01-01. “…Within a specific range of data quality and input parameters, the distance measurement precision of individual binary star systems is generally better than 10%. …”
Get full text
Article -
90
An initial benchmark of the quality of the diagnosis and surgical treatment of breast cancer in South Africa
Published 2025-02-01. “…Most quality indicators were well measurable, but data quality on reoperations and surgeon volumes was poor. …”
Get full text
Article -
91
Modeling Crossing Conflicts at Unsignalized T-Intersections under Heterogeneous Traffic Conditions
Published 2022-01-01. “…However, crash-based safety assessment has known drawbacks related to data quality and coverage. Further, crash-based safety analysis does not account for the fact that not all vehicles interact unsafely. …”
Get full text
Article -
92
Investigating the lateral resolution of the Rayleigh wave focal spot imaging technique using two-dimensional acoustic simulations
Published 2024-02-01. “…The finite data range that is necessary to constrain the Bessel function model controls the lateral spreading of material contrasts, the distinction of two objects on sub-wavelength scales, and the image quality of complex random media. Good data quality from dense networks supports short-range estimates and super-resolution. …”
Get full text
Article -
93
Hotel demand forecasting models and methods using artificial intelligence: A systematic literature review
Published 2024-07-01. “…It addresses the gaps in the literature on AI-based demand forecasting, highlighting the need for clarity in model specification, understanding the impact of AI on pricing accuracy and financial performance, and the challenges of available data quality and computational expertise. The review concludes that AI technology can significantly improve forecasting accuracy and empower data-driven decisions in hotel management. …”
Get full text
Article -
94
Agriculture Supply Chain Management Based on Blockchain Architecture and Smart Contracts
Published 2022-01-01. “…However, these data are notoriously chaotic, and analysts are concerned about their authenticity because there is a significant possibility that others have influenced data quality at various points along the data stream. …”
Get full text
Article -
95
Customer churn prediction based on an ensemble forest meta-learning network
Published 2024-10-01. “…To address the challenge of capturing temporal features in customer churn prediction tasks with tree models, a churn prediction method based on an ensemble forest meta-learning network (EFML) was proposed. Firstly, data quality was improved through grouping strategies, and class imbalance issues were addressed with undersampling techniques. …”
Get full text
Article -
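The undersampling step mentioned in the abstract above can be sketched as random majority-class reduction until all classes match the minority class size. The `undersample` helper and the random strategy are illustrative assumptions; the EFML paper's grouping strategy is not reproduced here.

```python
import random

def undersample(samples, labels, seed=0):
    """Randomly drop majority-class samples until every class has as many
    samples as the smallest class (a common, simple undersampling scheme)."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    n_min = min(len(xs) for xs in by_class.values())
    out = []
    for y, xs in by_class.items():
        for x in rng.sample(xs, n_min):   # keep n_min samples per class
            out.append((x, y))
    rng.shuffle(out)
    return out

# 7 majority-class and 3 minority-class samples -> 3 of each remain.
balanced = undersample(list(range(10)), [0] * 7 + [1] * 3)
assert len(balanced) == 6
```

Undersampling trades discarded majority-class information for a balanced training set, which is why the abstract pairs it with data-quality improvements rather than using it alone.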
96
Fault Diagnosis of Wind Turbine Gearbox based on LSGAN and VMD-MPE-KELM
Published 2021-11-01. “…The experimental results show that the LSGAN algorithm overcomes the problems of GAN gradient disappearance, unstable training, and poor data quality when generating fault samples. The VMD-MPE-KPCA method can effectively extract fault features. …”
Get full text
Article -
97
SMEs probability of default: the case of the hospitality sector
Published 2015-01-01. “…The main reason for our rather poor results could be problems with data quality, possibly because the accounts published by firms are not reliable and tend to present negative results. …”
Get full text
Article -
98
Network intrusion detection method based on VAE-CWGAN and fusion of statistical importance of feature
Published 2024-02-01. “…Considering that traditional intrusion detection methods are limited by class imbalance in datasets and the poor representativeness of selected features, a detection method based on VAE-CWGAN and the fusion of the statistical importance of features was proposed. Firstly, data preprocessing was conducted to enhance data quality. Secondly, a VAE-CWGAN model was constructed to generate new samples, addressing the problem of imbalanced datasets and ensuring that the classification model is no longer biased towards the majority class. Next, the standard deviation and the difference between the median and the mean were used to rank the features and fuse their statistical importance for feature selection, aiming to obtain more representative features that enable the model to better learn the data. Finally, the mixed dataset after feature selection was classified by a one-dimensional convolutional neural network. Experimental results show that the proposed method performs well on three datasets, namely NSL-KDD, UNSW-NB15, and CIC-IDS-2017, with accuracy rates of 98.95%, 96.24%, and 99.92%, respectively, effectively improving intrusion detection performance. …”
Get full text
Article -
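The feature selection step in the abstract above, ranking features by standard deviation and by the difference between median and mean, then fusing the two rankings, can be sketched as follows. The rank-averaging fusion and the `statistical_importance` helper are illustrative assumptions; the paper's exact fusion rule is not given in the snippet.

```python
import statistics

def statistical_importance(columns):
    """Rank features (name -> list of values) by fusing two per-feature
    statistics: standard deviation and |median - mean|.

    Each statistic yields a rank (higher = more important); the fused score
    is the average of the two ranks, an assumed fusion rule.
    """
    names = list(columns)
    std = {f: statistics.pstdev(v) for f, v in columns.items()}
    gap = {f: abs(statistics.median(v) - statistics.mean(v))
           for f, v in columns.items()}

    def ranks(score):
        order = sorted(names, key=lambda f: score[f])
        return {f: i for i, f in enumerate(order)}

    r_std, r_gap = ranks(std), ranks(gap)
    fused = {f: (r_std[f] + r_gap[f]) / 2 for f in names}
    return sorted(names, key=lambda f: fused[f], reverse=True)

data = {
    "flat":   [5.0, 5.0, 5.0, 5.0],    # no spread, no median/mean gap
    "spread": [0.0, 10.0, 0.0, 10.0],  # high spread, symmetric
    "skewed": [1.0, 1.0, 1.0, 9.0],    # spread plus a median/mean gap
}
ranking = statistical_importance(data)
assert ranking[-1] == "flat"  # a constant feature carries no information
```

Both statistics are cheap to compute per feature, which suits a preprocessing stage that runs before training the one-dimensional CNN classifier.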
99
Distributed data trading algorithm based on multi-objective utility optimization
Published 2021-02-01. “…Traditional centralized data trading models are not well suited to the current intelligent era in which everything is interconnected and real-time data is generated; to maximize the use of collected data, it is essential to design an effective data trading framework. Therefore, a distributed data trading framework based on a consortium blockchain was proposed, which realizes P2P data trading without relying on a third party. Aiming at the problem that existing data trading models only consider factors of the data itself and ignore factors related to user tasks, a bi-level multi-objective optimization model was constructed based on multi-dimensional factors, such as data quality, data attributes, attribute relevance, and consumer competition, to optimize the utilities of the data provider (DP) and the data consumer (DC). To solve this model, an improved multi-objective genetic algorithm, collaborative NSGA-II, was proposed, computed through the cooperation of the DP, the DC, and the data aggregator (AG). The simulation results show that collaborative NSGA-II achieves better performance in terms of the utilities of the DP and DC, thus realizing more effective data trading. …”
Get full text
Article -
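The multi-objective optimization in the abstract above trades off the utilities of the data provider and the data consumer; NSGA-II-style algorithms are built on the Pareto dominance relation between such utility pairs. A minimal sketch of that relation, with assumed utility tuples (not the paper's bi-level model):

```python
def dominates(u, v):
    """u Pareto-dominates v if u is at least as good in every objective
    and strictly better in at least one (both utilities are maximized)."""
    return all(a >= b for a, b in zip(u, v)) and \
           any(a > b for a, b in zip(u, v))

def pareto_front(points):
    """Return the non-dominated subset of (DP utility, DC utility) pairs."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical trade offers as (DP utility, DC utility) pairs.
offers = [(0.9, 0.2), (0.5, 0.5), (0.2, 0.9), (0.4, 0.4)]
front = pareto_front(offers)
assert (0.4, 0.4) not in front  # dominated by (0.5, 0.5)
```

NSGA-II repeatedly sorts a population by this dominance relation and keeps the best fronts, which is how a collaborative variant can balance DP and DC utilities without collapsing them into one weighted objective.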
100
Task security scheduling method for a 5G+MEC-based grid edge computing platform
Published 2022-12-01. “…In order to ensure the security of task scheduling on a grid edge computing platform and the quality of the data required by task scheduling, a task security scheduling method for a grid edge computing platform based on 5G + MEC was proposed. Combining confidentiality and integrity services, a security level model of task scheduling was constructed to constrain the risk in the scheduling and transmission of the scheduling task queue, so as to realize secure transmission over the 5G core network. The priority queue type was confirmed, the minimum and maximum queues were selected, the maximization of data resources and the task scheduling of MEC equipment were supported, and a distributed task scheduling model was built. A Lyapunov candidate function was used to improve the stability of task scheduling, and the model was solved by the alternating direction method of multipliers to obtain the optimal task security schedule. The test results show that after applying this method, the risk probability fluctuates in the range 0.15 to 0.35, the fit between the relevant data provided by MEC equipment and the scheduling tasks of the core server is higher than 0.92, and the quality score of the task scheduling data is also higher than 0.94. …”
Get full text
Article