School of Computer Science and Informatics
Recent Submissions
Item Embargo
A GRA-based heterogeneous multi-attribute group decision-making method with attribute interactions (Elsevier, 2025-01-30)
Feng, Yu; Dang, Yaoguo; Wang, Junjie; Du, Junliang; Chiclana, Francisco
In the era of VUCA (Volatility, Uncertainty, Complexity, Ambiguity), multi-attribute group decision-making (MAGDM) problems face the challenges of heterogeneous uncertainty in decision information and complex interactions between attributes, which greatly affect the reliability of decision-making outcomes. To address these challenges, this paper proposes a novel heterogeneous MAGDM method based on grey relational analysis (GRA) that considers attribute interactions. First, the heterogeneous information is integrated, including crisp numbers, generalized grey numbers, intuitionistic fuzzy numbers, hesitant fuzzy numbers, and probabilistic linguistic term sets. Then, by incorporating the 2-additive Choquet integral into GRA, we establish a heterogeneous grey interactive relational model and explore its properties. Subsequently, a heterogeneous grey relational Mahalanobis-Taguchi System is designed to estimate the Shapley values of attributes. Additionally, a two-stage resolution mechanism, comprising a consensus reaching process followed by a grey relational multi-objective programming model, is devised to determine the interaction indices. Finally, the effectiveness of the proposed method is demonstrated through a case study from China’s aviation manufacturing industry, along with sensitivity analysis and comparison analyses.
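The 2-additive Choquet integral that the method embeds in GRA can be written entirely in terms of attribute Shapley values and pairwise interaction indices. A minimal sketch of that aggregation step, assuming illustrative attribute names, Shapley values and interaction indices rather than the paper's case-study data:

```python
from itertools import combinations

def choquet_2additive(x, shapley, interaction):
    """2-additive Choquet integral from Shapley values and interaction indices.

    x          : dict attribute -> normalised score in [0, 1]
    shapley    : dict attribute -> Shapley value (non-negative, sums to 1)
    interaction: dict frozenset({a, b}) -> interaction index I_ab in [-1, 1]
    """
    total = 0.0
    for a, b in combinations(sorted(x), 2):
        i_ab = interaction.get(frozenset({a, b}), 0.0)
        if i_ab >= 0:        # complementary attributes: reward joint satisfaction
            total += i_ab * min(x[a], x[b])
        else:                # redundant attributes: count only the better of the two
            total += -i_ab * max(x[a], x[b])
    for a in x:              # linear part, reduced by half of the interaction mass
        penalty = 0.5 * sum(abs(interaction.get(frozenset({a, b}), 0.0))
                            for b in x if b != a)
        total += x[a] * (shapley[a] - penalty)
    return total

# Illustrative attributes with one negative (redundant) and one positive interaction.
scores      = {"cost": 0.7, "quality": 0.9, "delivery": 0.6}
shapley     = {"cost": 0.4, "quality": 0.35, "delivery": 0.25}
interaction = {frozenset({"cost", "quality"}): -0.2,
               frozenset({"quality", "delivery"}): 0.3}
print(round(choquet_2additive(scores, shapley, interaction), 4))
```

Positive interaction indices reward attributes that score well together, while negative indices discount redundant pairs, which is the behaviour a simple weighted average cannot capture.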
Item Embargo
The value of expert judgments in Decision Support Systems (Elsevier, 2025-01-30)
Sáenz-Royo, Carlos; Chiclana, Francisco
It is a challenge to improve a decision support system (DSS) based on expert judgments; the literature proposes to improve accuracy and performance by increasing the sophistication and complexity of the DSS, but at what cost? This study presents a model for encoding a DSS based on expert judgments and evaluating its efficiency, establishing a three-part analysis structure: information requirements (number of judgments), quality requirements (quality assurance mechanisms), and algorithmic complexity. With a focus on the cost of judgments, a systematic and quantitative coding of the performance and cost in each part of the DSS is established. A “break-even point” efficiency measure, defined as the maximum percentage of the optimal performance that can be paid per unit of resources, is proposed to ensure that the use of the DSS remains profitable. Counterintuitively, the results of a case study show that the efficiency of DSSs does not necessarily increase with respect to the informativeness level of DSSs. Overall, this study provides a new method for evaluating the efficiency of DSSs.

Item Open Access
Does current AI represent a dead end? (British Computer Society/ Oxford University Press, 2024-11-06)
Boiten, Eerke Albert
Eerke Boiten explains his belief that current AI should not be used for serious applications.

Item Metadata only
Improving environmental performance of solid fuel heating systems: Reducing pollution resulting from domestic burning or agricultural practices: Phase 1 report (Innovate-UK Defra, 2024-04-20)
Tiwary, Abhishek

Item Open Access
Enhancing Democratic Processes: A Survey of DRE, Internet, and Blockchain in Electronic Voting Systems (IEEE, 2025-01-17)
Alown, Mosbah; Kiraz, Mehmet Sabir; Bingol, Muhammed Ali
Electronic voting (e-voting) systems have significantly improved the traditional voting process by addressing key concerns such as security, public acceptability, and convenience. However, these systems often face unique challenges, such as ensuring voter privacy and verifiability, preventing coercion and double voting, and maintaining scalability while protecting participant confidentiality. This study critically analyses and compares various e-voting schemes and technologies, evaluating their security features, verifiability mechanisms, and potential vulnerabilities. This paper reviews Direct Recording Electronic (DRE) voting, internet voting, and blockchain-based e-voting systems. In so doing, we provide an understanding of cryptographic primitives employed in e-voting systems and how they address specific characteristics and challenges associated with each voting scheme. Furthermore, we examine the applications proposed by previous studies in the context of these voting systems, assessing their strengths, limitations, and impact on democratic procedures. The cryptographic primitives reviewed include techniques like homomorphic encryption, blind signatures, and zero-knowledge proofs, which can enhance voter privacy, verifiability, and resistance to coercion and double voting.
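As a concrete illustration of the homomorphic-encryption primitive surveyed above, the toy Python sketch below tallies encrypted 0/1 ballots with exponential ElGamal. The parameters are deliberately tiny and insecure, and a real scheme would add zero-knowledge proofs that each ciphertext encrypts a valid vote; this is a sketch of the idea, not any of the reviewed systems.

```python
import random

# Toy additively homomorphic tally with exponential ElGamal (illustration only).
P = 2_147_483_647          # Mersenne prime 2^31 - 1
G = 7                      # primitive root modulo P

def keygen():
    x = random.randrange(2, P - 1)              # private key
    return x, pow(G, x, P)                      # (private, public)

def encrypt(pub, vote):                         # vote is 0 or 1
    r = random.randrange(2, P - 1)
    return pow(G, r, P), (pow(G, vote, P) * pow(pub, r, P)) % P

def add(c1, c2):                                # homomorphic addition = componentwise product
    return (c1[0] * c2[0]) % P, (c1[1] * c2[1]) % P

def decrypt_tally(priv, c, max_votes):
    g_m = (c[1] * pow(pow(c[0], priv, P), -1, P)) % P   # recovers g^(sum of votes)
    for m in range(max_votes + 1):                      # small discrete log by brute force
        if pow(G, m, P) == g_m:
            return m
    raise ValueError("tally out of range")

priv, pub = keygen()
ballots = [random.choice([0, 1]) for _ in range(25)]
total = None
for c in (encrypt(pub, b) for b in ballots):
    total = c if total is None else add(total, c)
assert decrypt_tally(priv, total, len(ballots)) == sum(ballots)
print("encrypted tally matches plaintext sum:", sum(ballots))
```

Only the aggregated ciphertext is ever decrypted, so individual ballots stay secret while the published total remains checkable.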
Item Embargo
Personalized trust incentive mechanisms with personality characteristics for minimum cost consensus in group decision making (Elsevier, 2025-01-23)
Xing, Yumei; Wu, Jian; Chiclana, Francisco; Wang, Sha; Zhu, Zhaoguang
Traditional group decision making usually forces inconsistent decision makers (namely, decision makers whose consensus degree does not reach a predefined level, or consensus threshold) to revise their opinions in order to improve the group consensus level. However, decision makers with conservative, neutral and radical behaviors differ in the extent to which they adjust their opinions. Hence, this paper investigates personalized trust incentive mechanisms with personality characteristics for minimum cost consensus in group decision making, covering both opinion incentives and trust incentives. Firstly, trust-driven personalized incentive mechanisms for personality characteristics, such as conservative, neutral and radical, are established to improve the adoption intention of decision makers. Then, a trust incentive evolution model with personality characteristics is established, revealing that the trust held by the remaining group members towards conservative decision makers is stable, the trust towards neutral decision makers is enhanced, and the trust towards radical decision makers is weakened. Further, a minimum cost consensus model based on the personalized trust incentive mechanism with personality characteristics is constructed to generate feedback opinions for these decision makers within their respective adjustment ranges, exploring how consensus efficiency is influenced by decision makers with different personality characteristics. Finally, an illustrative example on supplier selection in the cruise ship manufacturing industry is provided to demonstrate the rationality and superiority of the proposed method.

Item Open Access
A Machine Learning Method on a Tiny Hardware for Monitoring and Controlling a Hydroponic System (InTech Open Journals, 2025-01-24)
Sharma, Arpit; Taherkhani, Anahita; Orba, Ezekiel; Taherkhani, Aboozar
The implementation of artificial intelligence on very tiny chips plays an important role in the future of IoT. Generally, these chips do not perform artificial intelligence operations locally; they simply send collected data to a cloud, where the artificial intelligence resides, to make decisions. This leads to time lags and a strong dependency on the internet connection, making it unsuitable for systems that require immediate action. In a hydroponic system, the speed of a pump must be controlled immediately to regulate the pH level. However, designing an intelligent system on low-powered chips with limited computational capability is challenging, and achieving high AI accuracy on such devices is difficult. Additionally, these tiny devices need to communicate with the user to carry out IoT operations. To overcome these challenges, in this research a hydroponic system was designed around an ESP32-based microcontroller with attached sensors and actuators, performing AI on the edge and IoT tasks simultaneously. A dedicated Android app was implemented to monitor and control the system remotely via IoT. The results show that the predicted pump speed deviates from the expected speed by an average of only 2.94%. The overall designed system is stable and reliable. Komatsuna plants were grown in the hydroponic system and the yield was compared with plants grown in standard potting compost; the hydroponic system, monitored by the proposed method, produced a higher yield than the potting compost.
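A common route for putting such a model on an ESP32 is to train a very small network off-device and deploy it with TensorFlow Lite Micro. The sketch below follows that route with made-up features (pH and its recent drift), a synthetic control law and a hypothetical output file name; it is not the network or dataset used in the study.

```python
import numpy as np
import tensorflow as tf

# Hypothetical training data: pH reading and its recent drift -> pump speed (%).
# Values are synthetic stand-ins, not the study's measurements.
rng = np.random.default_rng(0)
ph    = rng.uniform(5.0, 7.5, size=(2000, 1))
drift = rng.uniform(-0.3, 0.3, size=(2000, 1))
X = np.hstack([ph, drift]).astype("float32")
y = np.clip((6.2 - ph) * 40 + drift * -20, 0, 100).astype("float32")  # toy control law

# A deliberately tiny network so the converted model fits comfortably on an ESP32.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(2,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

# Convert to TensorFlow Lite for deployment with TensorFlow Lite Micro on the ESP32.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
open("pump_speed.tflite", "wb").write(converter.convert())
```

Running inference on the device itself is what removes the cloud round-trip that the abstract identifies as the obstacle to immediate pump control.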
Item Embargo
A Novel Fuzzy Logic Framework for Model Reliability Evaluation in Permeability Prediction using GPR (IEEE, 2025)
Lawal, Ahmad; Yang, Yingjie; Baisa, Nathanael L.; He, Hongmei
Permeability is a critical parameter in reservoir engineering and hydrocarbon extraction, yet its prediction remains challenging due to inherent uncertainties in subsurface data. While Gaussian Process Regression (GPR) has proven effective in predicting permeability with associated uncertainties, it generates multiple metrics that are difficult to interpret, particularly in high-stakes environments. This study proposes a novel approach using fuzzy logic to compute a single, comprehensive metric that accounts for model reliability. Our method incorporates human input and reasoning into the modelling process, enhancing the model’s interpretability and its ability to handle uncertainty. Additionally, we introduce a new visualization technique to simplify the understanding of fuzzy logic outputs for non-technical stakeholders. The proposed methodology demonstrates that GPR achieves a higher reliability level (0.89) than traditional machine learning counterparts, which are typically neutral to uncertainties. By providing a comprehensive, transparent, and easily interpretable measure of model reliability, this approach significantly aids in making more informed and responsible decisions in reservoir management. Our framework represents a crucial step towards improving the practical application of advanced machine learning techniques in the oil and gas industry, potentially extending to other fields where uncertainty quantification is vital.

Item Metadata only
A Hybrid Trust Service Architecture for Cloud Computing (IEEE, 2014-06-19)
Yang, Zhongxue; Qin, Xiaolin; Yang, Yingjie; Yagnik, Tarjana
Trust service is a very important issue in cloud computing, and a cloud user needs a trust mechanism when selecting a reliable cloud service provider. Many trust technologies, such as SLAs, cloud audits, self-assessment questionnaires and accreditation, have been proposed by research organizations such as the CSA. However, these only establish initial trust and have many limitations. A hybrid trust service architecture for cloud computing is proposed in this paper, which primarily comprises two trust modules: an initial trust module and a trust-aided evaluation module. After initial, basic trust is established in the initial trust module, the trust-aided evaluation module is used to further verify the dependability of the service provider. D-S evidence theory and the Dirichlet distribution PDF are introduced to compute the trust degree value. The hybrid service architecture is more effective in selecting a reliable service provider and greatly improves computing efficiency.

Item Metadata only
Robustness of k-Anonymization Model in Compliance with General Data Protection Regulation (IEEE, 2023-03-28)
Abubakar, Ibrahim Bio; Yagnik, Tarjana; Mohammed, Kabiru
With the advancement of technology and the emergence of big data and the Internet of Things (IoT), individuals (data subjects) tend to suffer privacy breaches of various types, which have caused a great deal of damage to both data subjects and brands. These and other data privacy issues led the European Union to introduce much more stringent regulations to serve as a deterrent to businesses and organizations that handle data. This gave birth to the General Data Protection Regulation (GDPR) in 2018, which replaced the previous 1995 Data Protection Directive in Europe. This research examined the robustness of k-anonymity in compliance with the GDPR at varying k-values (5, 10, 50, and 100) using the 1994 USA Census Bureau data referred to as the Adult dataset. Various measures were used to determine which k-value meets the GDPR criteria, and the findings revealed the best anonymization threshold that complies with the GDPR criteria, taking into account information loss (which determines data utility), the prosecutor re-identification risk percentage, and the attacker models (prosecutor, journalist and marketer).
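For readers unfamiliar with the k-anonymity property being stress-tested, it is simple to verify directly: every combination of quasi-identifier values must occur in at least k records. A small pandas sketch with a made-up, already-generalised fragment (column names only loosely follow the Adult dataset):

```python
import pandas as pd

def is_k_anonymous(df, quasi_identifiers, k):
    """True if every quasi-identifier combination appears in at least k rows."""
    return df.groupby(quasi_identifiers).size().min() >= k

# Made-up fragment with generalised quasi-identifiers (age band, sex, ZIP prefix).
records = pd.DataFrame({
    "age_band":   ["30-39", "30-39", "30-39", "40-49", "40-49", "40-49"],
    "sex":        ["Male",  "Male",  "Male",  "Female", "Female", "Female"],
    "zip_prefix": ["191**", "191**", "191**", "208**",  "208**",  "208**"],
    "income":     ["<=50K", ">50K",  "<=50K", "<=50K",  ">50K",   "<=50K"],
})
qi = ["age_band", "sex", "zip_prefix"]
for k in (2, 3, 5):
    print(f"k={k}:", is_k_anonymous(records, qi, k))
```

Raising k shrinks the re-identification risk but forces coarser generalisation, which is exactly the utility-versus-risk trade-off the study measures against the GDPR criteria.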
Item Embargo
Multi-strategy grey wolf optimization algorithm for global optimization and engineering applications (Springer, 2024-11-06)
Wang, Likai; Zhang, Qingyang; Yang, Shengxiang; Dong, Yongquan
The grey wolf optimizer (GWO), a population-based meta-heuristic algorithm, mimics the predatory behavior of grey wolf packs. Continuously exploring and introducing improvement mechanisms is one of the keys to driving the development and application of GWO algorithms. To overcome premature convergence and stagnation in GWO, this paper proposes a multi-strategy grey wolf optimization algorithm (MSGWO). Firstly, a variable-weights strategy is proposed to improve the convergence rate by adjusting the weights dynamically. Secondly, a reverse learning strategy is proposed, which randomly reverses some individuals to improve global search ability. Thirdly, a chain predation strategy is designed to allow the search agent to be guided by both the best individual and the previous individual. Finally, a rotation predation strategy is proposed, which treats the position of the current best individual as the pivot and rotates other members around it to enhance exploitation ability. To verify the performance of the proposed technique, MSGWO is compared with seven state-of-the-art meta-heuristics and four GWO variants on the CEC2022 benchmark functions and three engineering optimization problems. The results demonstrate that MSGWO performs better on most benchmark functions and is competitive in solving engineering design problems.

Item Embargo
A constrained multimodal multi-objective evolutionary algorithm based on adaptive epsilon method and two-level selection (Elsevier, 2025-01-15)
Wang, Fengxia; Huang, Min; Yang, Shengxiang; Wang, Xingwei
Constrained multimodal multi-objective optimization problems (CMMOPs) commonly arise in practical problems in which multiple Pareto optimal sets (POSs) correspond to one Pareto optimal front (POF). The existence of constraints and multimodal characteristics makes it challenging to design effective algorithms that promote diversity in the decision space and convergence in the objective space. Therefore, this paper proposes a novel constrained multimodal multi-objective evolutionary algorithm, namely CM-MOEA, to address CMMOPs. In CM-MOEA, an adaptive epsilon-constrained method is designed to utilize promising infeasible solutions, promoting exploration in the search space. Then, a diversity-based offspring generation method is performed to select diverse solutions for mutation, searching for more equivalent POSs. Furthermore, a two-level environmental selection strategy that combines local and global environmental selection is developed to guarantee the diversity and convergence of solutions. Finally, we design an archive update strategy that stores well-distributed excellent solutions, which more effectively approach the true POF. The proposed CM-MOEA is compared with several state-of-the-art algorithms on 17 test problems. The experimental results demonstrate that the proposed CM-MOEA has significant advantages in solving CMMOPs.
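The adaptive epsilon-constrained idea builds on the standard ε-level comparison used in constrained evolutionary optimisation, in which slightly infeasible solutions are allowed to compete on their objectives. A generic sketch of that comparison is given below; the paper's adaptive rule for updating ε over the run is not reproduced here.

```python
def epsilon_dominates(a, b, eps):
    """Standard epsilon-constrained comparison for minimisation problems.

    a, b : dicts with 'objectives' (tuple to minimise) and 'violation' (>= 0, 0 = feasible).
    eps  : current epsilon level; solutions with violation <= eps are treated as feasible,
           which lets promising, slightly infeasible solutions keep competing.
    """
    va, vb = a["violation"], b["violation"]
    if (va <= eps and vb <= eps) or va == vb:
        fa, fb = a["objectives"], b["objectives"]
        better_or_equal = all(x <= y for x, y in zip(fa, fb))
        strictly_better = any(x < y for x, y in zip(fa, fb))
        return better_or_equal and strictly_better      # ordinary Pareto dominance
    return va < vb                                       # otherwise, the smaller violation wins

x = {"objectives": (0.2, 0.8), "violation": 0.01}
y = {"objectives": (0.3, 0.9), "violation": 0.00}
print(epsilon_dominates(x, y, eps=0.05))   # True: x counts as feasible and Pareto-dominates y
```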
Item Embargo
A novel bi-objective R-mathematical programming method for risk group decision making (Elsevier, 2025-01-10)
Tang, Guolin; Fu, Runqing; Seiti, Hamidreza; Chiclana, Francisco; Liu, Peide
Most risk-based multi-attribute group decision-making (R-MAGDM) frameworks often assume that attributes are independent and rarely consider the decision-maker’s (DM) psychological behaviours. However, in many cases, attributes tend to interact with each other, and DMs often display bounded rationality during the decision-making process. A new R-mathematical programming method is developed to address these issues by integrating R-sets, regret theory, the Banzhaf function, and the LINMAP method. Initially, a novel exp operation and a method for defuzzification of R-numbers are introduced, enabling the utilisation of R-numbers in decision-making problems. Subsequently, an R-utility function and an R-regret/rejoice function are defined to calculate the Banzhaf R-perceived utility of each alternative. Following this, R-group consistency (RGCI) and inconsistency indexes (RGII) are introduced for pair-wise rankings of alternatives. Furthermore, a bi-objective R-programming model is formulated to maximise RGCI and minimise RGII to identify the R-ideal solution and optimal weights of criteria and DMs. An optimisation algorithm utilising the non-dominated sorting genetic algorithm-II (NSGA-II) is proposed to solve the constructed model and obtain the non-dominated set. Four decision-making schemes are presented to determine the best trade-off solution from this non-dominated set. Finally, a numerical case is presented to demonstrate the proposed approach’s practicality, effectiveness, and superiority.

Item Embargo
Enhancing train travel time prediction for China–Europe railway express: A transfer learning-based fusion technique (Elsevier, 2024-11-28)
Guo, Jingwei; Guo, Jiayi; Fang, Lin; Chen, Zhen-Song; Chiclana, Francisco
Accurate train travel time (T-t) is crucial for the quality and reliability of rail transport services, particularly for China–Europe Railway Express (CRE), which occupies an important position in the global transportation network. Despite transfer learning being a useful technique to address the limited data in CRE train travel time prediction, it struggles with some insurmountable problems, such as the inability to handle seasonality and non-stationarity of data. Therefore, this paper proposes a novel fusion technique that combines transfer learning, wavelet transform, and meta-learning for predicting CRE travel time with a limited amount of sample data. Specifically, transfer learning is employed to overcome data limitations in constructing machine learning models for predicting CRE travel time. Meanwhile, a wavelet transform time series decomposition is designed to reveal hidden patterns in data and improve comprehensibility and predictability. For task decomposition, a multi-task meta-learning method is proposed that obtains the loss function gradient for each task and then updates model parameters to achieve the overall optimal structure. Lastly, a fusion technique model named WT_T.R2_MAML is developed to integrate the aforementioned functions. Through the analysis of actual operational data from the CRE trains, we have validated the successful integration of the WT_T.R2_MAML model. This achievement outlines a roadmap for the future implementation of fusion technologies.
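The wavelet-transform step of the fusion technique can be reproduced in a few lines with PyWavelets; the series below is a synthetic stand-in (seasonal pattern plus noise), since the CRE operational data are not reproduced here.

```python
import numpy as np
import pywt

# Synthetic stand-in for a travel-time series: seasonality plus noise.
rng = np.random.default_rng(1)
t = np.arange(256)
travel_time = 300 + 25 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 8, t.size)

# Multi-level discrete wavelet decomposition: one approximation (trend) band
# and several detail bands exposing seasonality and short-term fluctuations.
coeffs = pywt.wavedec(travel_time, wavelet="db4", level=3)
approx, *details = coeffs
print("approximation length:", approx.size,
      "| detail lengths:", [d.size for d in details])

# Each band can be reconstructed separately and fed to its own predictor,
# which is the usual way wavelet decomposition is paired with learning models.
trend_only = pywt.waverec([approx] + [np.zeros_like(d) for d in details], wavelet="db4")
```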
Item Embargo
A dynamic trust and prospect theory driven bilateral feedback mechanism for maximizing consensus income in social network group decision making (Elsevier, 2024-12-30)
Zhu, Zhaoguang; Zhang, Xiang; Cao, Mingshuo; Chiclana, Francisco; Wu, Jian
This article proposes a prospect theory-based bilateral feedback mechanism with dynamic trust to reach group consensus in social networks. A trust evolution model is developed based on the concept of a trust gap to reflect the dynamic changes in the trust relationships between DMs. The concept of a loss prospect threshold is then proposed, combining dynamic trust and the consensus index, to quantitatively describe the maximum acceptable psychological loss for DMs in each round of feedback. Additionally, two indexes are defined to study feedback behavior: the improvement of the consensus level as an income prospect and the preference adjustment as a loss prospect. Therefore, a bilateral feedback optimization model is constructed by maximizing the consensus income prospect under the limitation of the loss prospect threshold. To explore the role of dynamic trust and psychological behavior in the consensus-reaching process, three different feedback mechanisms are designed and compared with the proposed model, demonstrating that the proposed model can reduce preference adjustment costs and improve satisfaction with the final decision. A numerical example with sensitivity analysis of parameters is provided to illustrate the feasibility of the proposed model.

Item Embargo
A dynamic cost compensation mechanism driven by moderator preferences for group consensus in lending platforms (Springer, 2024-12-20)
Meng, Yanli; Wang, Li; Chiclana, Francisco; Yang, Haijun; Wang, Sha
The matching service provided by the lending platform (moderator) acts as a facilitative conduit for reaching a loan consensus, facilitating agreements among multiple lenders and borrowers (decision makers). Given that decision makers exhibit varying sensitivities to compensation expectations in response to opinion adjustment, the moderator’s choice of a preferred compensation mechanism determines the efficiency of the matching service. This article proposes a dynamic cost compensation mechanism driven by moderator preferences for group consensus in lending platforms. Firstly, a utility function describes the adjusters’ preferences, defining three unit cost compensation preferences: Power-type I, Power-type II and right-partial S-shaped preferences. Subsequently, we construct a generalized dynamic minimum-cost consensus decision model to determine the optimal unit compensation strategies within the opinion interval delineated by the moderator. To address the fairness concerns that may arise from fluctuations in unit compensation costs, we enforce the fairness of the compensation strategy by incorporating the Gini coefficient as a constraint within the consensus model. To validate the effectiveness and applicability of the proposed models, we apply them to online lending using data obtained from an online peer-to-peer lending platform.
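The fairness constraint rests on the Gini coefficient of the compensation amounts; a minimal sketch of how such a constraint could be evaluated, with made-up compensation figures and an illustrative threshold rather than values from the paper:

```python
def gini(amounts):
    """Gini coefficient of non-negative compensation amounts (0 = perfectly equal)."""
    xs = sorted(amounts)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula based on the rank-ordered values.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

compensation = [12.0, 15.0, 14.0, 40.0, 13.0]   # illustrative per-lender compensations
g = gini(compensation)
print(round(g, 3), "within fairness bound" if g <= 0.35 else "violates fairness constraint")
```

Inside an optimisation model, the same quantity would appear as a constraint (Gini of the compensation vector below a chosen bound) rather than an after-the-fact check.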
Item Open Access
A Trust Incentive Driven Feedback Mechanism With Risk Attitude for Group Consensus in Social Networks (IEEE, 2025-01-01)
Ji, Feixia; Wu, Jian; Chiclana, Francisco; Sun, Qi; Herrera-Viedma, Enrique
Trust relationships can facilitate cooperation in collective decisions. Using behavioral incentives via trust to encourage voluntary preference adjustments improves consensus through mutual agreement. This article aims to establish a trust incentive-driven framework for enabling consensus in social network group decision making (SN-GDM). First, a trust incentive mechanism is modeled via interactive trust functions that integrate risk attitude. The inclusion of risk attitude is crucial as it reflects the diverse ways decision makers (DMs) respond to uncertainty in trusting others’ judgments, capturing the varied behaviors of risky, neutral, and insurance DMs in the consensus process. Inconsistent DMs then adjust opinions in exchange for heightened trust. This mechanism enhances the importance degrees via a new weight assignment method, serving as a reward to motivate DMs to further align with the majority. Subsequently, a trust incentive-driven bounded maximum consensus model is proposed to optimize cooperation dynamics while preventing over-compensation of adjustments. Simulations and comparative analysis demonstrate the model’s efficacy in facilitating cooperation through tailored trust incentive mechanisms that account for these diverse risk preferences. Finally, the approach is applied to evaluate candidates for the Norden Shipping Scholarship, providing a cooperation-focused SN-GDM framework for achieving mutually agreeable solutions while acknowledging the impact of individual risk attitude on trust-based interactions.

Item Embargo
A self-esteem driven feedback mechanism with diverse power structures to prevent strategic manipulation in social network group decision making (Elsevier, 2025-01-02)
Sun, Qi; Zhang, Xiang; Chiclana, Francisco; Ji, Feixia; Long, Qingqi; Wu, Jian
In social network group decision-making (SNGDM), the distribution of power structures and strategic manipulation behaviors pose challenges to the fairness and efficiency of the decision-making process. This paper introduces a novel consensus theoretical framework, specifically designed for analyzing power structures and preventing strategic manipulation behavior in SNGDM. It proposes a centrality measures-based influence index and a structural holes and graph density-based power index, respectively, to identify opinion leaders and power dynamics of subgroups in social trust networks. Then, a maximum entropy-based model is presented to explore power dynamics for preference aggregation in SNGDM. Furthermore, this paper introduces a feedback model based on the boundary maximum consensus degree, addressing issues that existing consensus methods tend to overlook, including the self-esteem of decision-makers and the risks of manipulation behavior. The model considers the self-esteem of subgroups when adjusting preferences, aiming to prevent potential strategic manipulation and enhance the fairness and efficiency of decision-making. Finally, thorough numerical evaluations and comparative assessments have been conducted to substantiate the effectiveness of the proposed methodology. Experiment results show that concentrated power can speed up consensus formation but may harm fairness, while dispersed power, although it slows consensus, increases participation and diversity, reducing the risk of power abuse.

Item Metadata only
Characterising Payload Entropy in Packet Flows—Baseline Entropy Analysis for Network Anomaly Detection (MDPI, 2024-12-16)
Kenyon, Anthony; Deka, Lipika; Elizondo, David
The accurate and timely detection of cyber threats is critical to keeping our online economy and data safe. A key technique in early detection is the classification of unusual patterns of network behaviour, often hidden as low-frequency events within complex time-series packet flows. One of the ways in which such anomalies can be detected is to analyse the information entropy of the payload within individual packets, since changes in entropy can often indicate suspicious activity—such as whether session encryption has been compromised, or whether a plaintext channel has been co-opted as a covert channel. To decide whether activity is anomalous, we need to compare real-time entropy values with baseline values, and while the analysis of entropy in packet data is not particularly new, to the best of our knowledge, there are no published baselines for payload entropy across commonly used network services. We offer two contributions: (1) we analyse several large packet datasets to establish baseline payload information entropy values for standard network services, and (2) we present an efficient method for engineering entropy metrics from packet flows from real-time and offline packet data. Such entropy metrics can be included within feature subsets, thus making the feature set richer for subsequent analysis and machine learning applications.
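The per-packet measurement behind these baselines is the Shannon entropy of the payload byte distribution, which separates plaintext-like traffic from encrypted or compressed traffic. A short sketch with hard-coded stand-in payloads rather than captures from the datasets analysed in the paper:

```python
import math
import os
from collections import Counter

def payload_entropy(payload: bytes) -> float:
    """Shannon entropy of a packet payload in bits per byte (0.0 to 8.0)."""
    if not payload:
        return 0.0
    n = len(payload)
    counts = Counter(payload)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Plaintext-like payload versus a random (encrypted/compressed-looking) payload.
text_like = b"GET /index.html HTTP/1.1\r\nHost: example.org\r\n\r\n" * 20
random_like = os.urandom(1000)
print(round(payload_entropy(text_like), 2))    # plaintext protocols typically sit well below 8
print(round(payload_entropy(random_like), 2))  # random data sits close to 8 bits/byte
```

Comparing such per-packet (or per-flow) values against a service-specific baseline is what turns the raw entropy number into an anomaly signal.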
Item Metadata only
Natural Language Processing Tools and Workflows for Improving Research Processes (MDPI, 2024-12-16)
Khan, Noel; Elizondo, David; Deka, Lipika; Molina-Cabello, Miguel
The modern research process involves refining a set of keywords until sufficiently pertinent results are obtained from acceptable sources. References and citations from the most relevant results can then be traced to related works. This process iteratively develops a set of keywords to find the most relevant literature. However, because a keyword-based search essentially samples a corpus, it may be inadequate for capturing a broad or exhaustive understanding of a topic. Further, a keyword-based search is dependent upon the underlying storage and retrieval technology and is essentially a syntactical search rather than a semantic search. To overcome such limitations, this paper explores the use of well-known natural language processing (NLP) techniques to support a semantic search and identifies where specific NLP techniques can be employed and what their primary benefits are, thus enhancing the opportunities to further improve the research process. The proposed NLP methods were tested through different workflows on different datasets and each workflow was designed to exploit latent relationships within the data to refine the keywords. The results of these tests demonstrated an improvement in the identified literature when compared to the literature extracted from the end-user-given keywords. For example, one of the defined workflows reduced the number of search results by two orders of magnitude but contained a larger percentage of pertinent results.
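A minimal version of the keyword-refinement idea can be built from TF-IDF alone: score documents against the seed keywords, then harvest the strongest terms of the best match as candidate new keywords. The sketch below uses placeholder documents and a placeholder query; the paper's workflows and datasets are considerably richer than this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder corpus and seed keywords for illustration only.
documents = [
    "anomaly detection in network traffic using payload entropy features",
    "semantic search over research literature with word embeddings",
    "keyword expansion and query refinement for literature review workflows",
    "deep learning for image classification benchmarks",
]
seed_query = "keyword based literature search refinement"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)
query_vec = vectorizer.transform([seed_query])

# Rank documents by similarity to the seed keywords.
scores = cosine_similarity(query_vec, doc_matrix).ravel()
best = scores.argmax()

# Harvest the highest-weighted terms of the best match as candidate new keywords.
terms = vectorizer.get_feature_names_out()
row = doc_matrix[best].toarray().ravel()
new_keywords = [terms[i] for i in row.argsort()[::-1][:5]]
print(documents[best])
print(new_keywords)
```

Iterating this loop (search, rank, harvest terms, search again) is a syntactic approximation of the semantic workflows the paper evaluates with richer NLP techniques.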