School of Computer Science and Informatics
Browsing School of Computer Science and Informatics by Type "Article"
Now showing 1 - 20 of 2014
Item Embargo: 20 years of ETHICOMP: time to celebrate? (Elsevier, 2015-08-10). Stahl, Bernd Carsten, 1968-; Ess, C. M.
Purpose – The purpose of this paper is to introduce the special issue by providing background on the ETHICOMP conference series and discussing its role in the academic debate on ethics and computing. It describes the context that influenced the launch of the conference series, highlights its unique features, and gives an overview of the papers in the special issue.
Design/methodology/approach – The paper combines a historical account of ETHICOMP with a review of the papers in this issue.
Findings – ETHICOMP is one of the well-established conference series (alongside IACAP and CEPE) focused on the ethical issues of information and computing. Its special features include: multidisciplinarity and diversity of contributors and contributions; explicit outreach to professionals whose work is to design, build, deploy and maintain specific computing applications in the world at large; creation of knowledge that is accessible and relevant across fields and disciplines; the intention of making a practical difference to the development, use and policy of computing principles and artefacts; and the creation of an inclusive, supportive and nurturing community across traditional knowledge silos.
Originality/value – The paper is the first to explicitly define the nature of ETHICOMP, which is an important building block in the future development of the conference series and will contribute to the further self-definition of the ETHICOMP community.
Keywords: Ethics, Computer ethics, Computer science. Paper type: Viewpoint.

Item Metadata only: 2022 Index, IEEE Transactions on Artificial Intelligence Vol.
3 (IEEE, 2022-12-13). Aafaq, N.; Elizondo, David

Item Open Access: 3D fast convex-hull-based evolutionary multiobjective optimization algorithm (Elsevier, 2018-06). Zhao, Jiaqi; Jiao, Licheng; Liu, Fang; Fernandes, Vitor Basto; Yevseyeva, Iryna; Xia, Shixong; Emmerich, Michael T. M.
The receiver operating characteristic (ROC) and detection error tradeoff (DET) curves are widely used in the machine learning community to analyze classifier performance. The area (or volume) under the convex hull serves as a scalar indicator of the performance of a set of classifiers in ROC and DET space. Recently, the 3D convex-hull-based evolutionary multiobjective optimization algorithm (3DCH-EMOA) was proposed to maximize the volume of the convex hull for binary classification combined with parsimony, and for three-way classification problems. However, 3DCH-EMOA consumes substantial computational resources due to redundant convex hull calculations and frequent execution of non-dominated sorting. In this paper, we introduce incremental convex hull calculation and a fast replacement for non-dominated sorting. While achieving results of the same high quality, the computational effort of 3DCH-EMOA can be reduced by orders of magnitude: the average time complexity per generation drops from O(n² log n) to O(n log n), where n is the population size. Six test problems are used to evaluate the new method, and the algorithms are compared to several state-of-the-art algorithms, including NSGA-III and RVEA, which were not previously compared to 3DCH-EMOA. Experimental results show that the new version of the algorithm (3DFCH-EMOA) speeds up 3DCH-EMOA by about 30 times for a typical population size of 300 without reducing the method's performance.
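As an illustrative aside (not the paper's incremental algorithm): the non-dominated sorting that 3DFCH-EMOA accelerates starts from the basic Pareto-dominance test, which a naive O(n²) filter applies directly:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimising every objective)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Naive O(n^2) filter returning the non-dominated subset of points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

The paper's speed-up comes precisely from avoiding repeated full passes of this kind on every generation.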
In addition, the proposed algorithm is applied to neural network pruning, and several UCI datasets are used to test its performance.

Item Open Access: 3D non-invasive inspection of the skin lesions by close-range and low-cost photogrammetric techniques (International Society for Stereology & Image Analysis, 2017). Orun, A.; Goodyer, E. N.; Smith, Geoff
In dermatology, one of the most common causes of skin abnormality is an unusual change in skin lesion structure, which may exhibit very subtle physical deformation of its 3D shape. However, the geometric sensitivity of current cost-effective inspection and measurement methods may not be sufficient to detect such small progressive changes in skin lesion structure at the micro scale. Our proposed method provides a low-cost, non-invasive, compact solution that overcomes these shortcomings by using close-range photogrammetric imaging techniques to build a 3D surface model for continuous observation of subtle changes in skin lesions and other features.

Item Embargo: 3D-MSFC: A 3D multi-scale features compression method for object detection (Elsevier, 2024-11-17). Li, Zhengxin; Tian, Chongzhen; Yuan, Hui; Lu, Xin; Malekmohamadi, Hossein
As machine vision tasks rapidly evolve, a new concept of compression, namely video coding for machines (VCM), has emerged. However, current VCM methods are only suitable for 2D machine vision tasks. With the popularization of autonomous driving, the demand for 3D machine vision tasks has significantly increased, leading to explosive growth in LiDAR data that requires efficient transmission. To address this need, we propose a machine vision-based point cloud coding paradigm inspired by VCM. Specifically, we introduce a 3D multi-scale features compression (3D-MSFC) method tailored for 3D object detection. Experimental results demonstrate that 3D-MSFC achieves less than a 3% degradation in object detection accuracy at a compression ratio of 2796×.
Furthermore, its low-profile variant, 3D-MSFC-L, achieves less than a 2% degradation in accuracy at a compression ratio of 463×. These results indicate that the proposed method provides an ultra-high compression ratio with no significant drop in accuracy, greatly reducing the amount of data transmitted during each detection. This can significantly lower bandwidth consumption and save substantial costs in application scenarios such as smart cities.

Item Embargo: A 6(4) optimized embedded Runge–Kutta–Nyström pair for the numerical solution of periodic problems (Elsevier, 2014-07-30). Anastassi, Zacharias; Kosti, Athinoula A.
This paper presents an optimization of the six-stage, non-FSAL embedded RKN 6(4) pair of Moawwad El-Mikkawy and El-Desouky Rahmo. The new method is derived by applying phase-fitting and amplification-fitting and has variable coefficients. The preservation of the algebraic order is verified and the principal term of the local truncation error is evaluated. Furthermore, periodicity analysis reveals that the new method is "almost" P-stable. The efficiency of the new method is measured via the integration of several initial value problems.

Item Open Access: 9 Squares: Framing Data Privacy Issues (Winchester University Press, 2017-04). Boiten, Eerke Albert
To frame discussions of data privacy in varied contexts, this paper introduces a categorisation of personal data along two dimensions. Each of the nine resulting categories offers a significantly different flavour of data privacy issues, and some issues can be perceived as tensions along the boundaries between categories. The first dimension is data ownership: who holds or publishes the data. The three possibilities are "me", i.e. the data subject; "us", where the data subject is part of a community; and "them", where the data subject is indeed a subject only.
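The abstract's two three-way dimensions (data ownership, and the type of personal data recorded) combine into the nine squares; a tiny sketch of the grid, with labels as named in the abstract:

```python
# The two 3-way dimensions of the "9 Squares" categorisation.
OWNERSHIP = ("me", "us", "them")                      # who holds/publishes the data
DATA_TYPE = ("attributes", "stories", "behaviours")   # what kind of data is recorded

def square(ownership, data_type):
    """Locate a (row, column) cell in the 3x3 privacy grid."""
    return OWNERSHIP.index(ownership), DATA_TYPE.index(data_type)
```

For example, a social-network post would sit in the ("us", "stories") square.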
The middle category contains social networks as its most interesting instance. The amount of control the data subject has moves from complete control in the "me" category to very little in the "them" square, although the other dimension also plays a role. The second dimension also has three possibilities, focusing on the type of personal data recorded: "attributes" are what would traditionally be found in databases, and what one might think of first for "data protection". The second type is "stories": personal data (explicitly) produced by the data subjects, such as emails, pictures, and social network posts. The final type is "behaviours": (implicitly) generated personal data, such as locations and browsing histories. The data subject has very little control over this data, even in the "us" category. This lack of control, which is closely related to the business models of the "us" category, is likely the major data privacy problem of our time.

Item Embargo: A bi-objective low-carbon economic scheduling method for cogeneration system considering carbon capture and demand response (Elsevier, 2023-12-14). Pang, Xinfu; Wang, Yibao; Yang, Shengxiang; Cai, Lei; Yu, Yang
Carbon capture and storage (CCS), energy storage (ES), and demand response (DR) mechanisms are introduced into a cogeneration system to enhance its ability to absorb wind energy, reduce carbon emissions, and improve operational efficiency. First, a bi-objective low-carbon economic scheduling model of a cogeneration system considering CCS, ES, and DR was developed. In this model, the ES and CCS decouple power generation from heating. The DR mechanism, based on the time-of-use electricity price and heating comfort, further enhances the flexibility of the system. In addition, an improved bare-bones multi-objective particle swarm optimisation (IBBMOPSO) algorithm was designed to directly obtain the Pareto front of the low-carbon economic scheduling model.
The particle position update mode was improved to balance global and local search capabilities across the search stages. The Taguchi method was used to calibrate the algorithm parameters. The inverted generational distance (IGD), hypervolume (HV), and maximum spread (MS) were used to evaluate the distribution and convergence of the algorithm, and the improved technique for order preference by similarity to an ideal solution (TOPSIS) was used to obtain the best compromise solution. Finally, the proposed method was tested on a cogeneration system in Northeast China. According to the comparison results, the average economic cost of the cogeneration system considering CCS, ES, and DR was reduced by approximately 1.13%, and carbon emissions were reduced by 6.79%. IBBMOPSO is more competitive than NSGA-II, MOWDO, MOMA, MOPSO, and BBMOPSO in low-carbon economic scheduling for the cogeneration system.

Item Embargo: A bilateral negotiation mechanism by dynamic harmony threshold for group consensus decision making (Elsevier, 2024-03-16). Cao, Mingshuo; Chiclana, Francisco; Liu, Yujia; Wu, Jian; Herrera-Viedma, Enrique
This article proposes a bilateral negotiation framework for the case in which a concordant decision-makers (DMs) coalition cannot be constructed, resolving a limitation of existing group decision making methods. Bilateral negotiation is a process in which any two involved DMs change their own opinions based on each other's opinions, avoiding the formation of group coalitions and the coercion of individual DMs. It not only improves group consensus through interaction between individual DMs, but also considers the limited compromise behavior of DMs in the consensus bargaining process.
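The abstract does not give the opinion update rule; a hypothetical single round, in which each DM moves a bounded fraction (a stand-in for its harmony-threshold compromise limit) toward the other's opinion, could look like:

```python
def negotiate(o_i, o_j, lam_i, lam_j):
    """One bilateral negotiation round between two discordant DMs.

    o_i, o_j: current opinions (e.g. in [0, 1]); lam_i, lam_j: each DM's
    compromise limit in [0, 1] (a hypothetical stand-in for the paper's
    personalized harmony-threshold constraint).
    """
    new_i = o_i + lam_i * (o_j - o_i)   # DM i partially adopts DM j's opinion
    new_j = o_j + lam_j * (o_i - o_j)   # and vice versa
    return new_i, new_j
```

With unequal compromise limits the pair converges toward, but need not reach, a common opinion, which matches the "limited compromise" framing above.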
The key contributions of this article are: (1) it investigates the concept of a 'harmony threshold', combining the consensus levels of individual DMs with the number of group members, to explain the limited compromise behavior of DMs; (2) it proposes a novel bilateral negotiation consensus mechanism with personalized compromise behavior, with the group consensus threshold as the objective function and personalized harmony thresholds as constraints, to help any two discordant DMs partially adopt each other's opinions; and (3) it develops the ranking difference level (RDL) to measure the deviation between the final ranking of alternatives and all the DMs' original rankings. The research found that the proposed mechanism can reduce consensus cost by 40% and ranking difference by 5%.

Item Embargo: A cluster prediction strategy with the induced mutation for dynamic multi-objective optimization (Elsevier, 2024-01-25). Xu, Kangyu; Xia, Yizhang; Zou, Juan; Hou, Zhanglu; Yang, Shengxiang; Hu, Yaru; Liu, Yuan
Dynamic multi-objective optimization problems (DMOPs) are multi-objective optimization problems in which at least one objective and/or related parameter varies over time. The challenge in solving DMOPs is to efficiently and accurately track the true Pareto-optimal set when the environment changes. However, many existing prediction-based methods overlook the distinct movement directions of individuals and the information available in the objective space, leading to biased predictions that mislead the subsequent search. To address this issue, this paper proposes a prediction method called IMDMOEA, which relies on cluster center points and induced mutation. Specifically, linear prediction based on cluster center points in the decision space enables the algorithm to rapidly capture the population's evolutionary direction and distributional shape.
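One hedged reading of this linear, centre-based prediction is a first-order extrapolation of each cluster centre's last move (a sketch, not the paper's exact predictor):

```python
def predict_center(center_prev, center_curr):
    """First-order linear prediction of a cluster centre's next position:
    next = current + (current - previous), applied coordinate-wise."""
    return [c + (c - p) for p, c in zip(center_prev, center_curr)]
```

Each niche's population would then be translated by the same predicted shift after an environmental change.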
Additionally, to enhance the algorithm's adaptability to significant environmental changes, the induced mutation strategy corrects the population's evolutionary direction by selecting promising individuals for mutation based on the predicted Pareto front in the objective space. These two complementary strategies enable the algorithm to respond faster and more effectively to environmental changes. Finally, the proposed algorithm is evaluated on the JY, dMOP, FDA, and F test suites. The experimental results demonstrate that IMDMOEA competes favorably with other state-of-the-art algorithms.

Item Open Access: A coevolutionary algorithm with detection and supervision strategy for constrained multiobjective optimization (IEEE, 2024-06-19). Feng, Jian; Liu, Shaoning; Yang, Shengxiang; Zheng, Jun; Xiao, Qi
Balancing objectives and constraints is challenging in constrained multiobjective optimization problems (CMOPs). Existing methods may struggle with the variety of CMOPs because of the complex geometries of the Pareto front (PF), a complexity that arises from constraints narrowing the feasible region. Categorizing problems by their geometric characteristics helps address this challenge. To this end, this article proposes a novel constrained multiobjective optimization framework with detection and supervision phases, called COEA-DAS. The framework categorizes problems into four types, based on the overlap between the obtained approximate unconstrained PF and the constrained PF, to guide the coevolution of two populations. In the detection phase, the detection population approaches the unconstrained PF, ignoring the constraints, and guides the main population to cross infeasible barriers and approximate the constrained PF. In the supervision phase, specialized evolutionary mechanisms are designed for each possible problem type.
The detection population continues to evolve, assisting the main population in spreading along the constrained PF. Meanwhile, the supervision strategy re-evaluates the problem type based on the evolutionary state of the populations. This idea of balancing constraints and objectives according to problem type provides a novel approach for addressing CMOPs more effectively. Experimental results indicate that the proposed algorithm performs better than, or competitively with, eight state-of-the-art algorithms on 57 benchmark problems and 12 real-world CMOPs.

Item Open Access: A Confidence and Conflict-based Consensus Reaching Process for Large-scale Group Decision Making Problems with Intuitionistic Fuzzy Representations (IEEE, 2024-03-07). Ding, Ru-Xi; Yang, Bing; Yang, Guo-Rui; Li, Meng-Nan; Wang, Xueqing; Chiclana, Francisco
With the development of social democracy, the public has begun to participate in large-scale group decision making (LSGDM) events that significantly affect their personal interests. However, the participation of members of the public with insufficient expertise introduces considerable hesitancy into the evaluations of decision makers (DMs), which can be captured by intuitionistic fuzzy sets. Meanwhile, as the number of DMs grows, the cost of the consensus reaching processes (CRPs) used to help DMs reach a consensus grows ever higher. To improve the efficiency of the CRP, this paper presents a confidence and conflict-based consensus reaching process (CC-CRP) for LSGDM events with intuitionistic fuzzy representations. In the proposed model, an objective method is first developed to calculate the confidence level of DMs from the hesitancy of their intuitionistic fuzzy evaluations, requiring no extra information. Then, a three-dimensional clustering method is designed that considers the type of conflict, the degree of conflict, and the confidence level of DMs.
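The confidence idea above rests on the hesitancy degree of an intuitionistic fuzzy value, π = 1 − μ − ν. The paper's exact formula is not given in this abstract, so the aggregation below is a hypothetical sketch: confidence as one minus the mean hesitancy of a DM's evaluations.

```python
def hesitancy(mu, nu):
    """Hesitancy degree of an intuitionistic fuzzy value (requires mu + nu <= 1)."""
    return 1.0 - mu - nu

def confidence(evaluations):
    """Illustrative confidence score for a DM: 1 minus the mean hesitancy
    of their (mu, nu) evaluations (a hypothetical aggregation)."""
    pis = [hesitancy(mu, nu) for mu, nu in evaluations]
    return 1.0 - sum(pis) / len(pis)
```

A DM whose memberships and non-memberships nearly sum to one everywhere would thus get a confidence close to 1.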
After this, an efficiency rate of modification is defined to select the DMs who will be persuaded first to adjust their evaluations, with recommendation plans generated by a specific optimization method. Finally, according to the clustering results, different CC-CRP management methods are applied to DMs with different attributes. An illustrative example and several experiments provide evidence that the proposed model is feasible and effective.

Item Open Access: A decomposition-based evolutionary algorithm with clustering and hierarchical estimation for multi-objective fuzzy flexible jobshop scheduling (IEEE, 2024-01-26). Zhang, Xuwei; Liu, Shixin; Zhao, Ziyan; Yang, Shengxiang
As effective approximation algorithms for multi-objective jobshop scheduling, multi-objective evolutionary algorithms (MOEAs) have received extensive attention. However, maintaining a balance between the diversity and convergence of non-dominated solutions while ensuring overall convergence is an open problem when solving Multi-objective Fuzzy Flexible Jobshop Scheduling Problems (MFFJSPs). To address it, we propose a new MOEA, named MOEA/DCH, which introduces a hierarchical estimation method, a clustering-based adaptive decomposition strategy, and a heuristic-based initialization method into a basic decomposition-based MOEA. Specifically, the hierarchical estimation method balances the convergence and diversity of non-dominated solutions by integrating Pareto dominance and scalarization function information; the clustering-based adaptive decomposition strategy enhances the population's ability to approximate a complex Pareto front; and the heuristic-based initialization method provides high-quality initial solutions. The performance of MOEA/DCH is verified against five competitive MOEAs on widely used benchmark datasets.
Empirical results demonstrate the effectiveness of MOEA/DCH in balancing the diversity and convergence of non-dominated solutions while ensuring overall convergence.

Item Metadata only: A dense memory representation using bitmap data structure for improving NDN push-traffic model (Springer, 2023-07-24). Sallam, Amer; Aklan, Noran; Aklan, Norhan; Rassem, Taha H.
The exponential growth of the Internet demands new technologies and protocols that can handle its requirements efficiently. This growth has enabled many new services with sophisticated requirements that go beyond the capabilities of the TCP/IP host-centric model and increase its complexity. Researchers have proposed Named-Data Networking (NDN), an Information-Centric Networking (ICN) architecture based on a strictly pull-based model, as an alternative to TCP/IP, and it has gained significant attention in the research community. However, this model still suffers from a looped data redundancy problem, which may lead to frequent link failures when dealing with real-time streaming, due to persistent interest packets. In this paper, a push-based model together with a bitmap algorithm is proposed to improve ICN efficiency by eliminating these problems. The model was evaluated through extensive simulations.
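The bitmap itself is a standard dense-set representation; a minimal sketch (a hypothetical API, not the paper's exact design) of marking delivered segments so they are not pushed redundantly:

```python
class SegmentBitmap:
    """Dense bitmap recording which data segments have been delivered,
    so already-delivered segments are not pushed again (illustrative)."""

    def __init__(self, n_segments):
        self.bits = bytearray((n_segments + 7) // 8)  # one bit per segment

    def mark(self, i):
        self.bits[i // 8] |= 1 << (i % 8)

    def is_marked(self, i):
        return bool(self.bits[i // 8] & (1 << (i % 8)))
```

The appeal of the dense representation is its memory footprint: one bit per segment, regardless of delivery order.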
The experimental results demonstrate the model's feasibility: it prevents most of the data redundancy and mitigates frequent link failures.

Item Embargo: A dual-population coevolutionary algorithm for balancing convergence and diversity in the decision space in multimodal multi-objective optimization (Elsevier, 2024-05-31). Li, Zhipan; Rong, Huigui; Yang, Shengxiang; Yang, Xu; Huang, Yupeng
Many multimodal multi-objective evolutionary algorithms (MMEAs) are effective at solving multimodal multi-objective problems (MMOPs), which have multiple equivalent Pareto optimal sets (PSs) mapping to the same Pareto optimal front (PF). However, because of their global convergence-first mechanisms, these MMEAs tend to remove solutions that improve diversity in the decision space but converge poorly, and can even lose entire PSs, when facing MMOPs with an imbalance between convergence and diversity in the decision space (MMOP-ICD) or MMOPs with a local PS (MMOPL). We propose a new dual-population coevolutionary algorithm to address these issues. The auxiliary population helps the main population locate areas where equivalent PSs may exist, while the main population focuses on balancing convergence and diversity in the decision space. When updating the auxiliary population, a strength local convergence quality (SLCQ) measure is used to explore the distribution of the equivalent PSs. When updating the main population, a new niche-based truncation strategy first deletes the solutions that contribute least to convergence, and a distance-based subset selection method then balances diversity between the decision and objective spaces.
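The distance-based subset selection step can be illustrated with a greedy max-min-distance heuristic (an illustrative stand-in, since the abstract does not specify the exact procedure):

```python
import math

def maxmin_subset(points, k):
    """Greedy distance-based subset selection: keep the first point, then
    repeatedly add the candidate farthest from the already-chosen set."""
    chosen = [points[0]]
    rest = list(points[1:])
    while len(chosen) < k and rest:
        # The candidate maximising its minimum distance to the chosen set.
        far = max(rest, key=lambda p: min(math.dist(p, c) for c in chosen))
        chosen.append(far)
        rest.remove(far)
    return chosen
```

Greedy max-min selection favours well-spread solutions, which is the diversity effect the truncation strategy above aims for.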
The comparison results show that the overall performance of the proposed algorithm is significantly better than that of other state-of-the-art algorithms.

Item Metadata only: A dynamic multi-objective evolutionary algorithm based on Niche prediction strategy (Elsevier, 2023-07-10). Zheng, Jinhua; Zhang, Bo; Zou, Juan; Yang, Shengxiang; Hu, Yaru
In reality, many multi-objective optimization problems are dynamic: the Pareto optimal front (PF) or Pareto optimal solution set (PS) of these dynamic multi-objective problems (DMOPs) changes as the environment changes. Solving such problems therefore requires an optimization algorithm that can quickly track the PF or PS after an environmental change. Prediction-based response mechanisms, commonly based on center points, are a standard way to deal with environmental changes. However, if the predicted direction of the center point is inaccurate, the predicted population will be biased towards one side. In this paper, we propose a niche prediction strategy based on center and boundary points (PCPB) to solve DMOPs, consisting of three steps. After an environmental change is detected, the first step divides the population into niches, assigning different individuals in the PS to different niche populations. The second step predicts each niche independently, selecting individuals with good convergence and distribution within the niche to generate the next generation. Finally, some individuals are randomly generated in the region where the next PS may lie, to ensure the diversity of the population. To verify whether the proposed strategy is effective and competitive, PCPB was compared with five state-of-the-art strategies.
The experimental results show that PCPB performs competitively in solving dynamic multi-objective optimization problems.

Item Open Access: A dynamic-niching-based Pareto domination for multimodal multiobjective optimization (IEEE, 2023-09-18). Zou, Juan; Deng, Qi; Liu, Yuan; Yang, Xinjie; Yang, Shengxiang; Zheng, Jinhua
Maintaining the diversity of the decision space is of great significance in multimodal multiobjective optimization problems (MMOPs). Since traditional Pareto-dominance-based algorithms prioritize the convergence of individuals through Pareto-dominated sorting, a large number of well-distributed individuals can be dominated by other well-converged individuals during the optimization of MMOPs. To solve this problem, we propose a dynamic-niching-based Pareto domination, called DNPD, which adds a dynamic niche that constrains traditional Pareto domination, balancing the convergence and diversity of the population in the decision space. In the early stage of the algorithm, the smaller niche lets the algorithm retain a large number of well-distributed individuals; in the later stage, the dynamically enlarged niche accelerates the convergence of the population. DNPD can be integrated into Pareto-dominance-based algorithms to solve MMOPs. Experimental results on the MMF and IDMP benchmark series show that algorithms combined with DNPD perform well compared with their original versions.

Item Open Access: A fusion of machine learning and cryptography for fast data encryption through the encoding of high and moderate plaintext information blocks (Springer, 2024-04-04). Rehman, Mujeeb Ur; Shafique, Arslan; Mehmood, Abid; Alawida, Moatsum; Elhadef, Mourad
Within the domain of image encryption, there is an intrinsic trade-off between computational complexity and the security of data transmission.
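The selective-encryption idea behind this trade-off can be sketched with a simple variance proxy for a block's information content (the paper trains an SVM for this classification step; the proxy and thresholds below are hypothetical):

```python
def block_information(block):
    """Variance of a pixel block, used here as a simple stand-in for
    information content (the paper uses an SVM classifier instead)."""
    flat = [v for row in block for v in row]
    mean = sum(flat) / len(flat)
    return sum((v - mean) ** 2 for v in flat) / len(flat)

def classify_block(block, lo=50.0, hi=500.0):
    """Label a block 'low', 'moderate' or 'high' information; only the
    latter two would be encrypted (thresholds are hypothetical)."""
    v = block_information(block)
    return "low" if v < lo else "moderate" if v < hi else "high"
```

Skipping the low-information blocks is what buys the reduction in encryption time.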
Protecting digital images often requires extensive mathematical operations for robust security, but this computational burden makes real-time applications infeasible. The proposed research addresses this challenge by leveraging machine learning to optimize efficiency while maintaining high security. The methodology categorizes image pixel blocks into three classes, high-information, moderate-information, and low-information, using a support vector machine (SVM). Encryption is applied selectively to high- and moderate-information blocks, leaving low-information blocks untouched and significantly reducing computation time. The machine learning component is evaluated with precision, recall, and F1-score, and security is assessed with metrics such as correlation, peak signal-to-noise ratio, mean square error, entropy, energy, and contrast. The results are strong, with accuracy, entropy, correlation, and energy values of 97.4%, 7.9991, 0.0001, and 0.0153, respectively. Furthermore, the encryption scheme is highly efficient, completing in less than one second, as validated in MATLAB. These findings underline the potential of efficient and secure image encryption for secure real-time data transmission.

Item Embargo: A generalized grey model with symbolic regression algorithm and its application in predicting aircraft remaining useful life (Elsevier, 2024-07-18). Liu, Lianyi; Liu, Sifeng; Yang, Yingjie; Guo, Xiaojun; Sun, Jinghe
As a sparse-data analysis method, the grey model faces interpretability challenges in its application to uncertain systems. This study proposes a generalized grey model (GGM) based on symbolic regression, designed to improve the intelligence and adaptability of grey models. The GGM serves as a unified framework that integrates various grey model families and addresses the regression problem of determining the model structure.
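For orientation, the classical GM(1,1) is the simplest member of the grey model family that the GGM generalizes; a self-contained baseline sketch (not the paper's symbolic-regression-based GGM):

```python
import math

def gm11_forecast(x0, steps=1):
    """Classical GM(1,1) grey forecast for a positive data series x0
    (a baseline member of the grey model family)."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated series (AGO)
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]   # background values
    y = x0[1:]
    # Ordinary least squares for x0(k) = b - a * z(k), solved via the
    # closed-form slope/intercept of simple linear regression.
    m, sz, sy = len(z), sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(v * w for v, w in zip(z, y))
    beta1 = (m * szy - sz * sy) / (m * szz - sz * sz)
    beta0 = (sy - beta1 * sz) / m
    a, b = -beta1, beta0
    # Whitened time response, then inverse AGO to recover the series.
    x1_hat = [(x0[0] - b / a) * math.exp(-a * k) + b / a for k in range(n + steps)]
    x0_hat = [x1_hat[k] - x1_hat[k - 1] for k in range(1, n + steps)]
    return x0_hat[-steps:]
```

On near-exponential data (the regime grey models target) the one-step-ahead forecast tracks the true continuation closely.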
Symbolic regression in the GGM identifies symbolic input-output relationships, offering an interpretable approach to structure determination. By leveraging the non-uniqueness principle of grey system theory and employing structural penalty parameters, the model balances complexity and interpretability. A comparative analysis between the GGM and conventional grey function models is conducted, focusing on differences in modeling, structure identification, and parameter optimization. Validation on the M3 competition dataset demonstrates the GGM's superior performance, with a significant reduction in prediction error compared to other grey forecasting models. Additionally, a rigorous analysis of aircraft lifespan data underscores the robustness and accuracy of the GGM in practical engineering applications.

Item Metadata only: A Hybrid Classification and Identification of Pneumonia Using African Buffalo Optimization and CNN from Chest X-Ray Images (Tech Science Press, 2023-12-15). Alalwan, Nasser; Taloba, Ahmed I.; Abozeid, Amr; Alzahrani, Ahmed Ibrahim; Al-Bayatti, Ali Hilal
Pneumonia is an illness that causes inflammation in the lungs. With so much information available from various X-ray images, diagnosing pneumonia has typically proven challenging, and numerous approaches have been devised to improve image quality and speed up diagnosis. The Convolutional Neural Network (CNN) has achieved outstanding success in identifying and diagnosing diseases in medicine and radiology, but existing methods can be complex, inefficient, and imprecise when analyzing large datasets. In this paper, a new hybrid method for the automatic classification and identification of pneumonia from chest X-ray images is proposed. The proposed method (ABO-CNN) uses the African Buffalo Optimization (ABO) algorithm to enhance CNN performance and accuracy.
A Weinmed filter is employed in pre-processing to remove unwanted noise from the chest X-ray images, followed by feature extraction using the Grey Level Co-occurrence Matrix (GLCM) approach. Relevant features are then selected from the dataset using the ABO algorithm, and finally a high-performance deep learning CNN performs the classification and identification of pneumonia. Experimental results on various datasets show that ABO-CNN outperforms other approaches on the classification tasks, achieving 96.95% accuracy, 88% precision, 86% recall, and an 86% F1-score.
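The GLCM used for feature extraction above is a standard texture descriptor; a minimal construction (illustrative, not the paper's exact level/offset configuration):

```python
def glcm(img, levels, dx=1, dy=0):
    """Grey Level Co-occurrence Matrix: counts how often a pixel with grey
    level i is followed, at offset (dx, dy), by a pixel with grey level j."""
    h, w = len(img), len(img[0])
    m = [[0] * levels for _ in range(levels)]
    for r in range(h - dy):
        for c in range(w - dx):
            m[img[r][c]][img[r + dy][c + dx]] += 1
    return m
```

Texture features such as contrast, energy, and homogeneity are then computed as weighted sums over this matrix.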