School of Computer Science and Informatics
Browsing School of Computer Science and Informatics by Title
Now showing 1 - 20 of 3810
Item Embargo
20 years of ETHICOMP: time to celebrate? (Elsevier, 2015-08-10) Stahl, Bernd Carsten, 1968-; Ess, C. M.
Purpose – The purpose of this paper is to give an introduction to the special issue by providing background on the ETHICOMP conference series and a discussion of its role in the academic debate on ethics and computing. It describes the context that influenced the launch of the conference series and highlights its unique features. Finally, it provides an overview of the papers in the special issue.
Design/methodology/approach – The paper combines a historical account of ETHICOMP with a review of the existing papers.
Findings – ETHICOMP is one of the well-established conference series (alongside IACAP and CEPE) focused on ethical issues of information and computing. Its special features include: multidisciplinarity and diversity of contributors and contributions; explicit outreach to professionals whose work is to design, build, deploy and maintain specific computing applications in the world at large; creation of knowledge that is accessible and relevant across fields and disciplines; the intention of making a practical difference to the development, use and policy of computing principles and artefacts; and the creation of an inclusive, supportive and nurturing community across traditional knowledge silos.
Originality/value – The paper is the first to explicitly define the nature of ETHICOMP, which is an important building block in the future development of the conference series and will contribute to the further self-definition of the ETHICOMP community.
Keywords: Ethics, Computer ethics, Computer science
Paper type: Viewpoint

Item Metadata only
2022 Index, IEEE Transactions on Artificial Intelligence, Vol. 3 (IEEE, 2022-12-13) Aafaq, N.; Elizondo, David

Item Metadata only
2PROM: A two-phase image retrieval optimization on dataspace using predictive modeling (IEEE, 2012) Fanzou Tchuissang, G. N.; Wang, N.; Kuicheu, N. C.; Siewe, Francois; Xu, D.

Item Open Access
3D fast convex-hull-based evolutionary multiobjective optimization algorithm (Elsevier, 2018-06) Zhao, Jiaqi; Jiao, Licheng; Liu, Fang; Fernandes, Vitor Basto; Yevseyeva, Iryna; Xia, Shixong; Emmerich, Michael T. M.
The receiver operating characteristic (ROC) and detection error tradeoff (DET) curves have been widely used in the machine learning community to analyze the performance of classifiers. The area (or volume) under the convex hull has been used as a scalar indicator for the performance of a set of classifiers in ROC and DET space. Recently, the 3D convex-hull-based evolutionary multiobjective optimization algorithm (3DCH-EMOA) was proposed to maximize the volume of the convex hull for binary classification combined with parsimony, and for three-way classification problems. However, 3DCH-EMOA consumes considerable computational resources due to redundant convex hull calculations and frequent execution of non-dominated sorting. In this paper, we introduce incremental convex hull calculation and a fast replacement for non-dominated sorting. While achieving results of the same high quality, the computational effort of 3DCH-EMOA can be reduced by orders of magnitude: the average time complexity per generation is reduced from O(n² log n) to O(n log n), where n is the population size. Six test problems are used to assess the performance of the newly proposed method, and the algorithms are compared to several state-of-the-art algorithms, including NSGA-III and RVEA, which had not previously been compared with 3DCH-EMOA. Experimental results show that the new version of the algorithm (3DFCH-EMOA) can speed up 3DCH-EMOA by about 30 times for a typical population size of 300 without reducing the performance of the method.
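The scalar indicator described above, the area under the convex hull of classifier points in ROC space, can be sketched for the 2D case (the paper works with the 3D volume analogue). This is an illustrative reimplementation, not the authors' code:

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_area(points):
    """Shoelace area of the convex hull: the scalar quality indicator."""
    hull = convex_hull(points)
    if len(hull) < 3:
        return 0.0
    area = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Classifiers as (false-positive rate, true-positive rate) points, with ROC corners.
classifiers = [(0.0, 0.0), (0.1, 0.7), (0.3, 0.9), (1.0, 1.0)]
```

For these four points the hull area is 0.36; 3DFCH-EMOA's speedup comes from updating such hulls incrementally rather than recomputing them from scratch in every generation.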
In addition, the proposed algorithm is applied to neural network pruning, and several UCI datasets are used to test its performance.

Item Open Access
3D non-invasive inspection of the skin lesions by close-range and low-cost photogrammetric techniques (International Society for Stereology & Image Analysis, 2017) Orun, A.; Goodyer, E. N.; Smith, Geoff
In dermatology, one of the most common causes of skin abnormality is an unusual change in skin lesion structure, which may exhibit very subtle physical deformation of its 3D shape. However, the geometrical sensitivity of current cost-effective inspection and measurement methods may not be sufficient to detect such small progressive changes at micro-scale. Our proposed method could provide a low-cost, non-invasive, compact solution that overcomes these shortcomings by using close-range photogrammetric imaging techniques to build a 3D surface model for continuous observation of subtle changes in skin lesions and other features.

Item Embargo
3D Object Reconstruction with Deep Learning (Springer, 2024-05-06) Aremu, Stephen S.; Taherkhani, Aboozar; Liu, Chang; Yang, Shengxiang
Recent advancements and breakthroughs in deep learning have accelerated rapid development in the field of computer vision. Following huge successes in 2D object perception and detection, much progress has also been made in 3D object reconstruction. Since humans can infer and relate to the 3D world from just a single-view 2D image of an object, it is necessary to train computers to think in 3D to achieve some key applications of computer vision. The use of deep learning for 3D object reconstruction from single-view images is rapidly evolving and recording significant results. In this research, we explore Facebook's well-known hybrid approach, Mesh R-CNN, which combines voxel generation and triangular mesh reconstruction to generate the 3D mesh structure of an object from a 2D single-view image.
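Classic Laplacian smoothing, which the follow-up to this mesh pipeline applies, moves each vertex a fraction lam toward the centroid of its neighbours. A minimal sketch on a toy mesh (illustrative only; the vertex and neighbour representation here is my assumption, not the Mesh R-CNN-LS implementation):

```python
def laplacian_smooth(vertices, neighbours, lam=0.5, iterations=10):
    """Umbrella-operator smoothing: per iteration, every vertex moves a
    fraction lam toward the centroid of its neighbours. Vertices with no
    listed neighbours stay fixed."""
    verts = [list(v) for v in vertices]
    for _ in range(iterations):
        new_verts = []
        for i, v in enumerate(verts):
            if not neighbours[i]:
                new_verts.append(v)
                continue
            k = len(neighbours[i])
            cx = sum(verts[j][0] for j in neighbours[i]) / k
            cy = sum(verts[j][1] for j in neighbours[i]) / k
            cz = sum(verts[j][2] for j in neighbours[i]) / k
            new_verts.append([v[0] + lam * (cx - v[0]),
                              v[1] + lam * (cy - v[1]),
                              v[2] + lam * (cz - v[2])])
        verts = new_verts
    return verts

# A spiky vertex (index 0) surrounded by four fixed, coplanar neighbours:
# smoothing pulls the spike down toward the neighbours' centroid.
verts = [[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [-1.0, 0.0, 0.0],
         [0.0, 1.0, 0.0], [0.0, -1.0, 0.0]]
nbrs = [[1, 2, 3, 4], [], [], [], []]
smoothed = laplacian_smooth(verts, nbrs)
```

The spike's height halves each iteration here, which is exactly the flattening behaviour used to suppress rough, self-intersecting mesh regions.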
Although Mesh R-CNN achieves reconstruction of objects with varying geometry and topology, mesh quality suffers from topological errors such as self-intersection, causing non-smooth and rough mesh generation. In this research, Mesh R-CNN with Laplacian Smoothing (Mesh R-CNN-LS) is proposed, using a Laplacian smoothing and regularization algorithm to refine the non-smooth and rough mesh. The proposed Mesh R-CNN-LS helps to constrain triangle deformation and generates a better, smoother 3D mesh. It was compared with the original Mesh R-CNN on the Pix3D dataset and showed better performance in terms of loss and average precision score.

Item Metadata only
3D Sound Simulation over Headphones (Information Science Reference, 2009) Picinali, Lorenzo

Item Embargo
3D-MSFC: A 3D multi-scale features compression method for object detection (Elsevier, 2024-11-17) Li, Zhengxin; Tian, Chongzhen; Yuan, Hui; Lu, Xin; Malekmohamadi, Hossein
As machine vision tasks rapidly evolve, a new concept of compression, namely video coding for machines (VCM), has emerged. However, current VCM methods are only suitable for 2D machine vision tasks. With the popularization of autonomous driving, demand for 3D machine vision tasks has increased significantly, leading to explosive growth in LiDAR data that requires efficient transmission. To address this need, we propose a machine vision-based point cloud coding paradigm inspired by VCM. Specifically, we introduce a 3D multi-scale features compression (3D-MSFC) method tailored for 3D object detection. Experimental results demonstrate that 3D-MSFC achieves less than a 3% degradation in object detection accuracy at a compression ratio of 2796×. Furthermore, its low-profile variant, 3D-MSFC-L, achieves less than a 2% degradation in accuracy at a compression ratio of 463×.
These results indicate that the proposed method provides an ultra-high compression ratio with no significant drop in accuracy, greatly reducing the amount of data transmitted for each detection. This can significantly lower bandwidth consumption and save substantial costs in application scenarios such as smart cities.

Item Embargo
A 6(4) optimized embedded Runge–Kutta–Nyström pair for the numerical solution of periodic problems (Elsevier, 2014-07-30) Anastassi, Zacharias; Kosti, Athinoula A.
In this paper an optimization of the six-stage non-FSAL embedded RKN 6(4) pair of Moawwad El-Mikkawy and El-Desouky Rahmo is presented. The new method is derived by applying phase-fitting and amplification-fitting, and has variable coefficients. The preservation of the algebraic order is verified and the principal term of the local truncation error is evaluated. Furthermore, a periodicity analysis is performed, which reveals that the new method is “almost” P-stable. The efficiency of the new method is measured via the integration of several initial value problems.

Item Open Access
9 Squares: Framing Data Privacy Issues (Winchester University Press, 2017-04) Boiten, Eerke Albert
In order to frame discussions of data privacy in varied contexts, this paper introduces a categorisation of personal data along two dimensions. Each of the nine resulting categories offers a significantly different flavour of data privacy issues. Some issues can also be perceived as a tension along a boundary between different categories. The first dimension is data ownership: who holds or publishes the data. The three possibilities are “me”, i.e. the data subject; “us”, where the data subject is part of a community; and “them”, where the data subject is indeed a subject only. The middle category contains social networks as the most interesting instance.
The amount of control for the data subject moves from complete control in the “me” category to very little in the “them” square, although the other dimension also plays a role in that. The second dimension focuses on the type of personal data recorded and also has three possibilities: “attributes” are what would traditionally be found in databases, and what one might think of first for “data protection”. The second type of data is “stories”: personal data (explicitly) produced by the data subjects, such as emails, pictures, and social network posts. The final type is “behaviours”: (implicitly) generated personal data, such as locations and browsing histories. The data subject has very little control over this data, even in the “us” category. This lack of control, which is closely related to the business models of the “us” category, is likely the major data privacy problem of our time.

Item Open Access
A bi-objective low-carbon economic scheduling method for cogeneration system considering carbon capture and demand response (Elsevier, 2023-12-14) Pang, Xinfu; Wang, Yibao; Yang, Shengxiang; Cai, Lei; Yu, Yang
Carbon capture and storage (CCS), energy storage (ES), and demand response (DR) mechanisms are introduced into a cogeneration system to enhance its ability to absorb wind energy, reduce carbon emissions, and improve operational efficiency. First, a bi-objective low-carbon economic scheduling model of a cogeneration system considering CCS, ES, and DR was developed. In this model, the ES and CCS remove the coupling between power generation and heating. The DR mechanism, which is based on the time-of-use electricity price and heating comfort, further enhances the flexibility of the system. In addition, an improved bare-bones multi-objective particle swarm optimisation (IBBMOPSO) algorithm was designed to directly obtain the Pareto front of the low-carbon economic scheduling model.
The particle position update mode was improved to balance global and local search capabilities across the various search stages. The Taguchi method was used to calibrate the algorithm parameters. The inverted generational distance (IGD), hypervolume (HV), and maximum spread (MS) were used to evaluate the distribution and convergence performance of the algorithm. An improved technique for order preference by similarity to an ideal solution (TOPSIS) was utilised to obtain the optimal compromise solution. Finally, the proposed method was tested on a cogeneration system in Northeast China. According to the comparison results, the average economic cost of the cogeneration system considering CCS, ES, and DR was reduced by approximately 1.13%, and carbon emissions were reduced by 6.79%. IBBMOPSO is more competitive than NSGA-II, MOWDO, MOMA, MOPSO, and BBMOPSO in low-carbon economic scheduling for the cogeneration system.

Item Open Access
A bilateral negotiation mechanism by dynamic harmony threshold for group consensus decision making (Elsevier, 2024-03-16) Cao, Mingshuo; Chiclana, Francisco; Liu, Yujia; Wu, Jian; Herrera-Viedma, Enrique
This article proposes a framework for a bilateral negotiation mechanism to deal with the case in which a concordant decision-makers (DMs) coalition cannot be constructed, resolving a limitation of existing group decision making methods. Bilateral negotiation is a process in which any two involved DMs change their own opinions based on each other's opinions, avoiding the formation of group coalitions and the coercion of individual DMs. It not only improves group consensus through interaction between individual DMs, but also considers the limited compromise behavior of DMs in the consensus bargaining process.
The key contributions of this article are: (1) it investigates the concept of a ‘harmony threshold’, combining the consensus levels of individual DMs and the number of group members, to explain the limited compromise behavior of DMs; (2) it proposes a novel bilateral negotiation consensus mechanism with personalized compromise behavior, with the group consensus threshold as the objective function and personalized harmony thresholds as constraints, to help any two discordant DMs partly adopt each other's opinions; and (3) it develops the ranking difference level (RDL) to measure the deviation between the final ranking of alternatives and all the DMs' original rankings. The research found that the proposed mechanism can reduce consensus cost by 40% and ranking difference by 5%.

Item Open Access
A cluster prediction strategy with the induced mutation for dynamic multi-objective optimization (Elsevier, 2024-01-25) Xu, Kangyu; Xia, Yizhang; Zou, Juan; Hou, Zhanglu; Yang, Shengxiang; Hu, Yaru; Liu, Yuan
Dynamic multi-objective optimization problems (DMOPs) are multi-objective optimization problems in which at least one objective and/or related parameter vary over time. The challenge of solving DMOPs is to efficiently and accurately track the true Pareto-optimal set when the environment changes. However, many existing prediction-based methods overlook the distinct movement directions of individuals and the information available in the objective space, leading to biased predictions and misleading the subsequent search process. To address this issue, this paper proposes a prediction method called IMDMOEA, which relies on cluster center points and induced mutation. Specifically, employing linear prediction based on cluster center points in the decision space enables the algorithm to rapidly capture the population's evolutionary direction and distributional shape.
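Center-based linear prediction of the kind just described can be sketched as follows. This is a generic illustration (IMDMOEA's actual strategy clusters the population first and predicts per cluster; the function names are mine):

```python
def center(pop):
    """Centroid of a set of decision vectors."""
    n = len(pop)
    return [sum(sol[d] for sol in pop) / n for d in range(len(pop[0]))]

def predict_population(pop, prev_center, curr_center):
    """Linear prediction: translate every solution by the center's last
    movement, x' = x + (c_t - c_{t-1}), to seed the search after an
    environment change."""
    step = [c - p for c, p in zip(curr_center, prev_center)]
    return [[x + s for x, s in zip(sol, step)] for sol in pop]

# Two snapshots of a tiny population before and after a change: the center
# moved by (+1, +1), so every solution is shifted by that step.
pop_prev = [[0.0, 0.0], [2.0, 0.0]]
pop_curr = [[1.0, 1.0], [3.0, 1.0]]
seeded = predict_population(pop_curr, center(pop_prev), center(pop_curr))
```

Applying the same translation to every individual is what the abstract criticises as "overlooking distinct movement directions"; clustering first lets each cluster carry its own step.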
Additionally, to enhance the algorithm's adaptability to significant environmental changes, the induced mutation strategy corrects the population's evolutionary direction by selecting promising individuals for mutation based on the predicted Pareto front in the objective space. These two complementary strategies enable the algorithm to respond faster and more effectively to environmental changes. Finally, the proposed algorithm is evaluated on the JY, dMOP, FDA, and F test suites. The experimental results demonstrate that IMDMOEA competes favorably with other state-of-the-art algorithms.

Item Open Access
A coevolutionary algorithm with detection and supervision strategy for constrained multiobjective optimization (IEEE, 2024-06-19) Feng, Jian; Liu, Shaoning; Yang, Shengxiang; Zheng, Jun; Xiao, Qi
Balancing objectives and constraints is challenging when addressing constrained multiobjective optimization problems (CMOPs). Existing methods may have limitations in handling various CMOPs due to the complex geometries of the Pareto front (PF), a complexity that arises from the constraints narrowing the feasible region. Categorizing problems based on their geometric characteristics helps to address this challenge. For this purpose, this article proposes a novel constrained multiobjective optimization framework with detection and supervision phases, called COEA-DAS. The framework categorizes problems into four types based on the overlap between the obtained approximate unconstrained PF and the constrained PF, to guide the coevolution of two populations. In the detection phase, the detection population approaches the unconstrained PF, ignoring the constraints. The main population is guided by the detection population to cross infeasible barriers and approximate the constrained PF. In the supervision phase, specialized evolutionary mechanisms are designed for each possible problem type.
The detection population continues to evolve to assist the main population in spreading along the constrained PF. Meanwhile, the supervision strategy reevaluates the problem type based on the evolutionary state of the populations. This idea of balancing constraints and objectives according to the type of problem provides a novel approach to addressing CMOPs more effectively. Experimental results on 57 benchmark problems and 12 real-world CMOPs indicate that the proposed algorithm performs as well as or better than eight state-of-the-art algorithms.

Item Embargo
A coevolutionary Q-learning-based memetic algorithm for distributed assembly heterogeneous flexible flowshop scheduling (Elsevier, 2025-05-17) Deng, Jiawen; Zhang, Jihui; Yang, Shengxiang
With the continuous development of advanced technologies, turbulent market environments and the heterogeneity of customer requirements make manufacturing more complicated; efficient organization of distributed production and assembly is therefore indispensable. However, little research considers setup time, transportation, and learning effect simultaneously, although they frequently influence assembly efficiency. In this paper, a distributed assembly heterogeneous flexible flowshop scheduling problem with sequence-dependent setup time, transportation, and learning effect (DAHFFS-STL) is investigated. Meanwhile, timely completion is imperative for a firm's corporate reputation. A cooperative Q-learning-based memetic algorithm (CQLMA) is therefore devised to tackle this problem, minimizing the total weighted earliness and tardiness (TWET). In CQLMA, first, a group of constructive heuristics is utilized to initialize the population. Second, multiple crossover and mutation operations are implemented to expand the search space and improve the global search ability. Third, tabu search is utilized to further strengthen the exploitation ability.
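A Q-learning layer for choosing among such search operators typically keeps a table of state-operator values, picks operators epsilon-greedily, and rewards choices that improve the population. The sketch below shows that generic pattern; the state encoding, reward, and parameter values are assumptions, not CQLMA's exact design:

```python
import random

def select_operator(q, state, operators, epsilon=0.1):
    """Epsilon-greedy: usually exploit the best-valued operator, sometimes explore."""
    if random.random() < epsilon:
        return random.choice(operators)
    return max(operators, key=lambda op: q.get((state, op), 0.0))

def update_q(q, state, op, reward, next_state, operators, alpha=0.1, gamma=0.9):
    """One-step Q-learning update of the state-operator value table."""
    best_next = max(q.get((next_state, o), 0.0) for o in operators)
    old = q.get((state, op), 0.0)
    q[(state, op)] = old + alpha * (reward + gamma * best_next - old)

# Toy loop: operator "opA" keeps improving the population, "opB" never does,
# so the table learns to prefer "opA" in the (hypothetical) "stalled" state.
q, ops = {}, ["opA", "opB"]
for _ in range(50):
    update_q(q, "stalled", "opA", reward=1.0, next_state="stalled", operators=ops)
    update_q(q, "stalled", "opB", reward=0.0, next_state="stalled", operators=ops)
```

In a memetic algorithm, the reward would typically come from the fitness improvement an operator produced in the last generation.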
Subsequently, a Q-learning algorithm is leveraged to dynamically select suitable operators, thereby enhancing the optimization ability of CQLMA. Finally, exhaustive comparisons confirm that CQLMA achieves significantly better performance in handling DAHFFS-STL.

Item Open Access
A Compact 3-Stage Pipelined Hardware Accelerator for Point Multiplication of Binary Elliptic Curves Over GF(2^233) (IEEE, 2024-10-24) Rehman, Mujeeb Ur; Hazzazi, Mohammad Mazyad; Rashid, Muhammad; Jamal, Sajjad Shaukat; Alblehai, Fahad; Nooh, Sameer
This paper presents an area-compact hardware architecture for point multiplication (PM) computation on elliptic curves over the binary field GF(2^233). We have utilized two approaches, at a cost in clock cycles, to reduce the hardware area. First, we have revisited the Montgomery PM algorithm (for hardware implementation) using a memory block of size 6 × m. Second, we reuse the square and multiplier blocks of the arithmetic unit for the computation of the modular inversion. To optimize the critical path of the design, we have used pipeline registers in the arithmetic and logic unit of the proposed PM architecture. We have also proposed re-scheduling of point addition and doubling instructions for a 3-stage pipelined architecture. The implementations are provided on an ARTY-7 field-programmable gate array (FPGA) board with an XC7100TCSG324-3 package device. The proposed architecture utilizes 1076 slices, 3269 look-up tables, and 1359 flip-flops. It requires 337880 clock cycles for one PM computation and operates at a maximum frequency of 371 MHz, requiring 910.72 μs per PM computation. The design's power consumption is 457 mW, achieving a maximum throughput of 1 Kbps.
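The Montgomery PM algorithm referenced above processes the scalar bit by bit with one point addition and one doubling per bit, which is the regular structure that makes a fixed pipelined schedule possible. Below is a structural sketch over a toy prime-field curve in Python (the paper's design works over the binary field GF(2^233) in hardware; the curve and constants here are made up for illustration only):

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97), chosen only to show the ladder
# structure; it is NOT the 233-bit binary curve targeted by the accelerator.
P_MOD, A = 97, 2
INF = None  # point at infinity

def inv(x):
    return pow(x, P_MOD - 2, P_MOD)  # Fermat inverse, P_MOD prime

def point_add(p, q):
    """Affine point addition (handles doubling and the point at infinity)."""
    if p is INF:
        return q
    if q is INF:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return INF
    if p == q:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P_MOD
    else:
        lam = (y2 - y1) * inv(x2 - x1) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def montgomery_ladder(k, p):
    """k*p with one addition and one doubling per scalar bit, in a fixed order."""
    r0, r1 = INF, p
    for bit in bin(k)[2:]:
        if bit == '0':
            r0, r1 = point_add(r0, r0), point_add(r0, r1)
        else:
            r0, r1 = point_add(r0, r1), point_add(r1, r1)
    return r0

P = (3, 6)  # on the toy curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
```

Because every bit triggers the same add-and-double pattern regardless of its value, the instruction sequence is data-independent, which is what the paper's re-scheduled 3-stage pipeline exploits.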
The comparisons indicate that the proposed accelerator suits applications prioritizing low hardware resource utilization over computation speed.

Item Open Access
A Confidence and Conflict-based Consensus Reaching Process for Large-scale Group Decision Making Problems with Intuitionistic Fuzzy Representations (IEEE, 2024-03-07) Ding, Ru-Xi; Yang, Bing; Yang, Guo-Rui; Li, Meng-Nan; Wang, Xueqing; Chiclana, Francisco
With the development of social democracy, the public has begun to participate in large-scale group decision making (LSGDM) events that have a significant impact on their personal interests. However, the participation of members of the public with insufficient expertise causes much hesitancy in the evaluations of decision makers (DMs), which can be captured by intuitionistic fuzzy sets. Meanwhile, as the number of DMs increases, the cost of consensus reaching processes (CRPs), which are utilized to help DMs reach a consensus, gets higher and higher. To improve the efficiency of the CRP, this paper presents a confidence and conflict-based consensus reaching process (CC-CRP) for LSGDM events with intuitionistic fuzzy representations. In the proposed model, according to the hesitancy of the DMs' intuitionistic fuzzy evaluations, an objective method that requires no extra information is first developed to calculate the confidence level of DMs. Then, a three-dimensional clustering method is designed by considering the type of conflict, the degree of conflict, and the confidence level of DMs. After this, an efficiency rate of modification is defined to select the DMs who will be persuaded first to adjust their evaluations, with recommendation plans generated by a specific optimization method. Finally, according to the clustering results, different CC-CRP management methods are applied to DMs with different attributes.
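An intuitionistic fuzzy evaluation is a pair (mu, nu) of membership and non-membership degrees with mu + nu <= 1, and its hesitancy is pi = 1 - mu - nu. A sketch of a hesitancy-based confidence measure in that spirit (the aggregation into a confidence level is my assumption for illustration; the abstract does not give the paper's exact formula):

```python
def hesitancy(mu, nu):
    """Hesitancy pi = 1 - mu - nu of an intuitionistic fuzzy value (mu, nu)."""
    assert 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0
    return 1.0 - mu - nu

def confidence_level(evaluations):
    """Confidence as one minus the DM's average hesitancy over all of their
    (mu, nu) evaluations: a fully committed DM (pi = 0 everywhere) scores 1.
    This aggregation is an illustrative assumption, not the paper's formula."""
    pis = [hesitancy(mu, nu) for mu, nu in evaluations]
    return 1.0 - sum(pis) / len(pis)
```

The appeal of such a measure, as the abstract notes, is that it is objective: it is computed from the evaluations themselves, with no extra elicitation from the DMs.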
An illustrative example and several experiments are reported to provide evidence that the proposed model is feasible and effective.

Item Embargo
A constrained multimodal multi-objective evolutionary algorithm based on adaptive epsilon method and two-level selection (Elsevier, 2025-01-15) Wang, Fengxia; Huang, Min; Yang, Shengxiang; Wang, Xingwei
Constrained multimodal multi-objective optimization problems (CMMOPs) commonly arise in practice when multiple Pareto optimal sets (POSs) correspond to one Pareto optimal front (POF). The existence of constraints and multimodal characteristics makes it challenging to design effective algorithms that promote diversity in the decision space and convergence in the objective space. Therefore, this paper proposes a novel constrained multimodal multi-objective evolutionary algorithm, namely CM-MOEA, to address CMMOPs. In CM-MOEA, an adaptive epsilon-constrained method is designed to exploit promising infeasible solutions, promoting exploration of the search space. Then, a diversity-based offspring generation method is performed to select diverse solutions for mutation, searching for more equivalent POSs. Furthermore, a two-level environmental selection strategy that combines local and global environmental selection is developed to guarantee the diversity and convergence of solutions. Finally, we design an archive update strategy that stores well-distributed excellent solutions, which approach the true POF more effectively. The proposed CM-MOEA is compared with several state-of-the-art algorithms on 17 test problems.
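Epsilon-constrained methods of the general kind mentioned here compare solutions by objective value as long as their constraint violation is below a tolerance that shrinks over the run, which is how promising infeasible solutions stay in play early on. A generic sketch (the comparison and the Takahama-Sakai-style decay schedule are assumptions, not CM-MOEA's exact adaptive rule):

```python
def violation(constraints):
    """Overall violation for constraints written as g_i(x) <= 0."""
    return sum(max(0.0, g) for g in constraints)

def epsilon_better(a, b, eps):
    """Epsilon-constrained comparison of (fitness, violation) pairs: while both
    violations are within eps (or are equal), plain fitness decides; otherwise
    the less-violating solution wins. Shown single-objective for brevity."""
    fa, va = a
    fb, vb = b
    if (va <= eps and vb <= eps) or va == vb:
        return fa < fb
    return va < vb

def shrink_eps(eps0, gen, max_gen, cp=5.0):
    """A common schedule: the tolerance decays to 0 by the end of the run,
    so the final population is driven toward feasibility."""
    return eps0 * (1.0 - gen / max_gen) ** cp
```

Early in the run a slightly infeasible solution with better fitness beats a feasible but worse one; once eps reaches 0 the comparison reduces to the usual feasibility-first rule.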
The experimental results demonstrate that the proposed CM-MOEA has significant advantages in solving CMMOPs.

Item Metadata only
A data-driven approach to student support using formative feedback and targeted interventions (Routledge, 2025-03-26) Coupland, Simon; Fahy, Conor; Stuart, Graeme; Allman, Zoe
De Montfort University (DMU) has approximately 30,000 registered students, primarily at its Leicester campus in the United Kingdom (UK) but also at campuses internationally, as well as at UK-based and transnational education partners. Based in Leicester, DMU's community was particularly hit by the impact of COVID-19: Leicester was the first city to be placed in local lockdown, extending the period of lockdown beyond the broader national experience. The approaches described in this case study were motivated by the need to capture information about student progress in the lockdown-necessitated online environment, but they have been equally impactful in in-person classroom teaching. In the subject area of computer games programming (CGP) at levels 5 and 6, students are required to use theoretical underpinning to develop solutions to practical problems, often demonstrating mastery of learning through completing a single, large piece of coursework over a medium-to-long timeframe, usually three to five months. Through the learning and assessment journey, students plan, meet, and reprioritise a series of dynamic sub-objectives. This all takes place during weekly timetabled workshops, where most of the valuable learning occurs. These are student- and assessment-centred learning environments in which learners, facilitated by tutors, incrementally develop their coursework projects. These workshops are natural opportunities to monitor engagement and to provide instant, formative feedback personalised to the learner and directly related to assessment.
CGP as a discipline attracts students with a wide range of learning preferences and differences; the classical approach of one-to-one in-person tutoring may not be the best approach for these students (Amoako et al., 2013). Additionally, the temporary move to online teaching necessitated by the COVID-19 pandemic meant this established approach was not possible. Continual support and feedback are critical in an online setting and are facilitated through sustained interaction between tutor and learner (Gikandi et al., 2011). Maintaining this interactivity is important, and it has been observed that continual documentation and sharing of learner-created artefacts is a key feature of meaningful interactivity (Gikandi & Morrow, 2016). In response, the CGP team has developed a suite of innovative tools and processes to facilitate the real-time monitoring of student progress through digital artefacts and their associated metadata. This approach provides students with timely formative feedback at key milestones in their progress and facilitates interventions for students requiring additional support to engage fully for best attainment. The approach is grounded in constructivist theories of learning: the individual learner is at the centre of the process, and the feedback process is an iterative, continuous part of learning (Carless et al., 2011; Molloy, 2014).

Item Open Access
A decomposition-based evolutionary algorithm with clustering and hierarchical estimation for multi-objective fuzzy flexible jobshop scheduling (IEEE, 2024-01-26) Zhang, Xuwei; Liu, Shixin; Zhao, Ziyan; Yang, Shengxiang
As effective approximation algorithms for multi-objective jobshop scheduling, multi-objective evolutionary algorithms (MOEAs) have received extensive attention.
However, maintaining a balance between the diversity and convergence of non-dominated solutions while ensuring overall convergence is an open problem in the context of solving Multi-objective Fuzzy Flexible Jobshop Scheduling Problems (MFFJSPs). To address this, we propose a new MOEA, named MOEA/DCH, by introducing a hierarchical estimation method, a clustering-based adaptive decomposition strategy, and a heuristic-based initialization method into a basic decomposition-based MOEA. Specifically, the hierarchical estimation method balances the convergence and diversity of non-dominated solutions by integrating Pareto dominance and scalarization function information. The clustering-based adaptive decomposition strategy enhances the population's ability to approximate a complex Pareto front. The heuristic-based initialization method provides high-quality initial solutions. The performance of MOEA/DCH is verified against five competitive MOEAs on widely used benchmark datasets. Empirical results demonstrate the effectiveness of MOEA/DCH in balancing the diversity and convergence of non-dominated solutions while ensuring overall convergence.
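Decomposition-based MOEAs of the kind used as a baseline here turn a multi-objective problem into scalar subproblems via weight vectors; the weighted Tchebycheff function is the standard scalarization choice (shown as a generic illustration; the abstract does not specify MOEA/DCH's exact scalarization):

```python
def tchebycheff(objs, weights, ideal):
    """Weighted Tchebycheff scalarization: g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|.
    Smaller is better for the subproblem defined by the weight vector w."""
    return max(w * abs(f - z) for f, w, z in zip(objs, weights, ideal))

# Two candidate solutions judged on the subproblem with weights (0.5, 0.5)
# and ideal point (0, 0): the more balanced solution scores lower and wins.
g1 = tchebycheff([2.0, 4.0], [0.5, 0.5], [0.0, 0.0])
g2 = tchebycheff([3.0, 3.0], [0.5, 0.5], [0.0, 0.0])
```

Each weight vector thus pulls the population toward a different region of the Pareto front, which is the information the hierarchical estimation method combines with Pareto dominance.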