Recent Submissions

  • Criminal networks analysis in missing data scenarios through graph distances

    Ficara, Annamaria; Cavallaro, Lucia; Curreri, Francesco; Fiumara, Giacomo; De Meo, Pasquale; Bagdasar, Ovidiu; Song, Wei; Liotta, Antonio; University of Palermo, Palermo, Italy; University of Messina, Messina, Italy (Public Library of Science (PLoS), 2021-08-11)
    Data collected in criminal investigations may suffer from issues such as: (i) incompleteness, due to the covert nature of criminal organizations; (ii) incorrectness, caused by either unintentional data-collection errors or intentional deception by criminals; (iii) inconsistency, when the same information is entered into law enforcement databases multiple times, or in different formats. In this paper we analyze nine real criminal networks of different kinds (Mafia networks, criminal street gangs and terrorist organizations) in order to quantify the impact of incomplete data and to determine which network type is most affected by it. The networks are first pruned using two specific methods: (i) random edge removal, simulating the scenario in which the Law Enforcement Agencies fail to intercept some calls or to spot sporadic meetings among suspects; (ii) node removal, modeling the situation in which some suspects cannot be intercepted or investigated. We then compute spectral distances (Adjacency, Laplacian and normalized Laplacian spectral distances) and matrix distances (Root Euclidean Distance) between the complete and pruned networks, and compare them using statistical analysis. Our investigation identifies two main findings: first, the overall understanding of the criminal networks remains high even with incomplete data on criminal interactions (i.e., when 10% of edges are removed); second, failing to investigate even a small fraction of suspects (i.e., removing 2% of nodes) may lead to significant misinterpretation of the overall network.
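    A minimal sketch of the kind of comparison the study above describes, assuming networkx and numpy; the Erdős–Rényi graph and the 10% edge-removal rate are illustrative stand-ins, not the paper's nine case-study networks, and only the adjacency spectral distance is shown.

```python
# Illustrative sketch (not the paper's code): adjacency spectral distance between
# a network and a randomly edge-pruned copy, mirroring the "missed interactions"
# scenario. The Erdős–Rényi graph and 10% removal rate are placeholders.
import random

import networkx as nx
import numpy as np

def adjacency_spectral_distance(g1, g2):
    """Euclidean distance between the sorted adjacency spectra of two graphs."""
    n = max(g1.number_of_nodes(), g2.number_of_nodes())
    def spectrum(g):
        a = np.zeros((n, n))                       # zero-pad so spectra are comparable
        a[:g.number_of_nodes(), :g.number_of_nodes()] = nx.to_numpy_array(g)
        return np.sort(np.linalg.eigvalsh(a))
    return float(np.linalg.norm(spectrum(g1) - spectrum(g2)))

random.seed(0)
g = nx.erdos_renyi_graph(100, 0.05, seed=0)        # stand-in for a criminal network
pruned = g.copy()
pruned.remove_edges_from(random.sample(list(pruned.edges()),
                                       k=int(0.10 * pruned.number_of_edges())))
print(adjacency_spectral_distance(g, pruned))
```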
  • A systematic literature review of machine learning applications for community-acquired pneumonia

    Lozano-Rojas, Daniel; Free, Robert C.; McEwan, Alistair A.; Woltmann, Gerrit; University of Leicester; University of Derby; University Hospitals of Leicester NHS Trust, Leicester (Springer, 2021-08-15)
    Community-acquired pneumonia (CAP) is an acute respiratory disease with a high mortality rate. CAP management follows clinical and radiological diagnosis, severity evaluation and standardised treatment protocols. Although established in practice, protocols are labour intensive, time-critical and can be error prone, as their effectiveness depends on clinical expertise. Thus, an approach for capturing clinical expertise in a more analytical way is desirable in terms of cost, expediency and patient outcome. This paper presents a systematic literature review of Machine Learning (ML) applied to CAP. A search of three scholarly international databases revealed 23 relevant peer-reviewed studies, which were categorised and evaluated relative to clinical output. Results show interest in the application of ML to CAP, particularly in image processing for diagnosis, and an opportunity for further investigation in the application of ML, both for patient outcome prediction and treatment allocation. We conclude our review by identifying potential areas for future research in applying ML to improve CAP management. This research was co-funded by the NIHR Leicester Biomedical Research Centre and the University of Leicester.
  • A hardware implementation of 6DOF quadcopter MATLAB/Simulink controller algorithm to an autopilot

    Bousbaine, Amar; Fareha, A.; Joseph, A. K.; University of Derby (Institution of Engineering and Technology, 2021-01)
    This paper presents a hardware implementation of a control algorithm for a 6DOF quadcopter, developed in MATLAB/SIMULINK, on an autopilot microcontroller (Pixhawk) using the MATLAB/SIMULINK Embedded Coder. After validation of the SIMULINK controller model through software simulation, the designed controller is converted into C/C++ and uploaded to the Pixhawk autopilot by creating a SIMULINK application within the autopilot firmware. The paper presents a rapid, real-world test solution for a quadcopter control system using the Pixhawk autopilot, which allows further adjustment of the control parameters on real hardware. This capability is used in this research to deploy the SIMULINK code onto the Pixhawk autopilot board through the Embedded Coder tool.
  • Simulink model for a hydrogen PEM fuel cell for automotive applications

    Bousbaine, A.; Wilson, D.; Andrade, J.; University of Derby (Institution of Engineering and Technology, 2021-01)
    Fuel cells have a relatively high energy density and use hydrogen, a renewable energy source. They are among the promising renewable and sustainable power sources that can serve as a clean power source for applications such as transportation. In conjunction with large supercapacitors, fuel cells can deliver high power density with a fast dynamic response, which is ideal for automotive applications. In order to design a highly efficient fuel cell system for automotive applications, an optimised model of the multi-level DC-DC converter, fuel cell and supercapacitor is required. An analytical model of the fuel cell has been developed in order to model the interface of the fuel cell, supercapacitor and drive train to the interleaved DC-DC converter. This paper deals with the development of a detailed fuel cell model in MATLAB/Simulink using the parameters of a Ballard Mk-V fuel cell stack. The simulation results are presented and discussed, and the validity of the developed model is ascertained.
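    For readers wanting to experiment outside Simulink, the sketch below reproduces the textbook static polarization form (open-circuit voltage minus activation, ohmic and concentration losses) that analytical PEM models of this kind build on; all coefficients are illustrative placeholders, not the Ballard Mk-V parameters used in the paper.

```python
# Generic static PEM fuel cell polarization sketch (textbook loss terms;
# every coefficient here is an illustrative placeholder, not Ballard Mk-V data).
import numpy as np

def cell_voltage(j, e_oc=1.2, a=0.05, j0=3e-4, r=0.2, b=0.05, j_lim=1.4):
    """Cell voltage (V) at current density j (A/cm^2)."""
    j = np.clip(j, 1e-6, j_lim - 1e-6)            # keep the logarithms defined
    v_act = a * np.log(j / j0)                    # activation loss
    v_ohm = r * j                                 # ohmic loss (r in ohm*cm^2)
    v_conc = -b * np.log(1.0 - j / j_lim)         # concentration loss
    return e_oc - v_act - v_ohm - v_conc

j = np.linspace(0.01, 1.3, 5)
for ji, v in zip(j, cell_voltage(j)):
    print(f"{ji:.2f} A/cm^2 -> {v:.3f} V per cell")
```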
  • Internet of Planets (IoP): A New Era of the Internet

    Kang, Byungseok; Malute, Francis; Bagdasar, Ovidiu; Hong, Choongseon; University of Derby; Kyung Hee University, Seoul, South Korea (Institute of Electrical and Electronics Engineers (IEEE), 2021-06-24)
    The Internet of Planets (IoP) is a concept that enables solar-system planets to communicate with each other using the Internet. Among the growing body of research on IoP, the delay-tolerant network (DTN) has emerged as the most advanced technology in recent years. DTN is an asynchronous networking technology designed for environments in which steady communication paths are not available; it therefore stores received data and forwards it only when communication links are established. DTN can be applied to sensor networks and mobile ad-hoc networks, as well as to space communication supporting data transmission among satellites. In DTN environments, it is crucial to use a scheme with relatively low routing overhead and high reliability to achieve efficiency. Thus, this article proposes a time (delay) information based DTN routing scheme, which predicts routing paths to achieve efficient data transmission among nodes with comparatively periodic movement patterns. Results for the proposed DTN routing algorithm, obtained using NS-3 simulation tools, indicate satisfactory routing performance in comparison with existing DTN algorithms.
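    A minimal, hypothetical illustration of the store-and-forward idea behind time-based DTN routing: each neighbour's next contact is predicted from an observed periodic contact pattern, and the bundle is handed to the neighbour expected to be reachable soonest. The classes, periods and names are invented for the example and do not reproduce the paper's algorithm or its NS-3 implementation.

```python
# Hypothetical sketch of time-based next-hop selection in a DTN node
# (illustrative only; not the paper's routing scheme).
from dataclasses import dataclass, field

@dataclass
class Neighbour:
    name: str
    period: float            # observed contact period (e.g. orbital pass interval)
    last_contact: float      # time of the most recent contact

    def next_contact(self, now: float) -> float:
        elapsed = (now - self.last_contact) % self.period
        return now + (self.period - elapsed)

@dataclass
class DtnNode:
    buffer: list = field(default_factory=list)   # stored bundles awaiting a link

    def choose_next_hop(self, neighbours, now):
        # Pick the neighbour with the smallest predicted waiting time.
        return min(neighbours, key=lambda nb: nb.next_contact(now))

node = DtnNode(buffer=["telemetry-bundle"])
neighbours = [Neighbour("orbiter-A", period=90.0, last_contact=10.0),
              Neighbour("orbiter-B", period=120.0, last_contact=100.0)]
print(node.choose_next_hop(neighbours, now=130.0).name)   # -> orbiter-A
```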
  • COVID-19 pandemic decision support system for a population defense strategy and vaccination effectiveness

    Varotsos, Costas A; Krapivin, Vladimir F; Xue, Yong; Soldatov, Vladimir; Voronova, Tatiana; National and Kapodistrian University of Athens, Athens, Greece; Kotelnikov’s Institute of Radioengineering and Electronics, Fryazino Branch, Russian Academy of Sciences, Vvedensky 1, Fryazino, Moscow Region 141190, Russian Federation; University of Mining and Technology, Xuzhou, Jiangsu 221116, PR China; University of Derby (Elsevier BV, 2021-06-05)
    The year 2020 ended with a severe COVID-19 pandemic, which traumatized many countries where lockdowns were reinstated and numerous emotional social protests erupted. According to the World Health Organization, the global epidemiological situation deteriorated further in the first months of 2021. In this paper, a decision-making support system (DMSS) is proposed as an epidemiological prediction tool. It assesses COVID-19 trends in several countries and regions, taking into account big data on important geophysical and socio-ecological characteristics and the expected capacity of medical services, including vaccination and restrictions on population migration both within countries and in international traffic. The parameters for the numerical simulations are estimated from officially delivered data, which allows verification of the theoretical results. The numerical simulations of the transmission and outcomes of COVID-19 are mainly based on a deterministic approach and on an algorithm for processing statistical data based on the instability indicator. DMSS has been shown to help predict the effects of COVID-19 depending on the protection strategies adopted, including vaccination. Numerical simulations have shown that DMSS provides results using the accompanying information in the appropriate scenario.
  • Regulating Product Sustainability

    Takhar, Raj; Takhar, Sukhraj; University of Derby (The Parliamentary Office of Science and Technology, 2021-06-10)
    Interview, written review and feedback on UK government proposals on the future of regulating products for sustainability.
  • A collaborative approach for national cybersecurity incident management

    Oriola, Oluwafemi; Adeyemo, Adesesan Barnabas; Papadaki, Maria; Kotzé, Eduan; university of Plymouth; University of Ibadan, Ibadan, Nigeria; University of the Free State, Bloemfontein, South Africa (Emerald, 2021-06-28)
    Collaboration-based national cybersecurity incident management benefits from the large volume of incident information, large-scale deployment of information security devices and the aggregation of security skills. However, no existing collaborative approach has been able to cater for the multiple regulators, divergent incident views and incident-reputation trust issues that national cybersecurity incident management presents. This paper proposes a collaborative approach to handle these issues cost-effectively. A collaboration-based national cybersecurity incident management architecture, built on the ITU-T X.1056 security incident management framework, is proposed. It is composed of a cooperative regulatory unit, with cooperative and third-party management strategies, and an execution unit, with incident handling and response strategies. Novel collaborative incident prioritization and mitigation planning models suited to incident handling in national cybersecurity incident management are proposed. A use case depicting how the approach would function within a typical information and communication technology ecosystem is illustrated. The proposed approach is evaluated using the performance of an experimental cyber-incident management system against two multistage attack scenarios. The results, based on descriptive statistics, show that the proposed approach is more reliable than existing ones and that it produces better incident impact scores and rankings than standard tools. The approach reduces total response costs by 8.33% and the false positive rate by 97.20% for the first attack scenario, and reduces total response costs by 26.67% and the false positive rate by 78.83% for the second.
  • An empirical analysis of the information security culture key factors framework

    Tolah, Alaa; Furnell, Steven; Papadaki, Maria; University of Plymouth; Saudi Electronic University, Riyadh, Saudi Arabia; University of Nottingham; University of Derby; Nelson Mandela University, Gqeberha, South Africa (Elsevier, 2021-06-05)
    Information security is a challenge facing organisations, as security breaches pose a serious threat to sensitive information. Organisations face security risks in relation to their information assets, which may also stem from their own employees. Organisations need to focus on employee behaviour to limit security failures if they wish to establish an effective security culture, with employees acting as a natural safeguard for information assets. This study was conducted in response to the need for more empirical studies that focus on the development of security culture and provide a comprehensive framework. The Information Security Culture and Key Factors Framework has been developed, incorporating two types of factors: those that influence security culture and those that reflect it. This paper validates the applicability of the framework and tests related hypotheses through an empirical study. An exploratory survey was conducted, and 266 valid responses were obtained. Phase two of the study demonstrates the framework's validity and reliability through factor analysis. Different hypothesised correlations were analysed using structural equation modelling, with the indirect exploratory effect of the moderators examined through a multi-group analysis. The findings show that the framework is valid and achieved an acceptable fit with the data. This study fills an important gap concerning the significant relationship between personality traits and security culture. It also contributes to the improvement of information security management by introducing a comprehensive framework that supports the establishment of security culture in practice. The factors are vital in justifying security culture acceptance, and the framework provides an important tool that can be used to assess and improve an organisational security culture.
  • Research on Action Strategies and Simulations of DRL and MCTS-based Intelligent Round Game

    Sun, Yuxiang; Yuan, Bo; Zhang, Yongliang; Zheng, Wanwen; Xia, Qingfeng; Tang, Bojian; Zhou, Xianzhong; Nanjing University, China; University of Derby; Army Engineering University, Nanjing, China (Springer Science and Business Media LLC, 2021-06-16)
    The reinforcement learning problem of complex action control in multiplayer online battlefield games has attracted considerable interest in the deep learning field. This problem involves more complex states and action spaces than traditional confrontation games, making it difficult to search for any strategy with human-level performance. This paper presents a deep reinforcement learning model to solve this problem from the perspective of game simulation and algorithm implementation. A reverse reinforcement learning model based on high-level players' training data is established to support downstream algorithms. With less training data, the proposed model converges more quickly and is more consistent with the action strategies of high-level players' decision-making. An intelligent deduction algorithm based on DDQN is then developed to achieve better generalization ability under the guidance of a given reward function. At the game simulation level, the paper constructs a Monte Carlo Tree Search intelligent decision model for turn-based antagonistic deduction games to generate next-step actions. Furthermore, a prototype game simulator that combines offline and online functions is implemented to verify the performance of the proposed model and algorithm. The experiments show that the proposed approach not only has good reference value for antagonistic environments with incomplete information, but is also accurate and effective in predicting the return value. Moreover, this work provides a theoretical validation platform and testbed for related research on game AI for deductive games.
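    As background for the deduction algorithm mentioned above, the sketch below shows the standard Double DQN (DDQN) target computation in PyTorch: the online network selects the next action and the target network evaluates it. The network sizes, the 32-dimensional state encoding and the reward values are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch of the Double DQN (DDQN) target (illustrative; not the paper's code).
import torch
import torch.nn as nn

class QNet(nn.Module):
    def __init__(self, state_dim=32, n_actions=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))
    def forward(self, x):
        return self.net(x)

def ddqn_target(online, target, reward, next_state, done, gamma=0.99):
    # Online net selects the greedy action, target net evaluates it (the DDQN decoupling).
    with torch.no_grad():
        best_action = online(next_state).argmax(dim=1, keepdim=True)
        next_q = target(next_state).gather(1, best_action).squeeze(1)
    return reward + gamma * (1.0 - done) * next_q

online_net, target_net = QNet(), QNet()
states = torch.randn(4, 32)                    # stand-in for encoded game states
r, d = torch.ones(4), torch.zeros(4)
print(ddqn_target(online_net, target_net, r, states, d).shape)   # torch.Size([4])
```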
  • Electro-Thermal Coupled Modeling of Induction Motor Using 2D Finite Element Method

    Bousbaine, Amar; Bouheraoua, Mustapha; Atig, M.; Benamrouche, N.; University of Derby; Université Mouloud Mammeri de Tizi Ouzou (Ştefan cel Mare University of Suceava, 2021-05-31)
    The paper evaluates the thermal behavior of an induction machine based on a coupled electromagnetic-thermal model using a 2D non-linear complex finite element method. The currents and the temperature distribution in a squirrel-cage induction motor in the transient state are investigated and presented. Particular attention is paid to the convection heat transfer coefficient between the frame and the ambient air, and to the windings. The developed method can be applied to other electric machines having negligible axial heat flow. The theoretical/simulated results have been corroborated experimentally using a 2.2 kW totally enclosed fan-cooled induction motor. The simulated results and those obtained from measurements have been critically evaluated and show good agreement.
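    The 2D finite element model itself is beyond a short example, but the electro-thermal coupling idea can be illustrated with a lumped-parameter fixed-point iteration: winding resistance depends on temperature, losses depend on resistance, and temperature depends on losses. All numbers below are placeholders, not the 2.2 kW motor's data.

```python
# Lumped-parameter sketch of the electro-thermal coupling loop (the paper uses a
# full 2D finite element model; every number here is an illustrative placeholder).
def coupled_steady_state(i_rms=5.0, r20=2.0, alpha=0.004, r_th=0.1, t_amb=25.0,
                         tol=1e-6, max_iter=200):
    """Iterate until winding temperature and copper losses are mutually consistent."""
    t = t_amb
    p_loss = 0.0
    for _ in range(max_iter):
        r = r20 * (1.0 + alpha * (t - 20.0))   # copper resistance rises with temperature
        p_loss = 3.0 * r * i_rms ** 2          # three-phase copper losses (W)
        t_new = t_amb + r_th * p_loss          # lumped thermal resistance to ambient (K/W)
        if abs(t_new - t) < tol:
            t = t_new
            break
        t = t_new
    return t, p_loss

temp, losses = coupled_steady_state()
print(f"winding temperature ~ {temp:.1f} C, copper losses ~ {losses:.1f} W")
```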
  • Nowcasting of air pollution episodes in megacities: A case study for Athens, Greece

    Varotsos, Costas A.; Mazei, Yuri; Saldaev, Damir; Efstathiou, Maria; Voronova, Tatiana; Xue, Yong; University of Athens, Athens, Greece; Lomonosov Moscow State University, Leninskiye Gory, 1, Moscow, Russia; Severtsov Institute of Ecology and Evolution, Russian Academy of Sciences, Moscow, Russia; Shenzhen MSU-BIT University, Shenzhen, China; et al. (Elsevier BV, 2021-06-02)
    The main objective of the present study is to develop a model for the prediction of extreme air pollution events in megacities using the concept of so-called "natural time" instead of the "conventional clock time". In particular, we develop a new nowcasting technique based on a statistically significant fit to the Gutenberg-Richter law of the surface concentrations of ozone (O3), particles of the size fraction less than 10 μm (PM-10) and nitrogen dioxide (NO2). Studying the air pollution over Athens, Greece during the period 2000-2018, we found that the average waiting time between successive extreme concentration values varied between atmospheric parameters, amounting to 17 days for O3, 29 days for PM-10 and 28 days for NO2. This average waiting time depends on the upper threshold chosen for the extreme concentrations of the air pollutants considered. For instance, for the NO2 concentrations over Athens, the average waiting time is 13 days for 130 μg/m3 and 2.4 years for 200 μg/m3. Remarkably, the same obedience to the Gutenberg-Richter law characterizing the extreme values of air pollution in a megacity was found earlier in other long-term ecological and paleoclimatic variables. This is a sign of self-similarity that is often observed in nature and can be used in the development of more reliable nowcasting models of extreme events.
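    A minimal sketch of the waiting-time statistic the nowcasting analysis above is built on: the mean gap between successive exceedances of an extreme-concentration threshold in a daily series. The gamma-distributed series is a synthetic stand-in, not the Athens monitoring data, and numpy is assumed.

```python
# Illustrative waiting-time calculation (synthetic data, not the study's records).
import numpy as np

def mean_waiting_time(series, threshold):
    """Mean number of days between successive exceedances of `threshold`."""
    days = np.flatnonzero(np.asarray(series) > threshold)
    if len(days) < 2:
        return np.inf
    return float(np.diff(days).mean())

rng = np.random.default_rng(1)
no2 = rng.gamma(shape=4.0, scale=20.0, size=365 * 19)   # synthetic daily NO2 (ug/m3), 19 years
for thr in (130, 200):
    print(f"threshold {thr}: mean waiting time {mean_waiting_time(no2, thr):.1f} days")
```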
  • Recommender Systems Evaluator: A Framework for Evaluating the Performance of Recommender Systems

    dos Santos, Paulo V.G.; Tardiole Kuehne, Bruno; Batista, Bruno G.; Leite, Dionisio M.; Peixoto, Maycon L.M.; Moreira, Edmilson Marmo; Reiff-Marganiec, Stephan; University of Derby; Federal University of Itajubá, Itajubá, Brazil; Federal University of Mato Grosso do Sul (UFMS), Ponta Porã, Brazil; et al. (Springer, 2021-06-05)
    Recommender systems are filters that suggest products of interest to customers, which may positively impact sales. Nowadays, there is a multitude of algorithms for recommender systems, and their performance varies widely, so choosing the most suitable option for a given situation is crucial but not trivial. In this context, we propose the Recommender Systems Evaluator (RSE): a framework aimed at accomplishing the offline performance evaluation of recommender systems. We argue that the use of a proper methodology is crucial when evaluating the available options, yet it is frequently overlooked, leading to inconsistent results. To help appraisers draw reliable conclusions, RSE is based on statistical concepts and displays results intuitively. A comparative study of classical recommendation algorithms is presented as an evaluation, highlighting RSE's critical features.
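    A toy illustration of the kind of offline, statistics-based comparison RSE is meant to standardise: per-user RMSE for two stand-in predictors compared with a paired t-test. The predictors, the random data and the use of scipy are assumptions for illustration, not RSE's API.

```python
# Toy offline comparison of two rating predictors (illustrative; not RSE's code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_users, n_items = 50, 40
true = rng.uniform(1, 5, size=(n_users, n_items))        # held-out "true" ratings

global_mean = np.full_like(true, true.mean())            # predictor A: one global average
item_mean = np.tile(true.mean(axis=0), (n_users, 1))     # predictor B: per-item averages (toy)

def per_user_rmse(pred):
    return np.sqrt(((pred - true) ** 2).mean(axis=1))

rmse_a, rmse_b = per_user_rmse(global_mean), per_user_rmse(item_mean)
t_stat, p_value = stats.ttest_rel(rmse_a, rmse_b)        # paired test across users
print(f"global-mean {rmse_a.mean():.3f} vs item-mean {rmse_b.mean():.3f}, p = {p_value:.3g}")
```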
  • Transforming product labels using digital technologies to enable enhanced traceability and management of hazardous chemicals

    Takhar, Sukhraj; Liyanage, Kapila; University of Derby (Inderscience, 2021-06-08)
    Manufacturers that produce, distribute or market physical products are likely to be impacted by numerous chemical and product regulations. Manufacturers must identify the chemical substances that appear within mixtures, materials, formulations, raw materials, components, assemblies and finished products. This results in a very manual and resource-intensive process for collecting chemicals-in-products data, where definitions arise from internal and industry standards and from supplier and customer requirements, and data are often sourced from multiple supply chain actors. This paper contributes to the existing literature by identifying a research gap in transforming these currently manual data-collection tasks through digital technologies, leveraging real-time data collection using smart labels to identify the chemicals contained within products. The proposed design enables manufacturers to identify the chemicals consumed in an automated manner, enabling the associated risks to be identified and managed accordingly. The design can be further expanded into the proposed collaborative data-sharing network.
  • Realignment of Product Stewardship towards Chemical Regulations, the Circular Economy and Corporate Social Responsibility – a Delphi Study

    Liyanage, Kapila; Takhar, Sukhraj; University of Derby (Sepuluh Nopember Institute of Technology (ITS), 2021-07)
    Chemical regulations exist to limit and control the amount of hazardous chemical substances used by industry. Increasing awareness of diminishing natural resources, growing pollution and the need to reduce harmful waste has led to increasing societal and regulatory pressure on industry to move from traditional closed-loop manufacturing towards the adoption of sustainable materials and open-loop manufacturing systems as part of the Circular Economy. Corporate Social Responsibility (CSR) extends the relationship between industry and society. Product Stewardship (PS) provides a platform for organizations to assess impacts on manufacturing systems, ensuring adequate measures are in place to understand, control or limit any impacts of manufacturing and using products. The research question answered in this paper relates to understanding these impacts on PS. The paper is based on a literature review and a Delphi study, and its outcomes outline a framework for aligning PS with Chemical Regulations, the Circular Economy and CSR.
  • On Generalized Lucas Pseudoprimality of Level k

    Andrica, Dorin; Bagdasar, Ovidiu; Babeş-Bolyai University, 400084 Cluj-Napoca, Romania; University of Derby (MDPI AG, 2021-04-12)
    We investigate the Fibonacci pseudoprimes of level k, and we disprove a statement concerning the relationship between the sets of different levels, and also discuss a counterpart of this result for the Lucas pseudoprimes of level k. We then use some recent arithmetic properties of the generalized Lucas, and generalized Pell–Lucas sequences, to define some new types of pseudoprimes of levels k+ and k− and parameter a. For these novel pseudoprime sequences we investigate some basic properties and calculate numerous associated integer sequences which we have added to the Online Encyclopedia of Integer Sequences.
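    For orientation, the sketch below checks the classical (level-1) Fibonacci pseudoprime condition F_{n-(5/n)} ≡ 0 (mod n) for odd composite n, using fast-doubling Fibonacci arithmetic; the paper's level-k and generalized Lucas/Pell–Lucas notions refine this baseline, and their defining congruences are not reproduced here. sympy is assumed for primality testing and the Jacobi symbol.

```python
# Classical Fibonacci pseudoprime check (level-1 baseline; illustrative only).
from sympy import isprime, jacobi_symbol

def fib_mod(k, n):
    """F_k mod n by fast doubling."""
    def fd(k):
        if k == 0:
            return (0, 1)                        # (F_0, F_1)
        a, b = fd(k // 2)
        c = (a * ((2 * b - a) % n)) % n          # F_{2m}
        d = (a * a + b * b) % n                  # F_{2m+1}
        return (d, (c + d) % n) if k % 2 else (c, d)
    return fd(k)[0]

def is_fibonacci_pseudoprime(n):
    if n < 9 or n % 2 == 0 or n % 5 == 0 or isprime(n):
        return False
    return fib_mod(n - jacobi_symbol(5, n), n) == 0

print([n for n in range(9, 5000) if is_fibonacci_pseudoprime(n)])
# expected: [323, 377, 1891, 3827, 4181]
```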
  • On k-partitions of multisets with equal sums

    Andrica, Dorin; Bagdasar, Ovidiu; Babeş-Bolyai University of Cluj-Napoca, Cluj-Napoca, Romania; University of Derby (Springer Science and Business Media LLC, 2021-05-05)
    We study the number of ordered k-partitions of a multiset with equal sums, having elements α1,…,αn and multiplicities m1,…,mn. Denoting this number by Sk(α1,…,αn;m1,…,mn), we find the generating function, derive an integral formula, and illustrate the results by numerical examples. The special case involving the set {1,…,n} presents particular interest and leads to the new integer sequences Sk(n), Qk(n), and Rk(n), for which we provide explicit formulae and combinatorial interpretations. Conjectures in connection to some superelliptic Diophantine equations and an asymptotic formula are also discussed. The results extend previous work concerning 2- and 3-partitions of multisets.
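    A brute-force cross-check of the quantity studied above for the set {1,…,n}: the number of ordered k-partitions with equal block sums, obtained by trying every assignment of elements to blocks. This is only a validation aid for small n; the paper works with a generating function and an integral formula instead.

```python
# Brute-force count of ordered k-partitions of {1,...,n} with equal block sums.
from itertools import product

def S(k, n):
    total = n * (n + 1) // 2
    if total % k:
        return 0
    target = total // k
    count = 0
    for assignment in product(range(k), repeat=n):   # block index for each of 1..n
        sums = [0] * k
        for value, block in zip(range(1, n + 1), assignment):
            sums[block] += value
        count += all(s == target for s in sums)
    return count

print([S(2, n) for n in range(1, 9)])   # -> [0, 0, 2, 2, 0, 0, 8, 14]
```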
  • Mechanical Engineering Design, Does the Past Hold the key to the Future?

    Sole, Martin; Turner, Ian; Barber, Patrick; University of Derby (The Design Society, 2021)
    Industry design of a complex product has always required a cross-disciplinary team of experts. Is it possible to mimic these teams in academia when training the design engineers of the future, and what disciplinary skills will they possess? The exceptional collaboration potential provided by the internet means industry experts can work as a team while residing anywhere in the world. What are the capabilities of teamwork when the team members may never meet in person? Though a physical prototype is sometimes required, most prototypes are designed and created in the virtual world using 3D modelling. The model can be tested, checked for accuracy, have materials applied, and be created parametrically, which allows the product's geometry to be reset to different sizes by the designer. Collaboration, effective communication and 3D modelling make it possible to produce intricate and complex designs remotely. While we rightly congratulate ourselves on the complexity of modern design and how clever we have become, we must not lose sight of past achievements. Design has become more complex in this modern age, but it would be incorrect to say that complex design did not exist in times past. Before the internet, aircraft were built, global communication systems existed, and men went to the moon. What can we learn, if anything, by looking at the methods used to design complex products in the past? How can we apply what we learn from the past to the future?
  • Design Education - A Reversed Method to Fill an Information and Knowledge Gap Between Full-Time and Part-Time Students

    Sole, Martin; Barber, Patrick; Turner, Ian; University of Derby (The Design Society, 2021-08)
    Teachers in schools, tutors in colleges, and lecturers in universities are all required to have specific teaching qualifications. As part of the qualification, it is normal to study tried and tested pedagogical theories; examples include Bloom’s Taxonomy, Constructivism, and Experiential Learning. This paper identifies a gap in the information and knowledge required of student design engineers studying on a full-time course compared with part-time students. To redress this gap, it is suggested that no new theories are required, just a new method of applying an old one: Bloom’s Taxonomy in reverse, alongside reverse engineering. An example of applying this method to a class of design engineers in the final year of a BEng (Hons) Mechanical Engineering programme is provided.
  • Performance evaluation of machine learning techniques for fault diagnosis in vehicle fleet tracking modules

    Sepulevene, Luis; Drummond, Isabela; Kuehne, Bruno Tardiole; Frinhani, Rafael; Filho, Dionisio Leite; Peixoto, Maycon; Reiff-Marganiec, Stephan; Batista, Bruno; Federal University of Itajubá, Itajubá, Brazil; Federal University of Mato Grosso do Sul, Ponta Porã, Brazil; et al. (Oxford University Press, 2021-05-14)
    With Industry 4.0, data-based approaches are in vogue. However, extracting the essential features is not a trivial task and greatly influences the final result. There is also a need for specialized system knowledge to monitor the environment and diagnose faults. In this context, fault diagnosis is significant: in a vehicle fleet monitoring system, for example, it is possible to diagnose faults even before the customer is aware of them, minimizing the maintenance costs of the modules. In this paper, several models using Machine Learning (ML) techniques were applied and analyzed during the fault diagnosis process in vehicle fleet tracking modules. Two approaches were proposed, "With Knowledge" and "Without Knowledge", to explore the dataset using ML techniques to generate classifiers that can assist in the fault diagnosis process. The "With Knowledge" approach performs feature extraction manually, using the ML techniques Random Forest, Naive Bayes, Support Vector Machine (SVM) and Multi-Layer Perceptron (MLP); the "Without Knowledge" approach performs automatic feature extraction through a Convolutional Neural Network (CNN). The results showed that the proposed approaches are promising. The best models with manual feature extraction obtained a precision of 99.76% and 99.68% for fault detection and for fault detection and isolation, respectively, on the provided dataset. The best models performing automatic feature extraction obtained 88.43% and 54.98%, respectively, for fault detection and for fault detection and isolation.
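    A minimal sketch of the "With Knowledge" style of pipeline described above: hand-crafted telemetry features fed to a Random Forest classifier, with precision reported. The features, labels and thresholds are synthetic stand-ins, not the paper's fleet-tracking dataset, and scikit-learn is assumed.

```python
# Illustrative "With Knowledge" fault-detection pipeline (synthetic data, not the paper's).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
# toy manually-extracted features: mean battery voltage (V), GPS fix rate, message gap (s)
X = np.column_stack([rng.normal(12.5, 0.8, n),
                     rng.uniform(0.5, 1.0, n),
                     rng.exponential(30.0, n)])
y = ((X[:, 1] < 0.6) | (X[:, 2] > 90)).astype(int)        # synthetic "faulty module" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"fault-detection precision: {precision_score(y_te, clf.predict(X_te)):.4f}")
```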
