Browsing by Author "McGinnity, T.M."
Now showing 1 - 3 of 3
Item (Open Access): AdaBoost-CNN: An Adaptive Boosting algorithm for Convolutional Neural Networks to classify Multi-Class Imbalanced datasets using Transfer Learning (Elsevier, 2020-05-12)
Taherkhani, Aboozar; Cosma, Georgina; McGinnity, T.M.

Ensemble models achieve high accuracy by combining a number of base estimators and can increase the reliability of machine learning compared to a single estimator. Additionally, an ensemble model enables a machine learning method to deal with imbalanced data, which is considered one of the most challenging problems in machine learning. In this paper, the capability of Adaptive Boosting (AdaBoost) is integrated with a Convolutional Neural Network (CNN) to design a new machine learning method, AdaBoost-CNN, which can deal with large imbalanced datasets with high accuracy. AdaBoost is an ensemble method in which a sequence of classifiers is trained. In AdaBoost, each training sample is assigned a weight, and a higher weight is assigned to a training sample that has not been classified accurately by the previous classifier. The proposed AdaBoost-CNN is designed to reduce the computational cost of classical AdaBoost when dealing with large sets of training data, by reducing the required number of learning epochs for its constituent estimators. AdaBoost-CNN applies transfer learning to sequentially transfer the trained knowledge of one CNN estimator to the next, while updating the weights of the samples in the training set, to improve accuracy and reduce training time. Experimental results revealed that the proposed AdaBoost-CNN achieved 16.98% higher accuracy than the classical AdaBoost method on a synthetic imbalanced dataset. Additionally, AdaBoost-CNN reached an accuracy of 94.08% on 10,000 testing samples of the synthetic imbalanced dataset, higher than the 92.05% accuracy of the baseline CNN method.
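The boosting step described above, in which samples the previous estimator misclassified receive a higher weight, can be sketched with the standard multi-class (SAMME-style) weight update. This is an illustrative reconstruction rather than the paper's code: the transfer-learning step between successive CNN estimators is omitted, and the function and variable names are hypothetical.

```python
import numpy as np

def adaboost_weight_update(sample_weights, y_true, y_pred, n_classes):
    """One round of the SAMME-style multi-class AdaBoost weight update:
    misclassified samples receive a higher weight, so the next estimator
    focuses on them.  (Sketch only; AdaBoost-CNN additionally transfers
    the trained CNN weights to the next estimator, which is omitted.)"""
    incorrect = (y_true != y_pred).astype(float)
    # Weighted error of the current estimator.
    err = np.average(incorrect, weights=sample_weights)
    # Estimator weight (alpha); log(K - 1) is the multi-class extension.
    alpha = np.log((1.0 - err) / max(err, 1e-12)) + np.log(n_classes - 1)
    # Boost the weights of misclassified samples, then renormalise.
    new_w = sample_weights * np.exp(alpha * incorrect)
    return new_w / new_w.sum(), alpha

# Toy example: 4 samples, one misclassified -> its weight grows.
w = np.full(4, 0.25)
y_true = np.array([0, 1, 2, 1])
y_pred = np.array([0, 1, 2, 0])          # last sample wrong
w_new, alpha = adaboost_weight_update(w, y_true, y_pred, n_classes=3)
```

After renormalisation the misclassified sample's weight grows from 0.25 to 2/3, so the next estimator in the sequence concentrates on it.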
AdaBoost-CNN is also computationally efficient: its training simulation time on the imbalanced dataset is 47.33 seconds, compared with 225.83 seconds for a similar AdaBoost method without transfer learning. Moreover, when compared to the baseline CNN, AdaBoost-CNN achieved higher accuracy on five other benchmark datasets, including CIFAR-10 and Fashion-MNIST. AdaBoost-CNN was also applied to the EMNIST datasets to determine its impact on large imbalanced classes, and the results demonstrate the superiority of the proposed method over CNN.

Item (Open Access): A Deep Convolutional Neural Network for Time Series Classification with Intermediate Targets (Springer, 2023-10-28)
Taherkhani, Aboozar; Cosma, Georgina; McGinnity, T.M.

Deep Convolutional Neural Networks (CNNs) have been successfully used in different applications, including image recognition. Time series data, which are generated in many applications such as tasks using sensor data, have different characteristics from image data, and accordingly specific CNN structures are needed for their processing. This paper proposes a new CNN for classifying time series data. Instead of a single output, new intermediate outputs are extracted from different hidden layers to control weight adjustment in those layers during training. Intermediate targets, which are distinct from the main target, act as labels for the intermediate outputs to improve the performance of the method. Additionally, the proposed method artificially increases the number of training instances using the original training samples and the intermediate targets. The proposed approach converts a classification task with the original training samples into a new (but equivalent) classification task that contains two classes with a high number of training instances.
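A minimal sketch of the instance-expansion idea described above, assuming (for illustration only) that the two classes correspond to same-class versus different-class pairs of time series; the paper's exact pairing and intermediate-target construction may differ:

```python
import numpy as np
from itertools import combinations

def make_pairwise_instances(X, y):
    """Hedged sketch of instance expansion: pair up the original time
    series and give each pair a binary label (same class vs. different
    class), turning an N-sample multi-class task into a two-class task
    with O(N^2) instances.  Names and pairing scheme are assumptions."""
    pairs, labels = [], []
    for i, j in combinations(range(len(X)), 2):
        pairs.append(np.stack([X[i], X[j]]))   # one pair = one instance
        labels.append(int(y[i] == y[j]))       # 1 = same class, 0 = different
    return np.array(pairs), np.array(labels)

# 4 series of length 5 -> 6 pairwise two-class instances.
X = np.random.rand(4, 5)
y = np.array([0, 0, 1, 2])
X_pairs, y_pairs = make_pairwise_instances(X, y)
```

Note how 4 original samples already yield 6 binary instances; the quadratic growth is what provides the "high number of training instances" for the equivalent two-class task.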
The proposed CNN for Time Series classification, called CNN-TS, extracts features based on the distance between two time series. CNN-TS was evaluated on various benchmark time series datasets. It achieved 3.5% higher overall accuracy than the CNN base method (without an intermediate layer), and 21.1% higher average accuracy than classical machine learning methods, i.e. linear SVM, RBF SVM, and RF. Moreover, CNN-TS was on average 8.43 times faster to train than the ResNet method.

Item (Open Access): A review of learning in biologically plausible spiking neural networks (Elsevier, 2019-10-11)
Taherkhani, Aboozar; Belatreche, Ammar; Li, Yuhua; Cosma, Georgina; Maguire, Liam P.; McGinnity, T.M.

Artificial neural networks have been used as a powerful processing tool in various areas such as pattern recognition, control, robotics, and bioinformatics. Their wide applicability has encouraged researchers to improve artificial neural networks by investigating the biological brain. Neurological research has progressed significantly in recent years and continues to reveal new characteristics of biological neurons. New technologies can now capture temporal changes in the internal activity of the brain in more detail and help clarify the relationship between brain activity and the perception of a given stimulus. This new knowledge has led to a new type of artificial neural network, the Spiking Neural Network (SNN), which draws more faithfully on biological properties to provide higher processing abilities. This paper presents a review of recent developments in the learning of spiking neurons. First, the biological background of SNN learning algorithms is reviewed. The important elements of a learning algorithm, such as the neuron model, synaptic plasticity, information encoding, and SNN topologies, are then presented.
Then, a critical review of the state-of-the-art learning algorithms for SNNs using single and multiple spikes is presented. Additionally, deep spiking neural networks are reviewed, and challenges and opportunities in the SNN field are discussed.
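As a concrete example of the neuron models that underpin such SNN learning algorithms, here is a minimal leaky integrate-and-fire (LIF) simulation, one of the simplest spiking neuron models; the parameter values are arbitrary and purely illustrative:

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0):
    """Minimal leaky integrate-and-fire (LIF) neuron.  The membrane
    potential integrates the input with a leak toward the resting
    potential; crossing the threshold emits a spike and resets the
    potential.  All parameter values here are illustrative defaults."""
    v = v_rest
    spikes = []
    for I in input_current:
        # Euler step of dv/dt = (-(v - v_rest) + I) / tau
        v += dt * (-(v - v_rest) + I) / tau
        if v >= v_thresh:
            spikes.append(1)   # spike emitted
            v = v_reset        # potential reset after spiking
        else:
            spikes.append(0)
    return np.array(spikes)

# Constant supra-threshold input produces a regular spike train.
spike_train = simulate_lif(np.full(200, 2.0))
```

With a constant input above threshold, the neuron fires periodically; the spike timing, rather than a continuous activation value, carries the information, which is the key difference between SNNs and conventional artificial neural networks.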