Abstract
This thesis deals with the study and development of classification
models based on Probabilistic Neural Networks (PNNs). The proposed
models were developed by incorporating statistical methods, as well as
methods from several fields of Computational Intelligence (CI), into PNNs. The
presentation of the subjects and results of the dissertation is organized as
follows:
In Chapter 1, the required theoretical elements of statistical decision theory
for classification tasks are presented. Moreover, a summary of the most common
decision rules and discriminant functions is provided.
Chapter 2 is devoted to the presentation of the concepts that constitute CI. Special
attention is given to the optimization methods of CI, in particular Particle Swarm
Optimization (PSO) and Differential Evolution Algorithms (DEA). Furthermore,
Artificial Neural Networks are briefly presented, and a thorough presentation of
PNNs is provided regarding their structure, operation, usefulness and
various applications. Several known variants of PNNs are also exhibited.
Chapter 3 provides a brief description of the typical resampling methods that
are necessary for machine learning classification problems. Moreover, the required
methodology for statistically comparing classification algorithms on one
or several application tasks is presented.
In Chapter 4, a novel class of classification models comprising variants of
PNNs is proposed. In particular, evolutionary optimization algorithms are incorporated
into PNNs to search for promising values of the spread parameters of
their kernel functions. For this purpose, PSO and DEA are employed, and the new
models are named Evolutionary PNNs (EPNNs).
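The idea of evolving the kernel spread can be illustrated with a minimal sketch: a Parzen-window PNN whose single spread parameter sigma is tuned by a toy evolutionary search over validation accuracy. This is only a simplified illustration under assumed settings (one global sigma, Gaussian kernels, a basic mutate-around-the-best loop), not the dissertation's PSO/DEA procedure.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma):
    # Parzen-window PNN: average a Gaussian kernel over each class's
    # training patterns and predict the class with the largest average.
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)
        k = np.exp(-d2 / (2.0 * sigma ** 2))
        scores = [k[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

def evolve_sigma(X_tr, y_tr, X_val, y_val, pop=8, gens=15, seed=0):
    # Toy evolutionary search over sigma: keep the best candidate
    # (elitism) and mutate around it; fitness is validation accuracy.
    rng = np.random.default_rng(seed)
    sigmas = rng.uniform(0.05, 2.0, pop)
    def fitness(s):
        return np.mean(pnn_predict(X_tr, y_tr, X_val, sigma=s) == y_val)
    for _ in range(gens):
        fits = np.array([fitness(s) for s in sigmas])
        best = sigmas[np.argmax(fits)]
        sigmas = np.clip(best + rng.normal(0.0, 0.1, pop), 1e-3, None)
        sigmas[0] = best  # elitism: never lose the best sigma found
    return best
```

A full EPNN would optimize a matrix of spread parameters with PSO or DEA rather than a single scalar, but the fitness-driven loop has the same shape.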
In the next chapter, a set of improvements to EPNNs is proposed regarding
their performance and required training time. Using unsupervised clustering methods,
a new Improved EPNN (IEPNN) is constructed that requires much shorter
training time. To further improve EPNN performance, the bagging technique
is also employed. Moreover, a separate spread-parameter matrix of the PNN
kernels is used for each class of the available data.
In Chapter 6, a brief summary of the fundamental concepts of Bayesian analysis
is provided. Afterwards, a Bayesian model is proposed for the estimation
of the PNN spread parameters, where the estimation is achieved via a Gibbs sampler.
This model is incorporated into PNNs and EPNNs, yielding a new
class of models named Bayesian PNNs (BPNNs). Moreover, we study the
use of the Epanechnikov kernel function in addition to the normal kernel.
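The two kernel choices mentioned above differ mainly in their support. A minimal sketch of both univariate kernels (standard textbook forms, not code from the thesis):

```python
import numpy as np

def gaussian_kernel(u):
    # Standard normal kernel, the classical choice in PNNs:
    # positive everywhere, so every training pattern contributes.
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def epanechnikov_kernel(u):
    # Epanechnikov kernel: zero outside |u| <= 1, so patterns farther
    # than one bandwidth contribute nothing to the class density.
    return np.where(np.abs(u) <= 1, 0.75 * (1.0 - u ** 2), 0.0)
```

The compact support of the Epanechnikov kernel means distant patterns can be skipped entirely, which is one practical motivation for considering it alongside the Gaussian.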
In the first part of Chapter 7, a short review of the theory of Fuzzy Sets is
provided. A fuzzy membership function is employed to further improve
EPNN performance in binary classification tasks, and the proposed model
is named Fuzzy EPNN (FEPNN). Furthermore, we propose a new decomposition
algorithm that converts multi-class classification problems into multiple binary
classification ones. Using this algorithm, FEPNNs can also be applied to multi-class
classification problems.
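For intuition, the simplest such decomposition is the standard one-vs-rest scheme, sketched below. This is only the generic textbook construction; the decomposition algorithm proposed in Chapter 7 is the thesis's own and is not reproduced here.

```python
import numpy as np

def one_vs_rest_decompose(y):
    # Generic one-vs-rest decomposition: one binary problem per class,
    # labelling that class 1 and every other class 0.
    return {c: (y == c).astype(int) for c in np.unique(y)}
```

Each resulting binary problem can then be handled by a binary classifier such as an FEPNN, and the per-class outputs combined into a multi-class decision.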
This dissertation concludes with Chapter 8 and Appendix A. In the final
chapter, a comparison among all the novel models takes place. Moreover, the
proposed models are compared to the model with the best performance reported
for each classification problem. In Appendix A, we provide a short
description of all the classification problems used in this thesis for the
evaluation of the proposed models.