Introduction to Neural Networks for Signal Processing
Published in Yu Hen Hu, Jenq-Neng Hwang, Handbook of Neural Network Signal Processing, 2018
This handbook is organized into three complementary parts: neural network fundamentals, neural network solutions to statistical signal processing problems, and signal processing applications using neural networks. The first part presents in-depth surveys of recent progress in neural network computing paradigms and consists of five chapters.

Chapter 1: Introduction to Neural Networks for Signal Processing. This chapter provides an overview of the topics discussed in this handbook so that the reader is better prepared for the in-depth discussion in later chapters.

Chapter 2: Signal Processing Using the Multilayer Perceptron. In this chapter, Manry, Chandrasekaran, and Hsieh discuss training strategies for the multilayer perceptron and methods to estimate testing error from the training error. A potential application of the MLP to flight load synthesis is also presented.

Chapter 3: Radial Basis Functions. In this chapter, Back presents a complete review of the theory, algorithms, and five real-world applications of radial basis function networks: time series modeling, option pricing in the financial market, phoneme classification, channel equalization, and symbolic signal processing.

Chapter 4: An Introduction to Kernel-Based Learning Algorithms. In this chapter, Müller, Mika, Rätsch, Tsuda, and Schölkopf introduce three important kernel-based learning algorithms: the support vector machine, kernel Fisher discriminant analysis, and kernel PCA. In addition to clear theoretical derivations, two impressive signal processing applications, optical character recognition and DNA sequence analysis, are presented.

Chapter 5: Committee Machines. Tresp gives three convincing arguments in this chapter as to why a committee machine is important: (a) performance enhancement using averaging, bagging, and boosting; (b) modularity with a mixture of expert networks; and (c) computational complexity reduction, as illustrated with the introduction of a Bayesian committee machine.
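The performance-enhancement argument for committee machines, point (a) above, can be illustrated with a minimal NumPy sketch. This is not code from the handbook: the target function, the polynomial base learner, the number of committee members, and all variable names are illustrative choices. It shows that bagging (training each member on a bootstrap resample and averaging their predictions) cannot increase, and typically reduces, the mean squared error relative to the average member's error.

```python
import numpy as np

rng = np.random.default_rng(1)

# Smooth target function with noisy training samples
def target(x):
    return np.sin(2 * np.pi * x)

x_train = rng.uniform(0, 1, 30)
y_train = target(x_train) + rng.normal(0, 0.3, 30)
x_test = np.linspace(0, 1, 200)
y_test = target(x_test)

def fit_poly(x, y, degree=7):
    # A high-degree polynomial fit: a deliberately high-variance base learner
    return np.polyfit(x, y, degree)

# Bagging: each committee member is trained on a bootstrap resample
members = []
for _ in range(25):
    idx = rng.integers(0, len(x_train), len(x_train))
    members.append(fit_poly(x_train[idx], y_train[idx]))

# Committee prediction = average of member predictions
preds = np.array([np.polyval(c, x_test) for c in members])
ensemble_mse = np.mean((preds.mean(axis=0) - y_test) ** 2)
individual_mse = np.mean((preds - y_test) ** 2)  # averaged over members
```

By Jensen's inequality, the squared error of the averaged prediction is at each test point no larger than the average of the members' squared errors, so `ensemble_mse <= individual_mse` holds by construction; the practical gain comes from averaging away the variance of the unstable base learners.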
Integrating Feature Extractors for the Estimation of Human Facial Age
Published in Applied Artificial Intelligence, 2019
The kernel Fisher discriminant analysis (KFA) method (Liu 2006) is an extension of Fisher discriminant analysis (FDA). In this approach, the input space is first expanded by a nonlinear mapping, and multiclass FDA is then applied in the resulting feature space. The nonlinear mapping increases the dimensionality of the feature space and thereby improves the discriminative ability of the KFA method. The main advantages of KFA are that it can be applied to multiclass pattern classification problems and that its solution is unique, which distinguishes it from the Generalized Discriminant Analysis (GDA) method (Liu 2006), which produces multiple solutions.
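The two steps described above (a nonlinear kernel mapping followed by Fisher discriminant analysis in the induced feature space) can be sketched in NumPy for the two-class case. This is a simplified illustration, not the multiclass formulation of Liu (2006): the RBF kernel, the `gamma` and `reg` parameters, the toy data, and all function names are assumptions made for the example. The discriminant direction is found in dual form as the leading eigenvector of the within-class scatter inverse times the between-class scatter.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel: the implicit nonlinear mapping
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def kernel_fda_2class(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant (illustrative sketch).
    Returns the dual coefficients alpha and the training kernel matrix."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    # Class means expressed through kernel evaluations
    m0 = K[:, idx0].mean(axis=1)
    m1 = K[:, idx1].mean(axis=1)
    # Between-class scatter in the kernel-induced feature space
    M = np.outer(m1 - m0, m1 - m0)
    # Within-class scatter (dual form)
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        nc = len(idx)
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    N += reg * np.eye(n)  # regularization for numerical stability
    # Leading eigenvector of N^{-1} M gives the discriminant direction
    evals, evecs = np.linalg.eig(np.linalg.solve(N, M))
    alpha = np.real(evecs[:, np.argmax(np.real(evals))])
    return alpha, K

# Toy data: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(2, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha, K = kernel_fda_2class(X, y)
proj = K @ alpha  # one-dimensional projections of the training points
```

After projection, the two classes can be separated by a simple threshold on `proj`; the multiclass version replaces the rank-one between-class scatter with a sum over all class means.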