A Study on Neural Network Pattern Recognition

Monisha Nagpal


Abstract

Among the various traditional approaches to pattern recognition, the statistical approach has been the most intensively studied and used in practice. More recently, neural network techniques have been receiving significant attention. The design of a recognition system requires careful attention to the following issues: definition of pattern classes, sensing environment, pattern representation, feature extraction and selection, cluster analysis, classifier design and learning, selection of training and test samples, and performance evaluation. In spite of almost 50 years of research and development in this field, the general problem of recognizing complex patterns with arbitrary orientation, location, and scale remains unsolved. New and emerging applications, such as data mining, web searching, retrieval of multimedia data, face recognition, and cursive handwriting recognition, require robust and efficient pattern recognition techniques. The objective of this synopsis paper is to summarize and compare some of the well-known methods used in the various stages of a neural-network-based pattern recognition system and to identify research topics and applications that are at the forefront of this exciting and challenging field.

Keywords: Pattern Recognition, correlation, Neural Network.

INTRODUCTION

A pattern is an entity, vaguely defined, that could be given a name, e.g.

Fingerprint image,

Handwritten word,

Human face,

Speech signal,

DNA sequence.

Pattern recognition is the study of how machines can observe the environment, learn to distinguish patterns of interest, and make sound and reasonable decisions about the categories of the patterns.

The term neural network was traditionally used to refer to a network or circuit of biological neurons. In modern usage, the term often refers to artificial neural networks, which are composed of artificial neurons or nodes; the term therefore has two distinct usages. To make neural networks easier to understand, we begin with the natural (biological) neural network, the brain, since an artificial neural network is essentially a mathematical model of the brain.

Pattern recognition techniques are concerned with the theory and algorithms for assigning abstract objects, e.g., measurements made on physical objects, to categories. Typically the categories are assumed to be known in advance, although there are techniques for learning the categories from the data themselves (clustering). Methods of pattern recognition are useful in many applications such as information retrieval, data mining, and document image analysis.
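A minimal sketch of learning categories by clustering, assuming scikit-learn and NumPy are available; the synthetic data, the choice of two clusters, and the parameter values are illustrative assumptions rather than anything specified in this paper.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical 2-D measurements made on physical objects.
rng = np.random.default_rng(0)
measurements = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(50, 2)),   # one natural grouping
    rng.normal(loc=3.0, scale=0.5, size=(50, 2)),   # another grouping
])

# Learn two categories without any labels (unsupervised clustering).
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(measurements)
print(kmeans.labels_[:10])       # cluster index assigned to each object
print(kmeans.cluster_centers_)   # representative prototype of each learned category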

Pattern recognition can also be performed using Bayes' theorem of probability, as described in the Probability section below. The recognition process is summarized by the stages described in the next section.

PATTERN RECOGNITION SYSTEM

Pattern recognition with neural networks relies on a pattern recognition system that extracts and classifies patterns. Its stages are described below, and a short code sketch after the list illustrates them.

Data acquisition and sensing:

Measurements of physical variables.

Important issues: bandwidth, resolution, sensitivity, distortion, SNR, latency, etc.

Pre-processing:

Removal of noise in data.

Isolation of patterns of interest from the background.

Feature extraction:

Finding a new representation in terms of features.

Model learning and estimation:

Learning a mapping between features and pattern groups and categories.

Classification:

Using features and learned models to assign a pattern to a category.

Post-processing:

Evaluation of confidence in decisions.

Exploitation of context to improve performance.
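A minimal end-to-end sketch of these stages, assuming scikit-learn is available; the handwritten-digit dataset, the small network size, and the split ratio are illustrative assumptions, not choices made in this paper.

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Data acquisition and sensing: 8x8 grey-level images of handwritten digits.
digits = load_digits()

# Feature extraction: each image is represented as a 64-dimensional pixel vector.
X, y = digits.data, digits.target

# Selection of training and test samples.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Pre-processing: normalise the features.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Model learning and estimation: a small feed-forward neural network.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# Classification and post-processing: assign categories and inspect confidence.
predictions = clf.predict(X_test)
confidence = clf.predict_proba(X_test).max(axis=1)
print("test accuracy:", clf.score(X_test, y_test))
print("lowest-confidence decision:", confidence.min())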

DESIGN CYCLE

The design cycle of a pattern recognition system describes the sequence of steps through which the data to be recognized flows, from collection to evaluation.

Data collection: the process of gathering training and testing data. The goal is an adequately large and representative set of samples covering the data to be recognized.

Feature selection: the next step in the design cycle is to select and extract features, taking into account domain dependence and prior information, computational cost and feasibility, discriminative power (similar values for similar patterns, different values for different patterns), invariance with respect to translation, rotation, and scale, and robustness with respect to occlusion, distortion, deformation, and variations in the environment.

Model selection: the next step is the selection of a model, which takes into account domain dependence and prior information, the definition of design criteria, parametric versus non-parametric models, the handling of missing features, computational complexity, and the type of model: template-based, decision-theoretic or statistical, syntactic or structural, neural, or hybrid.

Training: the next phase, in which the classification rule is learned from data. Learning may be supervised, where a teacher provides a category label or cost for each pattern in the training set; unsupervised, where the system forms clusters or natural groupings of the input patterns; or reinforcement learning, where no desired category is given but the teacher provides feedback to the system, such as whether a decision was right or wrong.

Evaluation: the last phase, which measures performance on the training samples and estimates how performance will carry over to future data. This phase also addresses the problems of overfitting and generalization; a short sketch after this list illustrates the evaluation step.
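A minimal sketch of the training and evaluation phases, again assuming scikit-learn; comparing training and test accuracy is one simple way to check for overfitting, and all dataset and model choices here are illustrative assumptions.

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=1)

clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=1000, random_state=1)
clf.fit(X_train, y_train)

train_acc = clf.score(X_train, y_train)   # performance with training samples
test_acc = clf.score(X_test, y_test)      # estimated performance on future data
print(f"training accuracy: {train_acc:.3f}, test accuracy: {test_acc:.3f}")
# A large gap between the two figures suggests overfitting and poor generalization.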

PROBABILITY

Probabilities are numbers assigned to events that indicate how likely it is that the event will occur when a random experiment is performed.

Conditional Probability

If A and B are two events, the probability of event A when we already know that event B has occurred, written P[A|B], is defined by the relation P[A|B] = P[A ∩ B] / P[B], provided that P[B] > 0.

P[A|B] is read as the “conditional probability of A conditioned on B”, or simply the “probability of A given B”.
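As a brief hypothetical illustration (the numbers are not from the paper): if P[A ∩ B] = 0.2 and P[B] = 0.5, then P[A|B] = 0.2 / 0.5 = 0.4, i.e. once B is known to have occurred, A occurs with probability 0.4.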

Bayes Theorem

Bayes' theorem is the fundamental relationship in statistical pattern recognition, and it is obtained as follows.

Given B1, B2, …, BN, a partition of the sample space S, suppose that event A occurs; what is the probability of event Bj?

Using the definition of conditional probability and the theorem of total probability, we obtain P[Bj|A] = P[A|Bj] P[Bj] / ( P[A|B1] P[B1] + … + P[A|BN] P[BN] ).

For pattern recognition, Bayes' theorem can be expressed as P(ωi|x) = P(x|ωi) P(ωi) / P(x), where ωi is the i-th class and x is the feature vector.

Each term in Bayes' theorem has a special name:

P(ωi): prior probability (of class ωi)

P(ωi|x): posterior probability (of class ωi given the observation x)

P(x|ωi): likelihood (conditional probability of x given class ωi)

P(x): a normalization constant that does not affect the decision
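A minimal numerical sketch of the resulting Bayes decision rule, using NumPy; the two classes and their assumed priors and likelihoods are hypothetical values chosen only to illustrate the computation.

import numpy as np

priors = np.array([0.7, 0.3])          # P(w1), P(w2)
likelihoods = np.array([0.2, 0.6])     # P(x|w1), P(x|w2) for an observed x

evidence = np.sum(likelihoods * priors)        # P(x), the normalization constant
posteriors = likelihoods * priors / evidence   # P(wi|x) by Bayes' theorem

print("posteriors:", posteriors)               # here [0.4375, 0.5625]
decision = np.argmax(posteriors)               # choose the most probable class
print("decide class:", decision + 1)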

CONCLUSION

Pattern recognition is a human activity that we try to imitate by mechanical means. There are no physical laws that assign observations to classes; it is the human consciousness that groups observations together. Although the connections and inter-relations are often hidden, some understanding might be gained by attempting to imitate this process. The human process of learning patterns from examples may follow the lines of trial and error. It has, however, to be strongly doubted whether statistics play an important role in this process. Estimating probabilities, especially in multivariate situations, is not very intuitive for the majority of people. Moreover, the number of examples needed to build a reliable classifier by statistical means is much larger than what is available for human learning. In human recognition, proximities based on relations between objects seem to come before features are searched and may, thereby, be more fundamental.

For this reason and the above observation, we think that the study of dissimilarities, distances, and domain-based classifiers is of great interest. This is further encouraged by the fact that such representations offer a bridge between the possibilities of learning in vector spaces and the structural descriptions of objects that preserve relations between objects' inherent structures. We think that the use of dissimilarities for representation, generalization, and evaluation constitutes the most intriguing issue in pattern recognition.
