Incorporating Fuzzy Membership Functions into the Perceptron Algorithm. Keller, J. M.; and Hunt, D. J. Technical Report
The perceptron algorithm, one of the class of gradient descent techniques, has been widely used in pattern recognition to determine linear decision boundaries. While this algorithm is guaranteed to converge to a separating hyperplane if the data are linearly separable, it exhibits erratic behavior if the data are not linearly separable. Fuzzy set theory is introduced into the perceptron algorithm to produce a "fuzzy algorithm" which ameliorates the convergence problem in the nonseparable case. It is shown that the fuzzy perceptron, like its crisp counterpart, converges in the separable case. A method of generating membership functions is developed, and experimental results comparing the crisp to the fuzzy perceptron are presented. Index Terms—Fuzzy sets, fuzzy 2-means, gradient descent, induced fuzzy membership, iterative training, perceptron algorithm, separating hyperplane. I. INTRODUCTION. There are many cases in pattern classifier design where a linear decision boundary between two sets of sample vectors is desired. One of the common approaches to this problem is to use the perceptron algorithm originated by Rosenblatt [1] as a model of machine learning. This algorithm is one of a class of gradient-descent techniques which play an important role in pattern recognition theory [2]. The classical perceptron technique is an iterative training algorithm which, given two classes of patterns (vectors in Euclidean space), attempts to determine a linear decision boundary separating the two classes. If the two sets of vectors are, in fact, linearly separable, the perceptron algorithm is guaranteed to find a separating hyperplane in a finite number of steps [2]. However, if the two sets of vectors are not linearly separable, not only will the perceptron algorithm not find a separating hyperplane (since one does not exist), but there is no method for knowing when to terminate the algorithm to obtain an optimal or even a good decision boundary.
Depending on the values of the sample vectors, the behavior of the perceptron algorithm can be very erratic in the nonseparable case. It is this problem with the nonseparable case that we address by incorporating fuzzy set theory into the perceptron algorithm.
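The classical perceptron training loop the abstract refers to can be sketched as follows. This is a generic illustration of Rosenblatt's crisp algorithm, not the paper's fuzzy variant; the sample data and learning rate are invented for the example:

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, max_epochs=100):
    """Classical perceptron for labels y in {-1, +1}.

    Returns an augmented weight vector w such that sign(w . [x, 1])
    predicts the class. Guaranteed to converge only when the two
    classes are linearly separable; otherwise it stops at max_epochs
    with whatever boundary it last had (the erratic nonseparable case).
    """
    Xa = np.hstack([X, np.ones((len(X), 1))])  # augment with a bias component
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += lr * yi * xi         # nudge the hyperplane toward xi
                errors += 1
        if errors == 0:                   # separating hyperplane found
            break
    return w

# Two linearly separable clusters (illustrative data)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
```

The fuzzy variant developed in the paper modifies this update so that each sample's influence is weighted by its class membership value, damping the oscillations that atypical (low-membership) vectors cause in the nonseparable case.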
@techreport{keller_hunt_fuzzy_perceptron,
 title = {Incorporating Fuzzy Membership Functions into the Perceptron Algorithm},
 type = {techreport},
 id = {c44a8694-2477-3d59-a006-598223709b7e},
 created = {2019-02-19T15:17:43.685Z},
 file_attached = {true},
 profile_id = {26832b0b-3ab6-37d8-9079-c712ce3c7c66},
 group_id = {9fcff044-d9db-3ece-9111-31d3cb8df27d},
 last_modified = {2019-02-19T15:17:44.567Z},
 read = {false},
 starred = {false},
 authored = {false},
 confirmed = {false},
 hidden = {false},
 private_publication = {false},
 abstract = {The perceptron algorithm, one of the class of gradient descent techniques, has been widely used in pattern recognition to determine linear decision boundaries. While this algorithm is guaranteed to converge to a separating hyperplane if the data are linearly separable, it exhibits erratic behavior if the data are not linearly separable. Fuzzy set theory is introduced into the perceptron algorithm to produce a "fuzzy algorithm" which ameliorates the convergence problem in the nonseparable case. It is shown that the fuzzy perceptron, like its crisp counterpart, converges in the separable case. A method of generating membership functions is developed, and experimental results comparing the crisp to the fuzzy perceptron are presented. Index Terms—Fuzzy sets, fuzzy 2-means, gradient descent, induced fuzzy membership, iterative training, perceptron algorithm, separating hyperplane. I. INTRODUCTION. There are many cases in pattern classifier design where a linear decision boundary between two sets of sample vectors is desired. One of the common approaches to this problem is to use the perceptron algorithm originated by Rosenblatt [1] as a model of machine learning. This algorithm is one of a class of gradient-descent techniques which play an important role in pattern recognition theory [2]. The classical perceptron technique is an iterative training algorithm which, given two classes of patterns (vectors in Euclidean space), attempts to determine a linear decision boundary separating the two classes. If the two sets of vectors are, in fact, linearly separable, the perceptron algorithm is guaranteed to find a separating hyperplane in a finite number of steps [2]. However, if the two sets of vectors are not linearly separable, not only will the perceptron algorithm not find a separating hyperplane (since one does not exist), but there is no method for knowing when to terminate the algorithm to obtain an optimal or even a good decision boundary. Depending on the values of the sample vectors, the behavior of the perceptron algorithm can be very erratic in the nonseparable case. It is this problem with the nonseparable case that we address by incorporating fuzzy set theory into the perceptron algorithm.},
 bibtype = {techreport},
 author = {Keller, James M and Hunt, Douglas J}
}