
K-Nearest Neighbor Algorithm (KNN):

College: College of Science for Women     Department: Computer Science     Stage: 4
Course instructor: Zainab Falah Hassan Al-Kim       09/04/2019 12:04:55
The k-Nearest Neighbor classifier (K-NN) is a more general version of the nearest neighbor technique: it bases the classification of an unknown sample on the votes of its k nearest neighbors rather than on its single nearest neighbor alone.
The K-Nearest Neighbor technique is a nonparametric supervised pattern classifier that is simple yet yields good classification accuracy. It classifies objects based on the closest training examples in the feature space. The k-nearest neighbor algorithm is among the simplest of all machine learning algorithms: an object is classified by a majority vote of its neighbors and assigned to the class to which the majority of its k nearest neighbors belong.
In general, the training examples are vectors in a feature space, each with a class label. The training phase of the K-NN algorithm consists only of storing the feature vectors and class labels of the training samples. In the classification phase, k is a user-defined constant, and an unlabelled vector (a query or test point) is classified by assigning it the label that is most frequent among the k training samples nearest to that query point.
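The store-then-vote procedure described above can be sketched as follows. This is a minimal illustration, not part of the lecture material; the function and variable names (knn_classify, train_X, train_y) and the toy two-class dataset are invented for the example, and Euclidean distance is assumed as the distance measure.

```python
import math
from collections import Counter

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(train_X, train_y, query, k=3):
    # "Training" is just storing the labelled feature vectors;
    # classification computes the distance from the query point
    # to every stored sample.
    dists = sorted(zip((euclidean(x, query) for x in train_X), train_y))
    # Majority vote among the k nearest neighbors.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D dataset with two classes, A and B.
train_X = [(1, 1), (1, 2), (2, 1), (6, 6), (7, 7), (6, 7)]
train_y = ["A", "A", "A", "B", "B", "B"]

print(knn_classify(train_X, train_y, (2, 2), k=3))      # -> A
print(knn_classify(train_X, train_y, (6.5, 6.5), k=3))  # -> B
```

A query near the cluster around (1, 1) gets label A, and one near (6, 6) gets label B, because the three nearest stored samples dominate the vote in each case.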


