NN

class frlearn.classifiers.NN(k: int = at_most(5), dissimilarity: str = 'boscovich', distance_weighted: bool = False, nn_search: NeighbourSearchMethod = KDTree(), preprocessors=(RangeNormaliser(), ))

Implementation of Nearest Neighbour (NN) classification.

Parameters:
k: int or (int -> float) or None = at_most(5)

Number of neighbours to consider. Should be a positive integer, a function that takes the dataset size n and returns a float, or None, which is resolved as n. All such values are rounded to the nearest integer in [1, n].
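The following is a minimal sketch of how a k value such as the default at_most(5) could be resolved against the dataset size, based only on the description above; `at_most` and `resolve_k` here are illustrative reimplementations, not the library's actual code.

```python
def at_most(m):
    # Returns a function of the dataset size n that caps k at m.
    def _f(n):
        return min(m, n)
    return _f

def resolve_k(k, n):
    # Resolve k per the documented rules: None means n, a callable
    # is applied to n, and the result is rounded into [1, n].
    if k is None:
        k = n
    elif callable(k):
        k = k(n)
    return max(1, min(n, round(k)))

print(resolve_k(at_most(5), 3))    # 3: capped by the dataset size
print(resolve_k(at_most(5), 100))  # 5: capped by m
print(resolve_k(None, 10))         # 10: None resolves to n
```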

dissimilarity: str or float or (np.array -> float) or ((np.array, np.array) -> float) = 'boscovich'

The dissimilarity measure to use. The similarity between two instances is calculated as 1 minus their dissimilarity.

A vector size measure np.array -> float induces a dissimilarity measure through application to y - x. A float is interpreted as Minkowski size with the corresponding value for p. For convenience, a number of popular measures can be referred to by name.
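To illustrate how a vector size measure induces a dissimilarity through application to y - x: the sketch below defines a Minkowski size for a given p and applies it to a difference vector. The default 'boscovich' measure corresponds to the city-block (Minkowski p = 1) size; this is an illustrative reimplementation, not frlearn's internal code.

```python
import numpy as np

def minkowski_size(p):
    # Vector size measure: the Minkowski (L_p) norm of a vector.
    def size(z):
        return np.sum(np.abs(z) ** p) ** (1 / p)
    return size

# 'boscovich' is the city-block size, i.e. Minkowski with p = 1.
boscovich = minkowski_size(1)

x = np.array([0.0, 0.0])
y = np.array([3.0, 4.0])
print(boscovich(y - x))          # 7.0 (city-block)
print(minkowski_size(2)(y - x))  # 5.0 (Euclidean)
```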

distance_weighted: boolean = False

If True, each neighbour's vote is weighted by the reciprocal of its distance. If False, all neighbours count equally (unweighted NN).
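A hedged sketch of what reciprocal distance weighting means for the class vote, assuming each of the k neighbours contributes weight 1/d (with a small floor to avoid division by zero); the helper `vote` is hypothetical, not part of frlearn's API.

```python
import numpy as np

def vote(distances, labels, classes, weighted):
    # Per-neighbour weights: 1/d if distance-weighted, else 1 each.
    if weighted:
        w = 1.0 / np.maximum(distances, 1e-12)
    else:
        w = np.ones_like(distances)
    # Sum the weights of the neighbours belonging to each class.
    return np.array([w[labels == c].sum() for c in classes])

d = np.array([0.5, 1.0, 2.0])   # distances to the 3 nearest neighbours
lab = np.array([0, 1, 1])       # their class labels
print(vote(d, lab, [0, 1], weighted=False))  # [1. 2.]: class 1 wins by count
print(vote(d, lab, [0, 1], weighted=True))   # [2. 1.5]: the close class-0 neighbour wins
```

Note how weighting can flip the outcome: the single class-0 neighbour at distance 0.5 outweighs the two more distant class-1 neighbours.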

nn_search: NeighbourSearchMethod = KDTree()

Nearest neighbour search algorithm to use.

preprocessors: iterable = (RangeNormaliser(), )

Preprocessors to apply. The default range normaliser ensures that all features have range 1.
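Putting the pieces together, here is a self-contained numpy sketch of the documented pipeline: range normalisation so every feature has range 1, boscovich (city-block) distances, and an unweighted majority vote among the k nearest neighbours. This is an illustration of the described behaviour, not frlearn's implementation, and in practice you would use the NN class itself.

```python
import numpy as np

def nn_classify(X_train, y_train, X_test, k=3):
    # Range normalisation: rescale each feature to range 1,
    # using the training data's per-feature minima and maxima.
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    scale = np.where(hi > lo, hi - lo, 1.0)
    A = (X_train - lo) / scale
    B = (X_test - lo) / scale
    preds = []
    for b in B:
        # City-block (boscovich) distances to all training instances.
        d = np.abs(A - b).sum(axis=1)
        nearest = np.argsort(d)[:k]
        # Unweighted majority vote among the k nearest neighbours.
        vals, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

X = np.array([[0.0, 0], [0.1, 0], [0.2, 0], [5.0, 1], [5.1, 1], [5.2, 1]])
y = np.array([0, 0, 0, 1, 1, 1])
print(nn_classify(X, y, np.array([[0.05, 0.0], [5.05, 1.0]])))  # [0 1]
```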


class Model

Examples using frlearn.classifiers.NN

Multiclass classification with NN