NN
- class frlearn.classifiers.NN(k: int = at_most(5), dissimilarity: str = 'boscovich', distance_weighted: bool = False, nn_search: NeighbourSearchMethod = KDTree(), preprocessors=(RangeNormaliser(), ))
Implementation of Nearest Neighbour (NN) classification.
- Parameters:
- k: int or (int -> float) or None = at_most(5)
Number of neighbours to consider. Should be either a positive integer, or a function that takes the dataset size
n and returns a float, or None, which is resolved as n. All such values are rounded to the nearest integer in [1, n].
- dissimilarity: str or float or (np.array -> float) or ((np.array, np.array) -> float) = 'boscovich'
The dissimilarity measure to use. The similarity between two instances is calculated as 1 minus their dissimilarity.
A vector size measure np.array -> float induces a dissimilarity measure through application to y - x. A float is interpreted as Minkowski size with the corresponding value for p. For convenience, a number of popular measures can be referred to by name.
- distance_weighted: boolean = False
If True, NN with reciprocally linear distance weights. If False, unweighted NN.
- nn_search: NeighbourSearchMethod = KDTree()
Nearest neighbour search algorithm to use.
- preprocessors: iterable = (RangeNormaliser(), )
Preprocessors to apply. The default range normaliser ensures that all features have range 1.
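For illustration, the scoring behaviour described by these parameters can be sketched in plain NumPy. This is a hypothetical brute-force helper, not frlearn's implementation: the name nn_class_scores, the Minkowski exponent p (p = 1 corresponds to the boscovich measure), and the 1 / (1 + d) form of the reciprocal weighting are assumptions for the sketch, and the KD-tree search and range normalisation are omitted.

```python
import numpy as np

def nn_class_scores(X_train, y_train, X_test, k=5, p=1, distance_weighted=False):
    # Hypothetical sketch of NN classification, not frlearn's implementation.
    classes = np.unique(y_train)
    scores = np.empty((len(X_test), len(classes)))
    for i, x in enumerate(X_test):
        # Minkowski dissimilarity of x to every training instance
        # (p = 1 is the boscovich/Manhattan measure).
        d = np.sum(np.abs(X_train - x) ** p, axis=-1) ** (1 / p)
        # Brute-force neighbour search in place of the KD-tree.
        idx = np.argsort(d)[:k]
        # One plausible reciprocally linear weighting; the exact form
        # used by the library may differ. Unweighted NN uses equal votes.
        w = 1 / (1 + d[idx]) if distance_weighted else np.ones(len(idx))
        for j, c in enumerate(classes):
            scores[i, j] = np.sum(w[y_train[idx] == c]) / np.sum(w)
    return scores
```

With k = 2 and a query point next to two instances of the same class, that class receives the full score of 1; with distance weighting, nearer neighbours simply contribute more to their class's share.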
- class Model