3.8 Robust Training#

This chapter reviews two fundamental concepts and strategies for training machine learning models:

  1. Cross-Validation

  2. Hyper-parameter tuning

1. Cross-validation#

Cross-validation is a widely used technique in machine learning to assess the performance and generalization of a model. It involves partitioning the dataset into multiple subsets, training the model on a portion of the data, and then evaluating its performance on the remaining data. This process is repeated several times, and the results are averaged to obtain a more reliable estimate of the model’s performance. The primary goal is to mitigate the risk of overfitting and obtain a more robust evaluation metric.

There are several tutorials on cross-validation, for example in the scikit-learn user guide.

Cross-validation divides the data set into a training set and a validation set:

Validation set approach (figure from scikit-learn): concept of training and validation sets.

A single validation split can give a biased and noisy estimate of the prediction errors (model uncertainties), because the estimate depends strongly on which, typically smaller, subset of the data ends up in the validation set. To alleviate this, we can perform cross-validation over many folds, repeating the selection of validation and training sets.

Validation set approach (figure from scikit-learn).

  • Data Splitting: Cross-validation involves partitioning a dataset into multiple subsets, typically a training set and a testing set. This splitting allows for model assessment and validation.

  • Estimation: Cross-validation is primarily used to assess the predictive performance of machine learning models or statistical models. It helps in estimating how well a model will generalize to new, unseen data by testing its performance on data that was not used in training.

  • Correlated Data: Cross-validation does not explicitly address correlated data, and its effectiveness can be influenced by the data-splitting strategy. When the data exhibit strong correlations (for example, in time series), the cross-validation procedure must be designed carefully so that highly correlated samples do not leak between the training and validation subsets and so that each subset remains representative of the overall data distribution.

  • Applications: Cross-validation is widely employed in model selection, hyperparameter tuning, and assessing the generalization ability of models.
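To make these ideas concrete before turning to a real example, here is a minimal sketch on synthetic data using scikit-learn's cross_val_score helper (the synthetic trend and the 5-fold/MSE choices below are illustrative assumptions, not part of the tutorial that follows):

# minimal cross-validation sketch on synthetic data (illustrative only)
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200).reshape(-1, 1)
y = 2.0 * x.ravel() + rng.normal(scale=0.5, size=200)  # noisy linear trend

# 5-fold cross-validation: fit on 4/5 of the data, score on the held-out 1/5,
# repeat five times, then average the held-out errors
scores = cross_val_score(LinearRegression(), x, y, cv=5,
                         scoring="neg_mean_squared_error")
print("mean cross-validated MSE:", -scores.mean())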

We will now randomly split a GNSS time series into training and validation sets with scikit-learn and fit a linear regression to it.

import requests
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
%matplotlib inline

# The station designation
sta = "P395"

# URL of the daily GNSS position time series (tenv format) for this station
file_url = "http://geodesy.unr.edu/gps_timeseries/tenv/IGS14/" + sta + ".tenv"
print(file_url)
r = requests.get(file_url)


# column names for the .tenv file
ll = ['station ID (SSSS)','date (yymmmdd)',
'decimal year','modified Julian day','GPS week','day of GPS week',
'longitude (degrees) of reference meridian','delta e (m)',
'delta n (m)','delta v (m)','antenna height (m)',
'sigma e (m)','sigma n (m)','sigma v (m)',
'correlation en','correlation ev','correlation nv']
      

# transform r.content into a pandas dataframe
# first split r.content with \n separator
# Decode the content if it's in bytes
content_str = r.content.decode('utf-8')

# Split the content by the newline character
lines = content_str.split('\n')

# Now `lines` is a list of strings, each representing a line from the content
print(lines[0])

# then transform lines into a pandas dataframe
df = pd.DataFrame([x.split() for x in lines])
# assign column names to the dataframe
df.columns = ll

# convert columns to numeric where possible
df = df.apply(pd.to_numeric, errors='ignore')

df = df.dropna()
df.head()
http://geodesy.unr.edu/gps_timeseries/tenv/IGS14/P395.tenv
P395 06JAN25 2006.0671 53760 1359 3 -123.9  3347.67917   4987420.31375   53.03678  0.0083 0.00069 0.00105 0.00327 -0.04832  0.01695 -0.31816
station ID (SSSS) date (yymmmdd) decimal year modified Julian day GPS week day of GPS week longitude (degrees) of reference meridian delta e (m) delta n (m) delta v (m) antenna height (m) sigma e (m) sigma n (m) sigma v (m) correlation en correlation ev correlation nv
0 P395 06JAN25 2006.0671 53760.0 1359.0 3.0 -123.9 3347.67917 4.987420e+06 53.03678 0.0083 0.00069 0.00105 0.00327 -0.04832 0.01695 -0.31816
1 P395 06JAN26 2006.0698 53761.0 1359.0 4.0 -123.9 3347.68086 4.987420e+06 53.03003 0.0083 0.00069 0.00104 0.00321 -0.04648 0.00271 -0.30970
2 P395 06JAN27 2006.0726 53762.0 1359.0 5.0 -123.9 3347.68072 4.987420e+06 53.03906 0.0083 0.00069 0.00105 0.00326 -0.02367 0.00817 -0.31941
3 P395 06JAN28 2006.0753 53763.0 1359.0 6.0 -123.9 3347.67938 4.987420e+06 53.04382 0.0083 0.00069 0.00105 0.00324 -0.03681 0.00908 -0.30515
4 P395 06JAN29 2006.0780 53764.0 1360.0 0.0 -123.9 3347.68042 4.987420e+06 53.03513 0.0083 0.00068 0.00105 0.00328 -0.04815 0.00619 -0.33029
# reference positions to the first epoch: subtract the first value of delta e, delta n, delta v and add these as new columns
df['new delta e (m)'] = df['delta e (m)'] - df['delta e (m)'].values[0]
df['new delta n (m)'] = df['delta n (m)'] - df['delta n (m)'].values[0]
df['new delta v (m)'] = df['delta v (m)'] - df['delta v (m)'].values[0]
# drop nans in new delta e (m) and decimal year columns
df = df.dropna(subset=['new delta e (m)', 'decimal year'])
from sklearn.linear_model import LinearRegression
# convert the data into numpy arrays.
E = np.asarray(df['new delta e (m)']).reshape(-1, 1)  # reshape to a column vector, as required by LinearRegression
# make a new time array
t = np.asarray(df['decimal year']).reshape(-1, 1)
tt = np.linspace(np.min(t),np.max(t),1000)

# perform the linear regression. First we use the entire available data set.
regr = LinearRegression()
# fit the model:
regr.fit(t, E)
# predict with the fitted model:
Epred = regr.predict(t)

# The coefficient: the data are in meters, so the slope is a velocity in m/year
print('Coefficient / Velocity eastward (m/year): ', regr.coef_[0][0])
# plot the data
plt.plot(t,E,'b',label='data')
Coefficient / Velocity eastward (m/year):  -0.006439731291127403
Figure: east-component displacement time series for station P395.
# we randomly select values and split the data between a training and a validation set.
from sklearn.model_selection import ShuffleSplit
# we split the data once between a training and a validation set
n = 1         # we do this selection once
v_size = 0.3  # 30% of the data will be randomly selected to be the validation set.

rs = ShuffleSplit(n_splits=n, test_size=v_size, random_state=0)
for train_index, val_index in rs.split(E):
    E_train, E_val = E[train_index], E[val_index]
    t_train, t_val = t[train_index], t[val_index]
plt.scatter(t_train, E_train, marker="o"); plt.grid(True); plt.ylabel('East (m)')
plt.scatter(t_val, E_val, marker="o", s=6, c="red")
plt.xlabel('Time (years)')
plt.title('East component')
plt.legend(['training set', 'validation set'])
Figure: random split of the east component into training (blue) and validation (red) sets.
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
# now fit the model on the training set only.
regr = LinearRegression()
# Fit on training data:
regr.fit(t_train, E_train)
# predict on the training and validation sets:
Epred = regr.predict(t_train)
Epred_val = regr.predict(t_val)

# The coefficients
print('Training set: Coefficient / Velocity eastward (m/year): ', regr.coef_[0][0])

print('MSE (mean square error) on training set (m^2): %.2f'
      % mean_squared_error(E_train, Epred))
# The coefficient of determination: 1 is a perfect prediction
print('Coefficient of determination on training set: %.2f'
      % r2_score(E_train, Epred))

print('MSE on validation set (m^2): %.2f and coefficient of determination of %.2f'
      % (mean_squared_error(E_val, Epred_val), r2_score(E_val, Epred_val)))


plt.plot(t, E); plt.grid(True); plt.ylabel('East (m)')
plt.plot(t_train, Epred, color="red", linewidth=4)
plt.plot(t_val, Epred_val, color="green")
plt.legend(['data', 'fit on training', 'fit on validation'])
plt.title('Random selection for data split')
Training set: Coefficient / Velocity eastward (m/year):  -0.006437910455226973
MSE (mean square error) on training set (m^2): 0.00
Coefficient of determination on training set: 0.99
MSE on validation set (m^2): 0.00 and coefficient of determination of 0.99
Figure: data (blue), fit on the training set (red), and prediction on the validation set (green).

1.1 Leave-One-Out Cross-Validation (LOOCV)#

LOOCV splits the data into two sets (training and validation) n times, where n is the number of data points. At each repeat, the training set contains all but one sample and the validation set contains the single remaining sample.

Advantages: it has far less bias with respect to the training data and does not overestimate the test error. Repeated LOOCV gives exactly the same results, since there is no randomness in the splits.

Disadvantages: it is computationally intensive.

  • Data Splitting: LOOCV is an extreme form of cross-validation where, for each iteration, only one data point is left out as the test set, and the remaining data is used as the training set. This process is repeated for each data point, effectively creating as many folds as there are data points.

  • Estimation: LOOCV is primarily used for assessing model performance and estimating predictive accuracy. By evaluating the model against all data points one at a time, LOOCV provides a robust assessment of a model’s ability to generalize to unseen data.

  • Correlated Data: LOOCV, like other cross-validation methods, may not explicitly address correlated data. However, its performance can be influenced by the correlation structure in the data. For datasets with strong correlations, LOOCV may lead to models that are overly optimistic since it often tests a model on data points that are closely related to the training set.

  • Applications: LOOCV is valuable for evaluating machine learning and statistical models, particularly when you have a limited amount of data.

from sklearn.model_selection import LeaveOneOut
loo = LeaveOneOut()

vel = np.zeros(len(E))        # initialize a vector to store the velocity (slope) estimates
mse_train = np.zeros(len(E))  # mean-square errors on the training sets
mse_val = np.zeros(len(E))    # mean-square errors on the left-out samples
i = 0
for train_index, val_index in loo.split(E):
    E_train, E_val = E[train_index], E[val_index]
    t_train, t_val = t[train_index], t[val_index]
    # fit the model on the training set
    regr = LinearRegression()
    regr.fit(t_train, E_train)
    # predict on the training set and on the single left-out sample
    Epred_train = regr.predict(t_train)
    Epred_val = regr.predict(t_val)

    # store the coefficient and the errors
    # (note: R^2 is not defined for a single left-out sample, so we do not compute it here)
    vel[i] = regr.coef_[0][0]
    mse_train[i] = mean_squared_error(E_train, Epred_train)
    mse_val[i] = mean_squared_error(E_val, Epred_val)
    i += 1

# the data clearly shows a trend, so the estimated velocities are close to each other:
print("mean of the velocity estimates %.6f and the standard deviation %.6f" % (np.mean(vel), np.std(vel)))
# the test error (CV) is the average of the mean-square errors on the left-out samples
print("CV = %4.2f" % (np.mean(mse_val)))
mean of the velocity estimates -0.006440 and the standard deviation 0.000000
CV = 0.00

LOOCV is rarely used in practice. This example is just to show the extreme end member of k-fold cross-validation.
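For reference, the same LOOCV test-error estimate can be obtained in a single call with cross_val_score (a sketch reusing the t and E arrays defined above; the scoring is set to MSE because R^2 is not defined on a single held-out sample):

from sklearn.model_selection import cross_val_score, LeaveOneOut

# negative MSE on each left-out sample; the CV estimate is the mean MSE
neg_mse = cross_val_score(LinearRegression(), t, E.ravel(),
                          cv=LeaveOneOut(), scoring="neg_mean_squared_error")
print("CV = %4.2f" % (-neg_mse.mean()))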

1.2 K-fold cross-validation#

K-fold cross-validation is designed to reduce the computational cost of LOOCV. The data are randomly divided into k groups (folds) of approximately equal size; each fold in turn serves as the validation set while the remaining k-1 folds form the training set. Typical choices are k = 5 or k = 10.

Validation set approach with k folds (figure from scikit-learn).

from sklearn.model_selection import KFold

# let's try 10 folds, i.e., 10 adjacent splits of the data (KFold does not shuffle by default).
k=10
kf = KFold(n_splits=k)

vel = np.zeros(k) # initialize a vector to store the regression (velocity) estimates
mse_train = np.zeros(k)
mse_val = np.zeros(k)
r2s = np.zeros(k)
i=0
for train_index, val_index in kf.split(E):    
    E_train, E_val = E[train_index], E[val_index]
    t_train, t_val = t[train_index], t[val_index]
    # now fit the data on the training set.
    regr = LinearRegression()
    # Fit on training data:
    regr.fit(t_train,E_train)
    # predict on the training and validation sets:
    Epred_train=regr.predict(t_train) 
    Epred_val=regr.predict(t_val) 

    # The coefficients
    vel[i]= regr.coef_[0][0]
    mse_val[i]= mean_squared_error(E_val, Epred_val)
    mse_train[i]= mean_squared_error(E_train, Epred_train)
    r2s[i]=r2_score(E_train, Epred_train)
    i+=1

# the data clearly shows a trend, so the estimated velocities are close to each other:
print("mean of the velocity estimates %4.2f and the standard deviation %4.2f"%(np.mean(vel),np.std(vel)))
# the test error is the average of the mean-square-errors
print("mean MSE for training set : %4.2f and the validation set: %4.2f"%(np.mean(mse_train),np.mean(mse_val)))
mean of the velocity estimates -0.01 and the standard deviation 0.00
mean MSE for training set : 0.00 and the validation set: 0.00

2. Hyper-parameter tuning#

In classic machine learning, models often have parameters that are learned from the training data (such as weights in a linear regression model), and hyperparameters that are external configuration settings. Hyperparameters are not learned from the data but are set prior to the training process and can significantly impact the model’s performance.
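To make the distinction concrete, here is a minimal sketch; Ridge regression and its alpha value are illustrative choices (reusing the GNSS arrays t and E from the first part of this chapter), not part of the digits tutorial below:

from sklearn.linear_model import Ridge

model = Ridge(alpha=1.0)   # alpha is a hyperparameter: chosen before training
model.fit(t, E)            # t, E: the GNSS time/displacement arrays from above
print(model.coef_, model.intercept_)   # parameters: learned from the training data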

The goal of hyperparameter tuning is to find the optimal combination of hyperparameter values that maximizes the model’s performance on a given dataset. This process helps fine-tune the model to achieve the best possible results and avoid overfitting or underfitting.

Hyper-parameter tuning is now standard practice and should be part of every machine-learning workflow.

Several approaches to hyper-parameter tuning exist:

  • Manual Tuning:

    • While more time-consuming, manual tuning involves domain experts iteratively adjusting hyperparameters based on their understanding of the problem and the model’s behavior.

    • This approach can be effective when the hyperparameter space is relatively small or when there is substantial domain knowledge.

    • This is a good step for initial exploration and intuition building, but systematic approaches such as those described below are required for a robust assessment.

  • Grid Search:

    • In this method, a predefined set of hyperparameter values is specified, and the model is trained and evaluated for all possible combinations.

    • While thorough, grid search can be computationally expensive, especially for a large number of hyperparameters or when the search space is extensive.

    • This method is implemented in the scikit-learn ecosystem as model_selection.GridSearchCV, which uses cross-validation to train and evaluate the model for each tested hyper-parameter combination.

  • Random Search:

    • Random search involves randomly selecting combinations of hyperparameter values from a predefined distribution of hyperparameters (e.g., uniform or normal).

    • This approach is more computationally efficient than grid search, as it explores a diverse set of hyperparameter combinations.

    • The method is implemented in the scikit-learn ecosystem as model_selection.RandomizedSearchCV, which uses cross-validation for each sampled hyper-parameter combination (a minimal sketch follows this list).

  • Bayesian Optimization:

    • Bayesian optimization employs probabilistic models to predict the performance of different hyperparameter configurations.

    • It adapts its search based on the results of previous evaluations, allowing it to focus on promising regions of the hyperparameter space.
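As flagged in the random-search bullet above, here is a minimal, self-contained sketch of RandomizedSearchCV on the digits data (the parameter distributions, n_iter, and split settings are illustrative assumptions, not tuned choices):

# minimal random-search sketch (illustrative settings)
from scipy.stats import randint
from sklearn.datasets import load_digits
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)

param_dist = {"n_neighbors": randint(1, 11),            # sampled uniformly in 1..10
              "weights": ["uniform", "distance"]}
rand_search = RandomizedSearchCV(KNeighborsClassifier(), param_dist,
                                 n_iter=20, cv=5, random_state=0)
rand_search.fit(X_train, y_train)
print(rand_search.best_params_, rand_search.best_score_)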

Below is a tutorial using the digits dataset.

# basic tools
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
# dataset
from sklearn.datasets import load_digits
digits = load_digits()
digits.keys()
dict_keys(['data', 'target', 'frame', 'feature_names', 'target_names', 'images', 'DESCR'])
# explore data type
data,y = digits["data"].copy(),digits["target"].copy()
print(type(data[0][:]),type(y[0]))
# note that we do not modify the raw data that is stored on the digits dictionary.
<class 'numpy.ndarray'> <class 'numpy.int64'>

Plot the data

plt.matshow(digits["images"][1])
Figure: example digit image from the dataset.
print(min(data[0]), max(data[0]))
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
scaler = MinMaxScaler()
newdata = scaler.fit_transform(data)  # fit the scaler and normalize the data to [0, 1]
# note: the split below uses the raw `data`; `newdata` is not used further here

# Split data into 80% train and 20% test subsets, without shuffling
print(f"There are {data.shape[0]} data samples")
X_train, X_test, y_train, y_test = train_test_split(
    data, y, test_size=0.2, shuffle=False)
0.0 15.0
There are 1797 data samples
from sklearn import metrics
from sklearn.neighbors import KNeighborsClassifier

# K-nearest neighbors classifier
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)             # learn
knn_prediction = clf.predict(X_test)  # predict on test
print("KNN Accuracy:", metrics.accuracy_score(y_true=y_test, y_pred=knn_prediction))
KNN Accuracy: 0.9666666666666667

What are the hyper-parameters we are trying to optimize?

clf.get_params()
{'algorithm': 'auto',
 'leaf_size': 30,
 'metric': 'minkowski',
 'metric_params': None,
 'n_jobs': None,
 'n_neighbors': 3,
 'p': 2,
 'weights': 'uniform'}

A search consists of:

  • an estimator (a regressor or classifier, such as KNeighborsClassifier());

  • a parameter space;

  • a method for searching or sampling candidates (grid search or random selection);

  • a cross-validation scheme; and

  • a loss function or a scoring metric.
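To tie these components together, here is a short sketch of how they map onto a GridSearchCV call (the explicit StratifiedKFold scheme and accuracy scoring are assumptions for illustration; the tutorial below relies on the GridSearchCV defaults):

from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier

estimator = KNeighborsClassifier()                # 1. estimator
param_space = {"n_neighbors": [3, 5, 7]}          # 2. parameter space
cv_scheme = StratifiedKFold(n_splits=5)           # 4. cross-validation scheme
search_sketch = GridSearchCV(estimator,           # 3. search method: exhaustive grid
                             param_space,
                             cv=cv_scheme,
                             scoring="accuracy")  # 5. scoring metric
# search_sketch.fit(X_train, y_train) would then run the whole procedure.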

2.1 Grid search cross-validation#

Grid search performs the search in a brute-force way using cross-validation; one has to define the parameter space. The scikit-learn function is GridSearchCV (see the scikit-learn documentation for more details).

from sklearn.model_selection import GridSearchCV
param_grid = [
    {'n_neighbors': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
     'weights': ['uniform', 'distance'],
     'algorithm': ['ball_tree', 'kd_tree'],
     'metric': ['euclidean', 'manhattan', 'chebyshev', 'minkowski']}
]

The search will evaluate every combination of the hyper-parameters listed above, which may concern the model algorithm itself or the choice of features.

search = GridSearchCV(clf, param_grid, cv=5,verbose=3)
search.fit(X_train, y_train) # learn
Fitting 5 folds for each of 160 candidates, totalling 800 fits
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.972 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.1s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.906 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.941 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.972 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.941 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.969 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.986 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.941 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.917 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.937 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.937 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.955 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.951 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.958 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.962 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.937 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.951 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.958 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.951 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.958 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.944 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.955 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.917 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.944 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.944 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.920 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.923 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.951 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.927 total time=   0.1s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.955 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.958 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.1s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.972 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=ball_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=2, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.972 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=3, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=4, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=5, weights=distance;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=7, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=8, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=9, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=euclidean, n_neighbors=10, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.955 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.941 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=2, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.972 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=3, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.941 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.969 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=4, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.986 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=5, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.941 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.969 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=7, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.910 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.930 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.917 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=8, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.917 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=uniform;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.917 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.934 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=9, weights=distance;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.906 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.962 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.913 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.969 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=manhattan, n_neighbors=10, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.941 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=uniform;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.941 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=1, weights=distance;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.941 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.934 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=uniform;, score=0.951 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.934 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=2, weights=distance;, score=0.955 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=uniform;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.951 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=3, weights=distance;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=uniform;, score=0.958 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=4, weights=distance;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.948 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=uniform;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=5, weights=distance;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.955 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.941 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=uniform;, score=0.962 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.955 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=6, weights=distance;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.951 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.941 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=uniform;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.958 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.948 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.927 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=uniform;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.955 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.937 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=8, weights=distance;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.951 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.920 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=uniform;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.962 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.923 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=9, weights=distance;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.951 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.913 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=uniform;, score=0.965 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.955 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.923 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=chebyshev, n_neighbors=10, weights=distance;, score=0.969 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=uniform;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=1, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.948 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.983 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.965 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=2, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.972 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=uniform;, score=0.990 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.990 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.948 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=3, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=uniform;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.941 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.990 total time=   0.1s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.976 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=4, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=5, weights=distance;, score=0.976 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.938 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.965 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.948 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.944 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.965 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=6, weights=distance;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.934 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.951 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=7, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.920 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.976 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.955 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=8, weights=distance;, score=0.986 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.944 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=uniform;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.931 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.972 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.951 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=9, weights=distance;, score=0.983 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.924 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.969 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=uniform;, score=0.979 total time=   0.0s
[CV 1/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.927 total time=   0.0s
[CV 2/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.969 total time=   0.0s
[CV 3/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.944 total time=   0.0s
[CV 4/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.958 total time=   0.0s
[CV 5/5] END algorithm=kd_tree, metric=minkowski, n_neighbors=10, weights=distance;, score=0.983 total time=   0.0s
GridSearchCV(cv=5, estimator=KNeighborsClassifier(n_neighbors=3),
             param_grid=[{'algorithm': ['ball_tree', 'kd_tree'],
                          'metric': ['euclidean', 'manhattan', 'chebyshev',
                                     'minkowski'],
                          'n_neighbors': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
                          'weights': ['uniform', 'distance']}],
             verbose=3)
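Once the grid search has finished, the fitted `search` object stores the winning hyper-parameter combination directly (and, because `refit=True`, a copy of the classifier refit on the full training set). A minimal sketch of how to retrieve it, assuming the `search` object fitted above:

# best hyper-parameter combination found by the 5-fold grid search
print(search.best_params_)
# mean cross-validated score of that combination
print(search.best_score_)
# classifier refit on the full training set with the best parameters (refit=True)
best_knn = search.best_estimator_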
search.get_params()
{'cv': 5,
 'error_score': nan,
 'estimator__algorithm': 'auto',
 'estimator__leaf_size': 30,
 'estimator__metric': 'minkowski',
 'estimator__metric_params': None,
 'estimator__n_jobs': None,
 'estimator__n_neighbors': 3,
 'estimator__p': 2,
 'estimator__weights': 'uniform',
 'estimator': KNeighborsClassifier(n_neighbors=3),
 'n_jobs': None,
 'param_grid': [{'n_neighbors': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
   'weights': ['uniform', 'distance'],
   'algorithm': ['ball_tree', 'kd_tree'],
   'metric': ['euclidean', 'manhattan', 'chebyshev', 'minkowski']}],
 'pre_dispatch': '2*n_jobs',
 'refit': True,
 'return_train_score': False,
 'scoring': None,
 'verbose': 3}
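The next cell dumps the raw `cv_results_` dictionary, which records fit times and per-fold scores for every parameter combination. It is often easier to rank the candidates by loading it into a DataFrame; a minimal sketch, assuming the fitted `search` object and the `pd` alias from the imports at the top of the chapter:

# one row per parameter combination, ranked by mean cross-validated test score
results = pd.DataFrame(search.cv_results_)
cols = ['param_algorithm', 'param_metric', 'param_n_neighbors',
        'param_weights', 'mean_test_score', 'std_test_score', 'rank_test_score']
print(results[cols].sort_values('rank_test_score').head(10))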
search.cv_results_
{'mean_fit_time': array([0.00124884, 0.00117707, 0.00121818, 0.00113316, 0.00122681,
        0.00138907, 0.00210395, 0.00127292, 0.00122337, 0.00123663,
        0.00118914, 0.00121436, 0.00115118, 0.00113239, 0.00118365,
        0.0011848 , 0.00115542, 0.001159  , 0.00118904, 0.00109735,
        0.00116296, 0.00111723, 0.00111051, 0.00110478, 0.00108905,
        0.0012033 , 0.00108781, 0.00119243, 0.00125937, 0.0011766 ,
        0.00124488, 0.00111918, 0.00123463, 0.0011848 , 0.00118871,
        0.0011518 , 0.0012085 , 0.00116782, 0.00117335, 0.00113735,
        0.00139728, 0.00137177, 0.00142455, 0.00137634, 0.00137601,
        0.00138464, 0.0014792 , 0.00140896, 0.00148168, 0.0014122 ,
        0.00141983, 0.00144024, 0.00136123, 0.001334  , 0.00143476,
        0.00139904, 0.00140824, 0.00135183, 0.00139642, 0.00131121,
        0.00117874, 0.00108638, 0.00163503, 0.00152016, 0.00156436,
        0.00124698, 0.00125227, 0.00118723, 0.0012506 , 0.00119944,
        0.00130229, 0.00128765, 0.00117626, 0.0012248 , 0.00115252,
        0.00115037, 0.00110741, 0.00117035, 0.00115166, 0.00119796,
        0.00101123, 0.00118847, 0.00105319, 0.00106664, 0.0010891 ,
        0.00101781, 0.00100174, 0.00100017, 0.00100555, 0.0011692 ,
        0.00098104, 0.00098825, 0.00098815, 0.00101256, 0.00096908,
        0.00107565, 0.00095391, 0.00102201, 0.00104151, 0.00107384,
        0.00105629, 0.00101905, 0.00096607, 0.00106497, 0.0009686 ,
        0.00103407, 0.00094934, 0.00122256, 0.00101881, 0.00099692,
        0.00100203, 0.00101366, 0.00100636, 0.00109873, 0.00099649,
        0.00105796, 0.00098071, 0.00103793, 0.00096726, 0.00101986,
        0.00091963, 0.00101104, 0.00090995, 0.00100121, 0.00096517,
        0.00093355, 0.00095525, 0.00096664, 0.00092306, 0.00104346,
        0.00095744, 0.0010612 , 0.00106163, 0.00103064, 0.00101671,
        0.00094032, 0.000981  , 0.0009903 , 0.00095367, 0.00091543,
        0.00101376, 0.00100813, 0.00100293, 0.00098343, 0.00126929,
        0.00122428, 0.00104017, 0.00108948, 0.00098271, 0.00115924,
        0.00096898, 0.0010004 , 0.00097485, 0.00107174, 0.00103745,
        0.0009757 , 0.00101995, 0.00096178, 0.00110922, 0.00097089]),
 'std_fit_time': array([1.38610255e-04, 1.57720909e-04, 1.00989968e-04, 4.58137042e-05,
        2.08318831e-04, 2.15874319e-04, 1.79652698e-03, 1.23569845e-04,
        5.23360877e-05, 1.04805445e-04, 4.16537387e-05, 1.37098984e-04,
        4.15321059e-05, 7.29187702e-05, 1.02931748e-04, 8.35176709e-05,
        3.92151528e-05, 2.24617128e-05, 6.48329166e-05, 6.29280536e-06,
        1.01134947e-04, 6.78569601e-05, 4.96852878e-05, 4.50668886e-05,
        7.56865403e-06, 6.66134854e-05, 5.71130699e-06, 5.00802978e-05,
        1.81906745e-04, 9.78327624e-05, 1.40097289e-04, 3.73054306e-05,
        1.28742730e-04, 6.14129607e-05, 4.71920414e-05, 6.21684846e-05,
        1.02808060e-04, 8.41447229e-05, 1.27409657e-04, 6.84525608e-05,
        1.15176025e-04, 6.50468806e-05, 7.10498567e-05, 3.91638060e-05,
        8.33066859e-05, 5.79851630e-05, 1.50496289e-04, 9.69671077e-05,
        1.37620904e-04, 7.96638356e-05, 4.76602020e-05, 9.78579060e-05,
        6.26740003e-05, 4.59145903e-05, 9.12182629e-05, 8.43229587e-05,
        1.29647244e-04, 5.86244089e-05, 1.23112946e-04, 1.68652119e-05,
        9.26909376e-05, 6.42333458e-06, 9.73744763e-04, 6.76566095e-04,
        5.78663898e-04, 8.05201915e-05, 1.22415308e-04, 3.07153463e-05,
        1.91509320e-04, 1.04919172e-04, 1.28170369e-04, 4.44982331e-05,
        3.90102924e-05, 1.25173923e-04, 3.76103325e-05, 8.61731954e-05,
        1.81470215e-05, 9.19642488e-05, 2.71012193e-05, 1.14273067e-04,
        2.37566379e-05, 2.28446278e-04, 4.52846201e-05, 1.02751869e-04,
        7.52758100e-05, 6.11211499e-05, 5.54985195e-05, 5.71896550e-05,
        6.58687324e-05, 2.60051047e-04, 2.63645675e-05, 4.98010700e-05,
        2.65061420e-05, 8.12174211e-05, 1.36317061e-05, 1.56459318e-04,
        8.45226649e-06, 9.85183650e-05, 1.36377811e-04, 1.30862917e-04,
        1.14207487e-04, 8.53187340e-05, 1.27477043e-05, 1.13845049e-04,
        1.71936662e-05, 1.15916102e-04, 2.88240075e-06, 1.69710853e-04,
        3.65113437e-05, 6.88199431e-05, 4.58159375e-05, 7.86238686e-05,
        3.70665371e-05, 1.57492362e-04, 3.55301603e-05, 8.40633752e-05,
        2.46288123e-05, 6.62367149e-05, 1.03595548e-05, 9.22611022e-05,
        1.27979043e-05, 1.52837103e-04, 1.61362709e-05, 9.77349383e-05,
        2.65681752e-05, 5.81214619e-05, 5.25384106e-05, 6.89656557e-05,
        1.26775927e-05, 1.61026193e-04, 3.29695570e-05, 8.31905254e-05,
        1.18885497e-04, 6.81662693e-05, 5.56950292e-05, 4.06546674e-05,
        7.84272616e-05, 7.09158980e-05, 7.66697212e-05, 1.93125525e-05,
        4.34372218e-05, 4.41329727e-05, 6.06340400e-05, 3.31096028e-05,
        4.77100476e-04, 2.45075420e-04, 4.13256914e-05, 2.07752625e-04,
        3.53372690e-05, 1.52961928e-04, 1.79137256e-05, 5.77589368e-05,
        2.35092809e-05, 7.86883030e-05, 9.29291397e-05, 3.01810425e-05,
        8.92702329e-05, 7.98897933e-06, 1.47979680e-04, 1.87472821e-05]),
 'mean_score_time': array([0.01868753, 0.01394615, 0.0187542 , 0.01419287, 0.01987119,
        0.01524477, 0.02987127, 0.01495075, 0.01961508, 0.01491618,
        0.0194314 , 0.01478472, 0.01940632, 0.01480479, 0.01988297,
        0.0151772 , 0.01977749, 0.01537185, 0.01980114, 0.01509805,
        0.01816139, 0.01362829, 0.01850786, 0.014008  , 0.01872659,
        0.01439686, 0.01891465, 0.01470308, 0.01964345, 0.01460667,
        0.01968756, 0.01484962, 0.01983299, 0.01503062, 0.01987381,
        0.01514583, 0.02011881, 0.01534662, 0.01985202, 0.01532836,
        0.02945991, 0.02481532, 0.02964506, 0.02499619, 0.0294456 ,
        0.02507963, 0.029776  , 0.02540522, 0.03036761, 0.02532825,
        0.03011165, 0.02545471, 0.02974715, 0.02527542, 0.03087306,
        0.02564602, 0.03003373, 0.02548761, 0.03541689, 0.02555256,
        0.01842122, 0.01388159, 0.02007003, 0.01518793, 0.03459001,
        0.01778674, 0.02142529, 0.0148006 , 0.02153287, 0.01488466,
        0.01997147, 0.01510315, 0.01993761, 0.01516848, 0.01962051,
        0.01500196, 0.01955028, 0.01519032, 0.02022538, 0.01529517,
        0.02173471, 0.01776476, 0.02317843, 0.01817288, 0.02327147,
        0.01833138, 0.02323136, 0.01853814, 0.02324533, 0.01887345,
        0.02347903, 0.01888738, 0.02364883, 0.01913295, 0.02398396,
        0.0192328 , 0.02399478, 0.01933594, 0.02404227, 0.0195086 ,
        0.02267909, 0.01784525, 0.02259917, 0.01822462, 0.02277431,
        0.0183197 , 0.0230062 , 0.01887469, 0.02331243, 0.01834903,
        0.02341313, 0.01881456, 0.02337708, 0.01889338, 0.02366481,
        0.01902056, 0.02361536, 0.01894231, 0.02342539, 0.0188375 ,
        0.02002616, 0.01545787, 0.02174444, 0.01753159, 0.02303805,
        0.01833916, 0.023704  , 0.01899157, 0.02425828, 0.0196559 ,
        0.02757277, 0.02112141, 0.02564435, 0.02097869, 0.0258975 ,
        0.02097244, 0.02619228, 0.02142987, 0.02606578, 0.02166505,
        0.02175441, 0.0172081 , 0.02244763, 0.01786895, 0.0260797 ,
        0.02090788, 0.02622542, 0.02605867, 0.02366095, 0.02039466,
        0.02343097, 0.0190908 , 0.02358098, 0.0193728 , 0.02411323,
        0.01921644, 0.02392311, 0.01929154, 0.02717447, 0.01947417]),
 'std_score_time': array([5.03782991e-04, 1.31437600e-04, 1.74985425e-04, 9.34483316e-05,
        1.82275943e-03, 5.61656273e-04, 1.89303900e-02, 1.00879744e-04,
        1.92472151e-04, 1.13918084e-04, 2.04068221e-04, 1.92324842e-04,
        1.47850049e-04, 6.44204001e-05, 4.37557749e-04, 1.58690260e-04,
        1.61405638e-04, 8.13426612e-05, 1.33894566e-04, 3.58476903e-05,
        1.04412378e-04, 1.14769405e-04, 7.06335069e-05, 9.25055976e-05,
        1.79986921e-04, 1.96850450e-04, 1.74163387e-04, 1.56066799e-04,
        5.87086541e-04, 1.27648996e-04, 4.20547196e-04, 2.40419546e-04,
        3.97463968e-04, 1.30119853e-04, 8.32554674e-05, 9.99184247e-05,
        2.99580382e-04, 2.20103384e-04, 1.28090709e-04, 7.54357891e-05,
        4.25250355e-04, 1.41278489e-04, 2.68953091e-04, 2.40666219e-04,
        1.83980275e-04, 1.99071696e-04, 3.58846958e-04, 2.11625327e-04,
        2.87521063e-04, 1.76445252e-04, 1.52451007e-04, 1.55543769e-04,
        6.59582489e-05, 4.58404966e-05, 5.45482869e-04, 1.16547176e-04,
        1.43804205e-04, 6.22023792e-05, 1.05659833e-02, 4.19948187e-05,
        1.51095949e-04, 3.89151710e-05, 1.73679759e-03, 9.11807228e-04,
        2.63339627e-02, 2.51871815e-03, 1.82379973e-03, 1.18121852e-04,
        3.63327082e-03, 1.85471843e-04, 2.97016986e-04, 9.42553289e-05,
        2.44834712e-04, 6.15933346e-05, 2.18382444e-04, 9.66214229e-05,
        4.52101981e-05, 1.41841735e-04, 1.00845997e-04, 1.13065179e-04,
        1.74605306e-04, 2.70219095e-04, 3.80271538e-04, 2.01323483e-04,
        2.68948526e-04, 2.60225403e-04, 1.66381228e-04, 1.43209894e-04,
        2.88827474e-04, 1.48091451e-04, 2.49118707e-04, 1.04051353e-04,
        2.38234048e-04, 5.42318369e-05, 5.06056383e-04, 1.26323370e-04,
        5.59945605e-04, 9.05619596e-05, 2.06687573e-04, 1.84920315e-04,
        1.20859643e-04, 6.70027644e-05, 1.89654114e-04, 1.91841053e-04,
        2.09914668e-04, 9.99266393e-05, 2.00604095e-04, 1.87628205e-04,
        3.40702389e-04, 3.27586087e-05, 3.24157758e-04, 1.63133999e-04,
        3.47303702e-04, 2.90929937e-04, 2.67093072e-04, 7.66858822e-05,
        3.75390942e-04, 1.46053392e-04, 1.48127635e-04, 1.25412705e-05,
        4.29569874e-04, 5.69374157e-04, 4.76405077e-04, 5.53443085e-04,
        7.04715499e-04, 5.51630154e-04, 5.25904551e-04, 4.97251052e-04,
        5.07950178e-04, 5.76934289e-04, 4.08719804e-03, 1.46238520e-03,
        4.32515259e-04, 4.69585374e-04, 5.72773115e-04, 5.38288317e-04,
        3.84504572e-04, 4.92084918e-04, 4.99882528e-04, 3.91507624e-04,
        2.74232428e-04, 4.37413982e-04, 2.69062878e-04, 2.30077333e-04,
        6.05872932e-03, 3.12848634e-03, 5.61351310e-03, 1.49289819e-02,
        5.44116923e-04, 1.53893981e-03, 1.14436706e-04, 3.79583332e-04,
        1.66641959e-04, 2.22551789e-04, 5.80779465e-04, 1.93461456e-04,
        2.60392446e-04, 1.11335973e-04, 4.32878410e-03, 1.45710123e-04]),
 'param_algorithm': masked_array(data=['ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'ball_tree', 'ball_tree', 'ball_tree', 'ball_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree',
                    'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree', 'kd_tree'],
              mask=[False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_metric': masked_array(data=['euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'euclidean', 'euclidean', 'euclidean', 'euclidean',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'manhattan', 'manhattan', 'manhattan', 'manhattan',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'chebyshev', 'chebyshev', 'chebyshev', 'chebyshev',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski',
                    'minkowski', 'minkowski', 'minkowski', 'minkowski'],
              mask=[False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_n_neighbors': masked_array(data=[1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9,
                    10, 10, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8,
                    9, 9, 10, 10, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7,
                    8, 8, 9, 9, 10, 10, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6,
                    7, 7, 8, 8, 9, 9, 10, 10, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5,
                    6, 6, 7, 7, 8, 8, 9, 9, 10, 10, 1, 1, 2, 2, 3, 3, 4, 4,
                    5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 10, 1, 1, 2, 2, 3, 3,
                    4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 10, 1, 1, 2, 2,
                    3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 10],
              mask=[False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_weights': masked_array(data=['uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance',
                    'uniform', 'distance', 'uniform', 'distance'],
              mask=[False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False,
                    False, False, False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'params': [{'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'euclidean',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'manhattan',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'chebyshev',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'ball_tree',
   'metric': 'minkowski',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'euclidean',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'manhattan',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'chebyshev',
   'n_neighbors': 10,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 1,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 1,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 2,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 2,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 3,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 3,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 4,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 4,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 5,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 5,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 6,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 6,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 7,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 7,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 8,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 8,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 9,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 9,
   'weights': 'distance'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 10,
   'weights': 'uniform'},
  {'algorithm': 'kd_tree',
   'metric': 'minkowski',
   'n_neighbors': 10,
   'weights': 'distance'}],
 'split0_test_score': array([0.94791667, 0.94791667, 0.9375    , 0.94791667, 0.94444444,
        0.94444444, 0.93055556, 0.94097222, 0.9375    , 0.9375    ,
        0.9375    , 0.94444444, 0.93402778, 0.93402778, 0.92013889,
        0.93055556, 0.93055556, 0.93055556, 0.92361111, 0.92708333,
        0.92361111, 0.92361111, 0.90625   , 0.92361111, 0.93402778,
        0.93402778, 0.91319444, 0.92361111, 0.92013889, 0.92013889,
        0.90972222, 0.92013889, 0.90972222, 0.90972222, 0.90972222,
        0.91666667, 0.91319444, 0.91319444, 0.90972222, 0.91319444,
        0.92361111, 0.92361111, 0.91319444, 0.93402778, 0.92013889,
        0.93055556, 0.92708333, 0.92708333, 0.93055556, 0.93055556,
        0.92708333, 0.92708333, 0.92361111, 0.92361111, 0.91666667,
        0.92361111, 0.92708333, 0.92361111, 0.92013889, 0.92361111,
        0.94791667, 0.94791667, 0.9375    , 0.94791667, 0.94444444,
        0.94444444, 0.93055556, 0.94097222, 0.9375    , 0.9375    ,
        0.9375    , 0.94444444, 0.93402778, 0.93402778, 0.92013889,
        0.93055556, 0.93055556, 0.93055556, 0.92361111, 0.92708333,
        0.94791667, 0.94791667, 0.9375    , 0.94791667, 0.94444444,
        0.94444444, 0.93055556, 0.94097222, 0.9375    , 0.9375    ,
        0.9375    , 0.94444444, 0.93402778, 0.93402778, 0.92013889,
        0.93055556, 0.93055556, 0.93055556, 0.92361111, 0.92708333,
        0.92361111, 0.92361111, 0.90972222, 0.92361111, 0.93402778,
        0.93402778, 0.91319444, 0.92361111, 0.92361111, 0.92361111,
        0.90972222, 0.92013889, 0.90972222, 0.90972222, 0.90972222,
        0.91666667, 0.91666667, 0.91666667, 0.90625   , 0.91319444,
        0.94097222, 0.94097222, 0.92361111, 0.93402778, 0.94444444,
        0.95138889, 0.93055556, 0.93402778, 0.93402778, 0.93402778,
        0.92361111, 0.92708333, 0.92708333, 0.92708333, 0.93402778,
        0.9375    , 0.92361111, 0.93402778, 0.94097222, 0.9375    ,
        0.94791667, 0.94791667, 0.9375    , 0.94791667, 0.94444444,
        0.94444444, 0.93055556, 0.94097222, 0.9375    , 0.9375    ,
        0.9375    , 0.94444444, 0.93402778, 0.93402778, 0.92013889,
        0.93055556, 0.93055556, 0.93055556, 0.92361111, 0.92708333]),
 'split1_test_score': array([0.98263889, 0.98263889, 0.98958333, 0.98263889, 0.98958333,
        0.98958333, 0.97569444, 0.98958333, 0.97569444, 0.97916667,
        0.96527778, 0.97916667, 0.96875   , 0.97222222, 0.96875   ,
        0.97569444, 0.96875   , 0.97222222, 0.96875   , 0.96875   ,
        0.98611111, 0.98611111, 0.97222222, 0.98958333, 0.98958333,
        0.98958333, 0.97222222, 0.98958333, 0.97222222, 0.98611111,
        0.96527778, 0.98263889, 0.96180556, 0.97222222, 0.96527778,
        0.96875   , 0.96527778, 0.96875   , 0.96527778, 0.96527778,
        0.96527778, 0.96527778, 0.95486111, 0.95833333, 0.96180556,
        0.96527778, 0.96180556, 0.96180556, 0.95138889, 0.96527778,
        0.96180556, 0.96180556, 0.94444444, 0.95486111, 0.94444444,
        0.94444444, 0.94444444, 0.95138889, 0.95138889, 0.95833333,
        0.98263889, 0.98263889, 0.98958333, 0.98263889, 0.98958333,
        0.98958333, 0.97569444, 0.98958333, 0.97569444, 0.97916667,
        0.96527778, 0.97916667, 0.96875   , 0.97222222, 0.96875   ,
        0.97569444, 0.96875   , 0.97222222, 0.96875   , 0.96875   ,
        0.98263889, 0.98263889, 0.98958333, 0.98263889, 0.98958333,
        0.98958333, 0.97569444, 0.98958333, 0.97569444, 0.97916667,
        0.96527778, 0.97916667, 0.96875   , 0.97222222, 0.96875   ,
        0.97569444, 0.96875   , 0.97222222, 0.96875   , 0.96875   ,
        0.98611111, 0.98611111, 0.97222222, 0.98958333, 0.98958333,
        0.98958333, 0.97222222, 0.98958333, 0.97222222, 0.98611111,
        0.96875   , 0.98263889, 0.96180556, 0.97222222, 0.96527778,
        0.96875   , 0.96527778, 0.96875   , 0.96527778, 0.96527778,
        0.96875   , 0.96875   , 0.96527778, 0.96527778, 0.96875   ,
        0.96875   , 0.96527778, 0.96527778, 0.94791667, 0.95833333,
        0.95486111, 0.95486111, 0.95138889, 0.95833333, 0.94791667,
        0.95486111, 0.95138889, 0.96180556, 0.95138889, 0.95486111,
        0.98263889, 0.98263889, 0.98958333, 0.98263889, 0.98958333,
        0.98958333, 0.97569444, 0.98958333, 0.97569444, 0.97916667,
        0.96527778, 0.97916667, 0.96875   , 0.97222222, 0.96875   ,
        0.97569444, 0.96875   , 0.97222222, 0.96875   , 0.96875   ]),
 'split2_test_score': array([0.9825784 , 0.9825784 , 0.96515679, 0.9825784 , 0.97212544,
        0.97560976, 0.95121951, 0.97560976, 0.94425087, 0.96515679,
        0.94773519, 0.96515679, 0.94425087, 0.95121951, 0.94425087,
        0.94425087, 0.94425087, 0.94425087, 0.94425087, 0.94425087,
        0.95818815, 0.95818815, 0.95818815, 0.95818815, 0.96515679,
        0.97212544, 0.94076655, 0.96864111, 0.93379791, 0.94773519,
        0.93379791, 0.94076655, 0.93031359, 0.93031359, 0.93031359,
        0.93379791, 0.93379791, 0.93379791, 0.93728223, 0.93728223,
        0.96515679, 0.96515679, 0.94425087, 0.96167247, 0.94773519,
        0.95818815, 0.95818815, 0.95818815, 0.94773519, 0.95818815,
        0.94425087, 0.95121951, 0.93379791, 0.94773519, 0.93031359,
        0.93379791, 0.91986063, 0.92334495, 0.92682927, 0.93031359,
        0.9825784 , 0.9825784 , 0.96515679, 0.9825784 , 0.97212544,
        0.97560976, 0.95121951, 0.97560976, 0.94425087, 0.96515679,
        0.94773519, 0.96515679, 0.94425087, 0.95121951, 0.94425087,
        0.94425087, 0.94425087, 0.94425087, 0.94425087, 0.94425087,
        0.9825784 , 0.9825784 , 0.96515679, 0.9825784 , 0.97212544,
        0.97560976, 0.95121951, 0.97560976, 0.94425087, 0.96515679,
        0.94773519, 0.96515679, 0.94425087, 0.95121951, 0.94425087,
        0.94425087, 0.94425087, 0.94425087, 0.94425087, 0.94425087,
        0.95818815, 0.95818815, 0.95470383, 0.95818815, 0.96515679,
        0.97212544, 0.94076655, 0.96864111, 0.93379791, 0.94773519,
        0.93379791, 0.94076655, 0.93031359, 0.93031359, 0.93031359,
        0.93379791, 0.93379791, 0.93379791, 0.93728223, 0.93728223,
        0.96515679, 0.96515679, 0.94076655, 0.95818815, 0.93728223,
        0.94773519, 0.95818815, 0.95818815, 0.94773519, 0.95818815,
        0.94076655, 0.94773519, 0.93728223, 0.95121951, 0.92682927,
        0.93728223, 0.91986063, 0.92334495, 0.91289199, 0.92334495,
        0.9825784 , 0.9825784 , 0.96515679, 0.9825784 , 0.97212544,
        0.97560976, 0.95121951, 0.97560976, 0.94425087, 0.96515679,
        0.94773519, 0.96515679, 0.94425087, 0.95121951, 0.94425087,
        0.94425087, 0.94425087, 0.94425087, 0.94425087, 0.94425087]),
 'split3_test_score': array([0.96515679, 0.96515679, 0.95470383, 0.96515679, 0.94773519,
        0.94773519, 0.95470383, 0.95818815, 0.95470383, 0.95470383,
        0.95470383, 0.95121951, 0.95470383, 0.95470383, 0.95121951,
        0.95470383, 0.94425087, 0.95121951, 0.96864111, 0.95818815,
        0.96167247, 0.96167247, 0.94076655, 0.96167247, 0.95818815,
        0.95818815, 0.95121951, 0.96167247, 0.95470383, 0.95818815,
        0.95818815, 0.95818815, 0.96515679, 0.96515679, 0.96167247,
        0.96167247, 0.96515679, 0.96515679, 0.96167247, 0.96515679,
        0.93728223, 0.93728223, 0.94425087, 0.93728223, 0.94773519,
        0.94425087, 0.94773519, 0.95121951, 0.95818815, 0.95818815,
        0.94773519, 0.95818815, 0.94773519, 0.95121951, 0.95121951,
        0.95121951, 0.94773519, 0.95121951, 0.94425087, 0.95121951,
        0.96515679, 0.96515679, 0.95470383, 0.96515679, 0.94773519,
        0.94773519, 0.95470383, 0.95818815, 0.95470383, 0.95470383,
        0.95470383, 0.95121951, 0.95470383, 0.95470383, 0.95121951,
        0.95470383, 0.94425087, 0.95121951, 0.96864111, 0.95818815,
        0.96515679, 0.96515679, 0.95470383, 0.96515679, 0.94773519,
        0.94773519, 0.95470383, 0.95818815, 0.95470383, 0.95470383,
        0.95470383, 0.95121951, 0.95470383, 0.95470383, 0.95121951,
        0.95470383, 0.94425087, 0.95121951, 0.96864111, 0.95818815,
        0.96167247, 0.96167247, 0.94076655, 0.96167247, 0.95818815,
        0.95818815, 0.95470383, 0.96167247, 0.95470383, 0.95818815,
        0.95818815, 0.95818815, 0.96864111, 0.96515679, 0.95818815,
        0.95818815, 0.96515679, 0.96515679, 0.96167247, 0.96864111,
        0.94076655, 0.94076655, 0.93379791, 0.93379791, 0.95818815,
        0.95818815, 0.95470383, 0.95818815, 0.95470383, 0.95818815,
        0.94425087, 0.95470383, 0.94076655, 0.94425087, 0.95818815,
        0.95470383, 0.95121951, 0.95121951, 0.94425087, 0.95121951,
        0.96515679, 0.96515679, 0.95470383, 0.96515679, 0.94773519,
        0.94773519, 0.95470383, 0.95818815, 0.95470383, 0.95470383,
        0.95470383, 0.95121951, 0.95470383, 0.95470383, 0.95121951,
        0.95470383, 0.94425087, 0.95121951, 0.96864111, 0.95818815]),
 'split4_test_score': array([0.98606272, 0.98606272, 0.9825784 , 0.98606272, 0.98954704,
        0.98606272, 0.97560976, 0.9825784 , 0.9825784 , 0.97560976,
        0.97909408, 0.97909408, 0.9825784 , 0.9825784 , 0.9825784 ,
        0.98606272, 0.9825784 , 0.9825784 , 0.97909408, 0.9825784 ,
        0.98606272, 0.98606272, 0.9825784 , 0.98606272, 0.9825784 ,
        0.97909408, 0.97212544, 0.98606272, 0.97909408, 0.97909408,
        0.97560976, 0.97909408, 0.97560976, 0.97909408, 0.97212544,
        0.97909408, 0.97212544, 0.97212544, 0.97212544, 0.97909408,
        0.96515679, 0.96515679, 0.95121951, 0.95121951, 0.96167247,
        0.95818815, 0.95470383, 0.96167247, 0.96167247, 0.96515679,
        0.96167247, 0.95818815, 0.96167247, 0.97212544, 0.96167247,
        0.97212544, 0.96167247, 0.96515679, 0.95470383, 0.96167247,
        0.98606272, 0.98606272, 0.9825784 , 0.98606272, 0.98954704,
        0.98606272, 0.97560976, 0.9825784 , 0.9825784 , 0.97560976,
        0.97909408, 0.97909408, 0.9825784 , 0.9825784 , 0.9825784 ,
        0.98606272, 0.9825784 , 0.9825784 , 0.97909408, 0.9825784 ,
        0.98606272, 0.98606272, 0.9825784 , 0.98606272, 0.98954704,
        0.98606272, 0.97560976, 0.9825784 , 0.9825784 , 0.97560976,
        0.97909408, 0.97909408, 0.9825784 , 0.9825784 , 0.9825784 ,
        0.98606272, 0.9825784 , 0.9825784 , 0.97909408, 0.9825784 ,
        0.9825784 , 0.9825784 , 0.9825784 , 0.98606272, 0.97909408,
        0.97909408, 0.97212544, 0.98606272, 0.97560976, 0.97909408,
        0.97560976, 0.97909408, 0.97560976, 0.97909408, 0.97212544,
        0.97909408, 0.97212544, 0.97212544, 0.96864111, 0.97909408,
        0.96864111, 0.96864111, 0.95121951, 0.95470383, 0.96515679,
        0.96167247, 0.95818815, 0.96515679, 0.96864111, 0.96864111,
        0.96167247, 0.96864111, 0.96515679, 0.97212544, 0.96515679,
        0.97560976, 0.96864111, 0.96864111, 0.96515679, 0.96864111,
        0.98606272, 0.98606272, 0.9825784 , 0.98606272, 0.98954704,
        0.98606272, 0.97560976, 0.9825784 , 0.9825784 , 0.97560976,
        0.97909408, 0.97909408, 0.9825784 , 0.9825784 , 0.9825784 ,
        0.98606272, 0.9825784 , 0.9825784 , 0.97909408, 0.9825784 ]),
 'mean_test_score': array([0.97287069, 0.97287069, 0.96590447, 0.97287069, 0.96868709,
        0.96868709, 0.95755662, 0.96938637, 0.95894551, 0.96242741,
        0.95686218, 0.9638163 , 0.95686218, 0.95895035, 0.95338753,
        0.95825348, 0.95407714, 0.95616531, 0.95686943, 0.95617015,
        0.96312911, 0.96312911, 0.95200106, 0.96382356, 0.96590689,
        0.96660376, 0.94990563, 0.96591415, 0.95199139, 0.95825348,
        0.94851916, 0.95616531, 0.94852158, 0.95130178, 0.9478223 ,
        0.95199623, 0.94991047, 0.95060492, 0.94921603, 0.95200106,
        0.95129694, 0.95129694, 0.94155536, 0.94850707, 0.94781746,
        0.9512921 , 0.94990321, 0.95199381, 0.94990805, 0.95547329,
        0.94850949, 0.95129694, 0.94225223, 0.94991047, 0.94086334,
        0.94503968, 0.94015921, 0.94294425, 0.93946235, 0.94503   ,
        0.97287069, 0.97287069, 0.96590447, 0.97287069, 0.96868709,
        0.96868709, 0.95755662, 0.96938637, 0.95894551, 0.96242741,
        0.95686218, 0.9638163 , 0.95686218, 0.95895035, 0.95338753,
        0.95825348, 0.95407714, 0.95616531, 0.95686943, 0.95617015,
        0.97287069, 0.97287069, 0.96590447, 0.97287069, 0.96868709,
        0.96868709, 0.95755662, 0.96938637, 0.95894551, 0.96242741,
        0.95686218, 0.9638163 , 0.95686218, 0.95895035, 0.95338753,
        0.95825348, 0.95407714, 0.95616531, 0.95686943, 0.95617015,
        0.96243225, 0.96243225, 0.95199864, 0.96382356, 0.96521003,
        0.96660376, 0.9506025 , 0.96591415, 0.95198897, 0.95894793,
        0.94921361, 0.95616531, 0.94921845, 0.95130178, 0.94712544,
        0.95129936, 0.95060492, 0.95129936, 0.94782472, 0.95269793,
        0.95685734, 0.95685734, 0.94293457, 0.94919909, 0.95476432,
        0.95754694, 0.95338269, 0.95616773, 0.95060492, 0.95547571,
        0.94503242, 0.95060492, 0.94433556, 0.9506025 , 0.94642373,
        0.95199139, 0.94294425, 0.94780778, 0.94293215, 0.94711334,
        0.97287069, 0.97287069, 0.96590447, 0.97287069, 0.96868709,
        0.96868709, 0.95755662, 0.96938637, 0.95894551, 0.96242741,
        0.95686218, 0.9638163 , 0.95686218, 0.95895035, 0.95338753,
        0.95825348, 0.95407714, 0.95616531, 0.95686943, 0.95617015]),
 'std_test_score': array([0.01446302, 0.01446302, 0.01882305, 0.01446302, 0.01954636,
        0.01904299, 0.01692484, 0.01762635, 0.01750928, 0.01511592,
        0.0143383 , 0.01417596, 0.01724968, 0.01693139, 0.02136241,
        0.02026519, 0.01883609, 0.01885603, 0.02019405, 0.01922411,
        0.02298839, 0.02298839, 0.02683849, 0.02371317, 0.01957004,
        0.01921757, 0.0220214 , 0.02358461, 0.02234629, 0.02355308,
        0.02379142, 0.02351891, 0.02459221, 0.02672602, 0.02385512,
        0.02318607, 0.02267064, 0.02317823, 0.02297895, 0.02368955,
        0.01756458, 0.01756458, 0.01476016, 0.01107147, 0.01519043,
        0.01241204, 0.01231908, 0.01303417, 0.010848  , 0.01284938,
        0.01286752, 0.01258343, 0.01289171, 0.01559561, 0.01581057,
        0.01647874, 0.01497725, 0.01667988, 0.0136416 , 0.01528138,
        0.01446302, 0.01446302, 0.01882305, 0.01446302, 0.01954636,
        0.01904299, 0.01692484, 0.01762635, 0.01750928, 0.01511592,
        0.0143383 , 0.01417596, 0.01724968, 0.01693139, 0.02136241,
        0.02026519, 0.01883609, 0.01885603, 0.02019405, 0.01922411,
        0.01446302, 0.01446302, 0.01882305, 0.01446302, 0.01954636,
        0.01904299, 0.01692484, 0.01762635, 0.01750928, 0.01511592,
        0.0143383 , 0.01417596, 0.01724968, 0.01693139, 0.02136241,
        0.02026519, 0.01883609, 0.01885603, 0.02019405, 0.01922411,
        0.02232589, 0.02232589, 0.02555343, 0.02371317, 0.01901824,
        0.01921757, 0.02210692, 0.02358461, 0.02054709, 0.02244415,
        0.02431536, 0.02351891, 0.0250979 , 0.02672602, 0.02348842,
        0.02293578, 0.02156137, 0.02207262, 0.0235342 , 0.02411374,
        0.01311811, 0.01311811, 0.01434876, 0.01293827, 0.01206191,
        0.00744489, 0.01192058, 0.01150772, 0.01124606, 0.0114562 ,
        0.01304742, 0.01357734, 0.01298769, 0.01494575, 0.01434685,
        0.01414058, 0.01847563, 0.01691109, 0.0171665 , 0.01547699,
        0.01446302, 0.01446302, 0.01882305, 0.01446302, 0.01954636,
        0.01904299, 0.01692484, 0.01762635, 0.01750928, 0.01511592,
        0.0143383 , 0.01417596, 0.01724968, 0.01693139, 0.02136241,
        0.02026519, 0.01883609, 0.01885603, 0.02019405, 0.01922411]),
 'rank_test_score': array([  1,   1,  30,   1,  17,  17,  63,  13,  54,  45,  72,  37,  72,
         49, 100,  58,  96,  87,  68,  82,  41,  41, 106,  35,  29,  25,
        131,  27, 111,  58, 138,  87, 137, 114, 142, 109, 129, 123, 134,
        107, 118, 118, 157, 140, 143, 121, 132, 110, 130,  94, 139, 118,
        156, 128, 158, 148, 159, 152, 160, 150,   1,   1,  30,   1,  17,
         17,  63,  13,  54,  45,  72,  37,  72,  49, 100,  58,  96,  87,
         68,  82,   1,   1,  30,   1,  17,  17,  63,  13,  54,  45,  72,
         37,  72,  49, 100,  58,  96,  87,  68,  82,  43,  43, 108,  35,
         34,  25, 126,  27, 113,  53, 135,  87, 133, 114, 145, 116, 123,
        116, 141, 105,  80,  80, 154, 136,  95,  67, 104,  86, 123,  93,
        149, 122, 151, 126, 147, 111, 152, 144, 155, 146,   1,   1,  30,
          1,  17,  17,  63,  13,  54,  45,  72,  37,  72,  49, 100,  58,
         96,  87,  68,  82], dtype=int32)}
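The raw cv_results_ dictionary above is hard to scan. A convenient way to inspect it, sketched below with the pandas import from the top of this chapter, is to load it into a DataFrame and sort the candidates by rank:

# One row per hyperparameter combination tried by the grid search
results = pd.DataFrame(search.cv_results_)

# Keep the columns relevant to model selection and show the ten best candidates
cols = ['param_algorithm', 'param_metric', 'param_n_neighbors', 'param_weights',
        'mean_test_score', 'std_test_score', 'rank_test_score']
print(results[cols].sort_values('rank_test_score').head(10))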
search.best_params_
{'algorithm': 'ball_tree',
 'metric': 'euclidean',
 'n_neighbors': 1,
 'weights': 'uniform'}
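Since refit=True (visible in search.get_params() above), the grid search retrains a final model on the whole training set using this best parameter combination. A minimal sketch of how to inspect and evaluate it, assuming the X_test/y_test hold-out split created earlier in this notebook:

# Best mean cross-validation score found during the grid search
print("best CV score :", search.best_score_)

# best_estimator_ is the refit KNeighborsClassifier with the winning parameters;
# score it on the held-out test data (X_test/y_test assumed from the earlier split)
print("test accuracy :", search.best_estimator_.score(X_test, y_test))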

2. Random Search Cross-Validation#

Unlike grid search, random search does not try every combination of hyperparameters: it samples a fixed number of candidate parameter sets at random from a user-defined parameter space and scores each one with cross-validation. The scikit-learn function is RandomizedSearchCV. More details here.

The advantage is that it can explore a wide hyperparameter space while capping the computational cost at n_iter sampled combinations.

from sklearn.model_selection import RandomizedSearchCV
from scipy.stats import randint

# Parameter space: randint.rvs pre-samples ten candidate values for n_neighbors,
# so RandomizedSearchCV picks uniformly from that fixed list; the other
# hyperparameters are given as explicit lists of choices.
distributions = [{'n_neighbors': randint.rvs(low=1, high=10, size=10),
                  'weights': ['uniform', 'distance'],
                  'algorithm': ['ball_tree', 'kd_tree'],
                  'metric': ['euclidean', 'manhattan', 'chebyshev', 'minkowski']}]

# Sample n_iter candidate combinations and evaluate each with 5-fold cross-validation
clf2 = RandomizedSearchCV(clf, distributions, random_state=0, cv=5, n_iter=100)
clf2.fit(X_train, y_train)
RandomizedSearchCV(cv=5, estimator=KNeighborsClassifier(n_neighbors=3),
                   n_iter=100,
                   param_distributions=[{'algorithm': ['ball_tree', 'kd_tree'],
                                         'metric': ['euclidean', 'manhattan',
                                                    'chebyshev', 'minkowski'],
                                         'n_neighbors': array([2, 1, 4, 6, 9, 6, 3, 9, 8, 9]),
                                         'weights': ['uniform', 'distance']}],
                   random_state=0)
print(search.best_params_)
print(clf2.best_params_)
{'algorithm': 'ball_tree', 'metric': 'euclidean', 'n_neighbors': 1, 'weights': 'uniform'}
{'weights': 'distance', 'n_neighbors': 1, 'metric': 'minkowski', 'algorithm': 'kd_tree'}
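Note that the cell above pre-samples ten values of n_neighbors with randint.rvs, so the random search only draws from that fixed list. A variant worth knowing, sketched here under the assumption that clf, X_train and y_train are the objects defined earlier (clf3 is just an illustrative new name), is to pass the frozen scipy distribution itself so that every one of the n_iter candidates samples a fresh value:

from sklearn.model_selection import RandomizedSearchCV
from scipy.stats import randint

# Passing the distribution object lets RandomizedSearchCV draw n_neighbors on the fly
distributions = {'n_neighbors': randint(1, 11),   # integers 1..10
                 'weights': ['uniform', 'distance'],
                 'algorithm': ['ball_tree', 'kd_tree'],
                 'metric': ['euclidean', 'manhattan', 'chebyshev', 'minkowski']}

clf3 = RandomizedSearchCV(clf, distributions, n_iter=50, cv=5, random_state=0)
clf3.fit(X_train, y_train)
print(clf3.best_params_)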